00:00:00.001 Started by user sys_sgci
00:00:00.008 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/autotest-per-patch_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-upstream/autotest.groovy
00:00:00.008 The recommended git tool is: git
00:00:00.009 using credential 00000000-0000-0000-0000-000000000002
00:00:00.011 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/autotest-per-patch_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.020 Fetching changes from the remote Git repository
00:00:00.022 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.033 Using shallow fetch with depth 1
00:00:00.033 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.033 > git --version # timeout=10
00:00:00.041 > git --version # 'git version 2.39.2'
00:00:00.041 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.051 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.051 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.065 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.077 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.089 Checking out Revision 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d (FETCH_HEAD)
00:00:02.089 > git config core.sparsecheckout # timeout=10
00:00:02.098 > git read-tree -mu HEAD # timeout=10
00:00:02.112 > git checkout -f 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=5
00:00:02.130 Commit message: "inventory: add WCP3 to free inventory"
00:00:02.130 > git rev-list --no-walk 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=10
00:00:02.249 [Pipeline] Start of Pipeline
00:00:02.263 [Pipeline] library
00:00:02.264 Loading library shm_lib@master
00:00:02.265 Library shm_lib@master is cached. Copying from home.
00:00:02.296 [Pipeline] node
00:00:02.303 Running on ME3 in /var/jenkins/workspace/autotest-per-patch
00:00:02.311 [Pipeline] {
00:00:02.326 [Pipeline] cleanWs
00:00:02.334 [WS-CLEANUP] Deleting project workspace...
00:00:02.334 [WS-CLEANUP] Deferred wipeout is used...
00:00:02.340 [WS-CLEANUP] done
00:00:02.345 [Pipeline] stage
00:00:02.349 [Pipeline] { (Prologue)
00:00:02.460 [Pipeline] withCredentials
00:00:02.469 > git --version # timeout=10
00:00:02.479 > git --version # 'git version 2.39.2'
00:00:02.491 Masking supported pattern matches of $GIT_USERNAME or $GIT_PASSWORD or $GIT_ASKPASS
00:00:02.493 [Pipeline] {
00:00:02.501 [Pipeline] retry
00:00:02.503 [Pipeline] {
00:00:02.670 [Pipeline] sh
00:00:02.948 + git ls-remote https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master
00:00:04.335 [Pipeline] }
00:00:04.356 [Pipeline] // retry
00:00:04.361 [Pipeline] }
00:00:04.382 [Pipeline] // withCredentials
00:00:04.392 [Pipeline] httpRequest
00:00:04.408 [Pipeline] echo
00:00:04.409 Sorcerer 10.211.164.101 is alive
00:00:04.418 [Pipeline] httpRequest
00:00:04.421 HttpMethod: GET
00:00:04.422 URL: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:04.423 Sending request to url: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:04.424 Response Code: HTTP/1.1 200 OK
00:00:04.424 Success: Status code 200 is in the accepted range: 200,404
00:00:04.425 Saving response body to /var/jenkins/workspace/autotest-per-patch/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:04.567 [Pipeline] sh
00:00:04.842 + tar --no-same-owner -xf jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:04.861 [Pipeline] httpRequest
00:00:04.878 [Pipeline] echo
00:00:04.879 Sorcerer 10.211.164.101 is alive
00:00:04.914 [Pipeline] httpRequest
00:00:04.919 HttpMethod: GET
00:00:04.919 URL: http://10.211.164.101/packages/spdk_eccb800bca70052ec16b1ebd1dcce851e4555cbb.tar.gz
00:00:04.920 Sending request to url: http://10.211.164.101/packages/spdk_eccb800bca70052ec16b1ebd1dcce851e4555cbb.tar.gz
00:00:04.920 Response Code: HTTP/1.1 404 Not Found
00:00:04.921 Success: Status code 404 is in the accepted range: 200,404
00:00:04.921 Saving response body to /var/jenkins/workspace/autotest-per-patch/spdk_eccb800bca70052ec16b1ebd1dcce851e4555cbb.tar.gz
00:00:04.931 [Pipeline] sh
00:00:05.210 + rm -f spdk_eccb800bca70052ec16b1ebd1dcce851e4555cbb.tar.gz
00:00:05.225 [Pipeline] retry
00:00:05.227 [Pipeline] {
00:00:05.248 [Pipeline] checkout
00:00:05.256 The recommended git tool is: NONE
00:00:05.266 using credential 00000000-0000-0000-0000-000000000002
00:00:05.271 Cloning the remote Git repository
00:00:05.274 Honoring refspec on initial clone
00:00:05.273 Cloning repository https://review.spdk.io/gerrit/a/spdk/spdk
00:00:05.273 > git init /var/jenkins/workspace/autotest-per-patch/spdk # timeout=10
00:00:05.279 Using reference repository: /var/ci_repos/spdk_multi
00:00:05.279 Fetching upstream changes from https://review.spdk.io/gerrit/a/spdk/spdk
00:00:05.279 > git --version # timeout=10
00:00:05.282 > git --version # 'git version 2.25.1'
00:00:05.282 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:05.285 Setting http proxy: proxy-dmz.intel.com:911
00:00:05.285 > git fetch --tags --force --progress -- https://review.spdk.io/gerrit/a/spdk/spdk refs/changes/69/24169/2 +refs/heads/master:refs/remotes/origin/master # timeout=10
00:00:12.224 Avoid second fetch
00:00:12.266 Checking out Revision eccb800bca70052ec16b1ebd1dcce851e4555cbb (FETCH_HEAD)
00:00:12.474 Commit message: "nvmf: add nvmf_update_mdns_prr"
00:00:12.478 First time build. Skipping changelog.
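The Prologue above implements a simple source-tarball cache: the job asks the "Sorcerer" host (10.211.164.101) for a prebuilt tarball, treats 200 as a hit and 404 as a miss, and on a miss clones the change from Gerrit while borrowing objects from the /var/ci_repos/spdk_multi reference repository. The sketch below is only a rough shell rendering of that logic; the pipeline actually uses the Jenkins httpRequest and checkout steps (with the GIT_ASKPASS credentials and proxy shown above), so the curl/git commands and variable names are illustrative, while the host, URLs, refspec and paths are taken from this log.

    #!/usr/bin/env bash
    # Illustrative sketch of the Prologue's package-cache check (not the
    # pipeline's actual code). Credential and proxy handling are omitted.
    set -euo pipefail

    sorcerer=http://10.211.164.101
    sha=eccb800bca70052ec16b1ebd1dcce851e4555cbb     # revision under test
    pkg="spdk_${sha}.tar.gz"

    # Ask the cache first; a 200 means a ready-made source tarball exists.
    if curl -fsS -o "$pkg" "${sorcerer}/packages/${pkg}"; then
        tar --no-same-owner -xf "$pkg"               # cache hit: reuse sources
    else
        rm -f "$pkg"                                 # 404: discard the stub file
        # Cache miss: clone from Gerrit, borrowing objects from the local
        # reference repository so only the missing objects are fetched.
        git clone --reference /var/ci_repos/spdk_multi \
            https://review.spdk.io/gerrit/a/spdk/spdk spdk
        git -C spdk fetch origin refs/changes/69/24169/2
        git -C spdk checkout -f FETCH_HEAD
    fi

The jbp tarball above was a hit and was simply unpacked; the spdk tarball was a miss, so the rest of the Prologue clones it, detaches it from the reference repository, and uploads a fresh tarball for later runs.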
00:00:12.196 > git config remote.origin.url https://review.spdk.io/gerrit/a/spdk/spdk # timeout=10
00:00:12.198 > git config --add remote.origin.fetch refs/changes/69/24169/2 # timeout=10
00:00:12.199 > git config --add remote.origin.fetch +refs/heads/master:refs/remotes/origin/master # timeout=10
00:00:12.223 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:12.259 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:12.265 > git config core.sparsecheckout # timeout=10
00:00:12.267 > git checkout -f eccb800bca70052ec16b1ebd1dcce851e4555cbb # timeout=10
00:00:12.472 > git rev-list --no-walk 91a2673cbcc7c473de56b447d1a78855d04da2b5 # timeout=10
00:00:12.479 > git remote # timeout=10
00:00:12.480 > git submodule init # timeout=10
00:00:12.516 > git submodule sync # timeout=10
00:00:12.545 > git config --get remote.origin.url # timeout=10
00:00:12.551 > git submodule init # timeout=10
00:00:12.585 > git config -f .gitmodules --get-regexp ^submodule\.(.+)\.url # timeout=10
00:00:12.588 > git config --get submodule.dpdk.url # timeout=10
00:00:12.591 > git remote # timeout=10
00:00:12.593 > git config --get remote.origin.url # timeout=10
00:00:12.596 > git config -f .gitmodules --get submodule.dpdk.path # timeout=10
00:00:12.599 > git config --get submodule.intel-ipsec-mb.url # timeout=10
00:00:12.601 > git remote # timeout=10
00:00:12.603 > git config --get remote.origin.url # timeout=10
00:00:12.605 > git config -f .gitmodules --get submodule.intel-ipsec-mb.path # timeout=10
00:00:12.607 > git config --get submodule.isa-l.url # timeout=10
00:00:12.609 > git remote # timeout=10
00:00:12.611 > git config --get remote.origin.url # timeout=10
00:00:12.614 > git config -f .gitmodules --get submodule.isa-l.path # timeout=10
00:00:12.616 > git config --get submodule.ocf.url # timeout=10
00:00:12.619 > git remote # timeout=10
00:00:12.621 > git config --get remote.origin.url # timeout=10
00:00:12.623 > git config -f .gitmodules --get submodule.ocf.path # timeout=10
00:00:12.625 > git config --get submodule.libvfio-user.url # timeout=10
00:00:12.627 > git remote # timeout=10
00:00:12.629 > git config --get remote.origin.url # timeout=10
00:00:12.631 > git config -f .gitmodules --get submodule.libvfio-user.path # timeout=10
00:00:12.633 > git config --get submodule.xnvme.url # timeout=10
00:00:12.635 > git remote # timeout=10
00:00:12.637 > git config --get remote.origin.url # timeout=10
00:00:12.639 > git config -f .gitmodules --get submodule.xnvme.path # timeout=10
00:00:12.642 > git config --get submodule.isa-l-crypto.url # timeout=10
00:00:12.644 > git remote # timeout=10
00:00:12.646 > git config --get remote.origin.url # timeout=10
00:00:12.649 > git config -f .gitmodules --get submodule.isa-l-crypto.path # timeout=10
00:00:12.652 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:12.652 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:12.652 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:12.652 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:12.653 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:12.653 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:12.653 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:12.656 Setting http proxy: proxy-dmz.intel.com:911
00:00:12.656 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi libvfio-user # timeout=10
00:00:12.657 Setting http proxy: proxy-dmz.intel.com:911
00:00:12.657 Setting http proxy: proxy-dmz.intel.com:911
00:00:12.657 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi xnvme # timeout=10
00:00:12.657 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi ocf # timeout=10
00:00:12.657 Setting http proxy: proxy-dmz.intel.com:911
00:00:12.657 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi dpdk # timeout=10
00:00:12.657 Setting http proxy: proxy-dmz.intel.com:911
00:00:12.657 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi isa-l # timeout=10
00:00:12.657 Setting http proxy: proxy-dmz.intel.com:911
00:00:12.657 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi intel-ipsec-mb # timeout=10
00:00:12.658 Setting http proxy: proxy-dmz.intel.com:911
00:00:12.658 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi isa-l-crypto # timeout=10
00:00:40.454 [Pipeline] dir
00:00:40.454 Running in /var/jenkins/workspace/autotest-per-patch/spdk
00:00:40.456 [Pipeline] {
00:00:40.473 [Pipeline] sh
00:00:40.756 ++ nproc
00:00:40.756 + threads=8
00:00:40.757 + git repack -a -d --threads=8
00:00:48.865 + git submodule foreach git repack -a -d --threads=8
00:00:48.865 Entering 'dpdk'
00:00:50.760 Entering 'intel-ipsec-mb'
00:00:51.019 Entering 'isa-l'
00:00:51.019 Entering 'isa-l-crypto'
00:00:51.277 Entering 'libvfio-user'
00:00:51.277 Entering 'ocf'
00:00:51.534 Entering 'xnvme'
00:00:51.534 + find .git -type f -name alternates -print -delete
00:00:51.534 .git/objects/info/alternates
00:00:51.534 .git/modules/ocf/objects/info/alternates
00:00:51.534 .git/modules/intel-ipsec-mb/objects/info/alternates
00:00:51.534 .git/modules/isa-l-crypto/objects/info/alternates
00:00:51.534 .git/modules/xnvme/objects/info/alternates
00:00:51.534 .git/modules/libvfio-user/objects/info/alternates
00:00:51.534 .git/modules/dpdk/objects/info/alternates
00:00:51.534 .git/modules/isa-l/objects/info/alternates
00:00:51.545 [Pipeline] }
00:00:51.566 [Pipeline] // dir
00:00:51.571 [Pipeline] }
00:00:51.591 [Pipeline] // retry
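Because the clone and every submodule above borrowed objects from /var/ci_repos/spdk_multi, the job repacks them with all available threads and then deletes the .git/**/alternates pointers; after that the tree is self-contained and safe to archive for machines that do not have the reference repository. A minimal sketch of this detach-from-reference step, assuming the workspace layout from this log (the commands mirror the traced ones; the variable name is illustrative):

    #!/usr/bin/env bash
    # Make a --reference based clone self-contained before archiving it.
    set -euo pipefail
    cd /var/jenkins/workspace/autotest-per-patch/spdk

    threads=$(nproc)

    # Copy every borrowed object into this repository and its submodules...
    git repack -a -d --threads="$threads"
    git submodule foreach git repack -a -d --threads="$threads"

    # ...then drop the alternates files so nothing points back at the
    # reference repository any more.
    find .git -type f -name alternates -print -delete

Deleting the alternates without repacking first would leave the clone referring to objects it can no longer reach, which is why the two steps always run back to back here.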
00:00:51.599 [Pipeline] sh
00:00:51.879 + hash pigz
00:00:51.879 + tar -cf spdk_eccb800bca70052ec16b1ebd1dcce851e4555cbb.tar.gz -I pigz spdk
00:00:54.420 [Pipeline] httpRequest
00:00:54.428 HttpMethod: PUT
00:00:54.429 URL: http://10.211.164.101/cgi-bin/sorcerer.py?group=packages&filename=spdk_eccb800bca70052ec16b1ebd1dcce851e4555cbb.tar.gz
00:00:54.431 Sending request to url: http://10.211.164.101/cgi-bin/sorcerer.py?group=packages&filename=spdk_eccb800bca70052ec16b1ebd1dcce851e4555cbb.tar.gz
00:00:57.837 Response Code: HTTP/1.1 200 OK
00:00:57.842 Success: Status code 200 is in the accepted range: 200
00:00:57.847 [Pipeline] echo
00:00:57.848 
00:00:57.848 Locking
00:00:57.848 Waited 0s for lock
00:00:57.848 Everything Fine. Saved: /storage/packages/spdk_eccb800bca70052ec16b1ebd1dcce851e4555cbb.tar.gz
00:00:57.848 
00:00:57.852 [Pipeline] sh
00:00:58.134 + git -C spdk log --oneline -n5
00:00:58.134 eccb800bc nvmf: add nvmf_update_mdns_prr
00:00:58.134 4835eb82b nvmf: consolidate listener addition in avahi_entry_group_add_listeners
00:00:58.134 719d03c6a sock/uring: only register net impl if supported
00:00:58.134 e64f085ad vbdev_lvol_ut: unify usage of dummy base bdev
00:00:58.134 9937c0160 lib/rdma: bind TRACE_BDEV_IO_START/DONE to OBJECT_NVMF_RDMA_IO
00:00:58.156 [Pipeline] setCustomBuildProperty
00:00:58.168 [Pipeline] setCustomBuildProperty
00:00:58.180 [Pipeline] catchError
00:00:58.182 [Pipeline] {
00:00:58.203 [Pipeline] sh
00:00:58.487 + git -C spdk describe --tags --abbrev=0 origin/master
00:00:58.502 [Pipeline] sh
00:00:58.783 + git -C spdk describe --tags --abbrev=0 --exclude=LTS HEAD
00:00:58.798 [Pipeline] echo
00:00:58.800 Branch: master
00:00:58.805 [Pipeline] fileExists
00:00:58.825 [Pipeline] readJSON
00:00:58.841 [Pipeline] }
00:00:58.869 [Pipeline] // catchError
00:00:58.882 [Pipeline] sh
00:00:59.163 + /var/jenkins/workspace/autotest-per-patch/jbp/jenkins/jjb-config/jobs/scripts/get-pkgdep-jobs.sh /var/jenkins/workspace/autotest-per-patch/spdk
00:00:59.182 [Pipeline] }
00:00:59.211 [Pipeline] // stage
00:00:59.233 [Pipeline] catchError
00:00:59.235 [Pipeline] {
00:00:59.259 [Pipeline] stage
00:00:59.262 [Pipeline] { (Pre tests)
00:00:59.302 [Pipeline] parallel
00:00:59.315 [Pipeline] { (Branch: check-format-docker-autotest)
00:00:59.316 [Pipeline] { (Branch: check-so-deps-docker-autotest)
00:00:59.318 [Pipeline] { (Branch: doc-docker-autotest)
00:00:59.319 [Pipeline] { (Branch: build-files-docker-autotest)
00:00:59.344 [Pipeline] retry
00:00:59.346 [Pipeline] {
00:00:59.352 [Pipeline] retry
00:00:59.354 [Pipeline] {
00:00:59.360 [Pipeline] retry
00:00:59.362 [Pipeline] {
00:00:59.368 [Pipeline] retry
00:00:59.370 [Pipeline] {
00:00:59.394 [Pipeline] build
00:00:59.397 Scheduling project: check-format-docker-autotest
00:00:59.406 [Pipeline] build
00:00:59.409 Scheduling project: check-so-deps-docker-autotest
00:00:59.418 [Pipeline] build
00:00:59.421 Scheduling project: doc-docker-autotest
00:00:59.429 [Pipeline] build
00:00:59.433 Scheduling project: build-files-docker-autotest
00:01:07.540 Starting building: check-format-docker-autotest #26570
00:01:07.543 Starting building: doc-docker-autotest #26764
00:01:07.545 Starting building: build-files-docker-autotest #26549
00:01:07.548 Starting building: check-so-deps-docker-autotest #26581
00:01:42.256 Build doc-docker-autotest #26764 completed: SUCCESS
00:01:42.259 [Pipeline] }
00:01:42.293 [Pipeline] // retry
00:01:42.299 [Pipeline] }
00:01:58.978 Build check-format-docker-autotest #26570 completed: FAILURE
00:01:58.998 [Pipeline] echo
00:01:59.000 No retry patterns found.
00:01:59.001 [Pipeline] }
00:01:59.040 [Pipeline] // retry
00:01:59.048 [Pipeline] error
00:01:59.055 [Pipeline] }
00:01:59.061 Failed in branch check-format-docker-autotest
00:03:28.390 Build build-files-docker-autotest #26549 completed: SUCCESS
00:03:28.393 [Pipeline] }
00:03:28.417 [Pipeline] // retry
00:03:28.424 [Pipeline] }
00:03:57.674 Build check-so-deps-docker-autotest #26581 completed: SUCCESS
00:03:57.676 [Pipeline] }
00:03:57.712 [Pipeline] // retry
00:03:57.717 [Pipeline] }
00:03:57.771 [Pipeline] // parallel
00:03:57.778 [Pipeline] }
00:03:57.807 [Pipeline] // stage
00:03:57.815 [Pipeline] }
00:03:57.819 ERROR: Build check-format-docker-autotest #26570 failed
00:03:57.819 Setting overall build result to FAILURE
00:03:57.847 [Pipeline] // catchError
00:03:57.856 [Pipeline] catchError
00:03:57.858 [Pipeline] {
00:03:57.878 [Pipeline] stage
00:03:57.880 [Pipeline] { (Tests)
00:03:57.900 [Pipeline] unstable
00:03:57.903 WARNING: Previous stages failed
00:03:57.905 [Pipeline] }
00:03:57.934 [Pipeline] // stage
00:03:57.940 [Pipeline] }
00:03:57.967 [Pipeline] // catchError
00:03:57.976 [Pipeline] stage
00:03:57.978 [Pipeline] { (Autorun Post and Coverage)
00:03:57.997 [Pipeline] setCustomBuildProperty
00:03:58.019 [Pipeline] dir
00:03:58.020 Running in /var/jenkins/workspace/autotest-per-patch/doc-docker-autotest_26764
00:03:58.021 [Pipeline] {
00:03:58.044 [Pipeline] copyArtifacts
00:03:58.361 Copied 5 artifacts from "doc-docker-autotest" build number 26764
00:03:58.365 [Pipeline] writeFile
00:03:58.389 [Pipeline] }
00:03:58.420 [Pipeline] // dir
00:03:58.440 [Pipeline] dir
00:03:58.441 Running in /var/jenkins/workspace/autotest-per-patch/check-format-docker-autotest_26570
00:03:58.442 [Pipeline] {
00:03:58.467 [Pipeline] copyArtifacts
00:03:58.515 Copied 4 artifacts from "check-format-docker-autotest" build number 26570
00:03:58.519 [Pipeline] writeFile
00:03:58.537 [Pipeline] }
00:03:58.567 [Pipeline] // dir
00:03:58.669 [Pipeline] dir
00:03:58.669 Running in /var/jenkins/workspace/autotest-per-patch/build-files-docker-autotest_26549
00:03:58.671 [Pipeline] {
00:03:58.693 [Pipeline] copyArtifacts
00:03:58.739 Copied 4 artifacts from "build-files-docker-autotest" build number 26549
00:03:58.743 [Pipeline] writeFile
00:03:58.788 [Pipeline] }
00:03:58.828 [Pipeline] // dir
00:03:58.876 [Pipeline] dir
00:03:58.877 Running in /var/jenkins/workspace/autotest-per-patch/check-so-deps-docker-autotest_26581
00:03:58.878 [Pipeline] {
00:03:58.898 [Pipeline] copyArtifacts
00:03:58.944 Copied 4 artifacts from "check-so-deps-docker-autotest" build number 26581
00:03:58.948 [Pipeline] writeFile
00:03:58.997 [Pipeline] }
00:03:59.020 [Pipeline] // dir
00:03:59.029 [Pipeline] catchError
00:03:59.031 [Pipeline] {
00:03:59.051 [Pipeline] sh
00:03:59.329 + jbp/jenkins/jjb-config/jobs/scripts/post_gen_coverage.sh
00:03:59.330 + shopt -s globstar nullglob
00:03:59.330 + echo 'Start stage post_gen_coverage.sh'
00:03:59.330 Start stage post_gen_coverage.sh
00:03:59.330 + cd /var/jenkins/workspace/autotest-per-patch
00:03:59.330 + rm -rf /var/jenkins/workspace/autotest-per-patch/spdk/doc
00:03:59.330 + trap 'compress_coverage_and_docs; remove_partial_coverage_files && echo '\''End stage post_gen_coverage.sh'\''' EXIT
00:03:59.330 + move_artifacts
00:03:59.330 + local out_dirs
00:03:59.330 + out_dirs=(./**/output/)
00:03:59.330 + for dir in "${out_dirs[@]}"
00:03:59.330 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:59.330 + [[ -f ./build-files-docker-autotest_26549/output//doc.tar.xz ]]
00:03:59.330 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:59.330 + [[ -f ./build-files-docker-autotest_26549/output//ut_coverage.tar.xz ]]
00:03:59.330 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:59.330 + [[ -f ./build-files-docker-autotest_26549/output//llvm.tar.xz ]]
00:03:59.330 + mv ./build-files-docker-autotest_26549/output//build-repo-manifest.txt ./build-files-docker-autotest_26549/output//power.tar.xz ./build-files-docker-autotest_26549/output//test_completions.txt ./build-files-docker-autotest_26549/output//timing.txt ./build-files-docker-autotest_26549/output//..
00:03:59.330 + rmdir ./build-files-docker-autotest_26549/output/
00:03:59.330 + for dir in "${out_dirs[@]}"
00:03:59.330 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:59.330 + [[ -f ./check-format-docker-autotest_26570/output//doc.tar.xz ]]
00:03:59.330 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:59.330 + [[ -f ./check-format-docker-autotest_26570/output//ut_coverage.tar.xz ]]
00:03:59.330 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:59.330 + [[ -f ./check-format-docker-autotest_26570/output//llvm.tar.xz ]]
00:03:59.330 + mv ./check-format-docker-autotest_26570/output//build-repo-manifest.txt ./check-format-docker-autotest_26570/output//power.tar.xz ./check-format-docker-autotest_26570/output//test_completions.txt ./check-format-docker-autotest_26570/output//timing.txt ./check-format-docker-autotest_26570/output//..
00:03:59.330 + rmdir ./check-format-docker-autotest_26570/output/
00:03:59.330 + for dir in "${out_dirs[@]}"
00:03:59.330 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:59.330 + [[ -f ./check-so-deps-docker-autotest_26581/output//doc.tar.xz ]]
00:03:59.330 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:59.330 + [[ -f ./check-so-deps-docker-autotest_26581/output//ut_coverage.tar.xz ]]
00:03:59.330 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:59.330 + [[ -f ./check-so-deps-docker-autotest_26581/output//llvm.tar.xz ]]
00:03:59.330 + mv ./check-so-deps-docker-autotest_26581/output//build-repo-manifest.txt ./check-so-deps-docker-autotest_26581/output//power.tar.xz ./check-so-deps-docker-autotest_26581/output//test_completions.txt ./check-so-deps-docker-autotest_26581/output//timing.txt ./check-so-deps-docker-autotest_26581/output//..
00:03:59.330 + rmdir ./check-so-deps-docker-autotest_26581/output/
00:03:59.330 + for dir in "${out_dirs[@]}"
00:03:59.330 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:59.330 + [[ -f ./doc-docker-autotest_26764/output//doc.tar.xz ]]
00:03:59.330 + tar -C ./doc-docker-autotest_26764/output/ -xf ./doc-docker-autotest_26764/output//doc.tar.xz
00:03:59.587 + rm ./doc-docker-autotest_26764/output//doc.tar.xz
00:03:59.587 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:59.587 + [[ -f ./doc-docker-autotest_26764/output//ut_coverage.tar.xz ]]
00:03:59.587 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:03:59.587 + [[ -f ./doc-docker-autotest_26764/output//llvm.tar.xz ]]
00:03:59.587 + mv ./doc-docker-autotest_26764/output//build-repo-manifest.txt ./doc-docker-autotest_26764/output//doc ./doc-docker-autotest_26764/output//power.tar.xz ./doc-docker-autotest_26764/output//test_completions.txt ./doc-docker-autotest_26764/output//timing.txt ./doc-docker-autotest_26764/output//..
00:03:59.587 + rmdir ./doc-docker-autotest_26764/output/
00:03:59.587 + unpack_cov_files
00:03:59.587 + local info_files
00:03:59.587 + info_files=(*/cov_*.info.xz)
00:03:59.587 + printf '%s\n'
00:03:59.587 + xargs -P0 -r -n1 xz -d
00:03:59.587 + fix_downstream_job_paths
00:03:59.587 + sed -i -e 's#^SF:/.\+/spdk/#SF:/var/jenkins/workspace/autotest-per-patch/spdk/#g'
00:03:59.587 sed: no input files
00:03:59.587 + compress_coverage_and_docs
00:03:59.587 + echo 'Start compress coverage and docs'
00:03:59.587 Start compress coverage and docs
00:03:59.587 + tar -C coverage -czf coverage_autotest-per-patch_126153.tar.gz ./ --remove-files
00:03:59.587 tar: coverage: Cannot open: No such file or directory
00:03:59.587 tar: Error is not recoverable: exiting now
00:03:59.604 [Pipeline] }
00:03:59.608 ERROR: script returned exit code 2
00:03:59.637 [Pipeline] // catchError
00:03:59.646 [Pipeline] catchError
00:03:59.647 [Pipeline] {
00:03:59.667 [Pipeline] dir
00:03:59.668 Running in /var/jenkins/workspace/autotest-per-patch/post_process
00:03:59.669 [Pipeline] {
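The exit code 2 above comes from the tail of post_gen_coverage.sh: none of the downstream jobs in this run produced cov_*.info.xz archives or a coverage/ directory, so sed ran with no input files and tar aborted on the missing directory. A hypothetical hardened version of those helpers is sketched below; the function names and commands follow the trace above, but the guards (and the sed file glob) are assumptions, not the actual jbp script.

    #!/usr/bin/env bash
    # Hypothetical guarded variant of the coverage post-processing that
    # failed above with "tar: coverage: Cannot open: No such file or directory".
    set -euo pipefail
    shopt -s globstar nullglob

    fix_downstream_job_paths() {
        local info_files=(*/cov_*.info.xz)
        # With nullglob the array is empty when no coverage was copied in;
        # skip the whole step instead of feeding sed/xargs nothing.
        ((${#info_files[@]})) || return 0
        printf '%s\n' "${info_files[@]}" | xargs -P0 -r -n1 xz -d
        sed -i -e 's#^SF:/.\+/spdk/#SF:/var/jenkins/workspace/autotest-per-patch/spdk/#g' \
            */cov_*.info   # file glob assumed from the trace
    }

    compress_coverage_and_docs() {
        local archive=coverage_autotest-per-patch_126153.tar.gz
        if [[ -d coverage ]]; then
            tar -C coverage -czf "$archive" ./ --remove-files
        else
            # Nothing produced coverage in this run, so report and skip
            # instead of letting tar fail the stage.
            echo "No coverage directory found, skipping $archive"
        fi
    }

    fix_downstream_job_paths
    compress_coverage_and_docs

With guards like these, a run that produces no coverage (as here, where only the doc, check-format, check-so-deps and build-files jobs ran) would end the post stage cleanly instead of returning exit code 2.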