00:00:00.001 Started by user sys_sgci
00:00:00.009 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/autotest-per-patch_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-upstream/autotest.groovy
00:00:00.010 The recommended git tool is: git
00:00:00.010 using credential 00000000-0000-0000-0000-000000000002
00:00:00.013 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/autotest-per-patch_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.026 Fetching changes from the remote Git repository
00:00:00.028 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.040 Using shallow fetch with depth 1
00:00:00.040 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.040 > git --version # timeout=10
00:00:00.052 > git --version # 'git version 2.39.2'
00:00:00.052 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.064 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.064 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.126 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.140 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.155 Checking out Revision 308e970df89ed396a3f9dcf22fba8891259694e4 (FETCH_HEAD)
00:00:02.155 > git config core.sparsecheckout # timeout=10
00:00:02.167 > git read-tree -mu HEAD # timeout=10
00:00:02.187 > git checkout -f 308e970df89ed396a3f9dcf22fba8891259694e4 # timeout=5
00:00:02.210 Commit message: "jjb/create-perf-report: make job run concurrent"
00:00:02.210 > git rev-list --no-walk 308e970df89ed396a3f9dcf22fba8891259694e4 # timeout=10
00:00:02.822 [Pipeline] Start of Pipeline
00:00:02.836 [Pipeline] library
00:00:02.838 Loading library shm_lib@master
00:00:02.838 Library shm_lib@master is cached. Copying from home.
00:00:02.871 [Pipeline] node
00:00:02.883 Running on ME1 in /var/jenkins/workspace/autotest-per-patch
00:00:02.885 [Pipeline] {
00:00:02.901 [Pipeline] cleanWs
00:00:02.912 [WS-CLEANUP] Deleting project workspace...
00:00:02.912 [WS-CLEANUP] Deferred wipeout is used...
00:00:02.920 [WS-CLEANUP] done
00:00:02.926 [Pipeline] stage
00:00:02.930 [Pipeline] { (Prologue)
00:00:03.062 [Pipeline] withCredentials
00:00:03.074 > git --version # timeout=10
00:00:03.084 > git --version # 'git version 2.39.2'
00:00:03.103 Masking supported pattern matches of $GIT_USERNAME or $GIT_PASSWORD or $GIT_ASKPASS
00:00:03.105 [Pipeline] {
00:00:03.115 [Pipeline] retry
00:00:03.117 [Pipeline] {
00:00:03.445 [Pipeline] sh
00:00:03.729 + git ls-remote https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master
00:00:06.285 [Pipeline] }
00:00:06.307 [Pipeline] // retry
00:00:06.311 [Pipeline] }
00:00:06.331 [Pipeline] // withCredentials
00:00:06.341 [Pipeline] httpRequest
00:00:06.358 [Pipeline] echo
00:00:06.360 Sorcerer 10.211.164.101 is alive
00:00:06.370 [Pipeline] httpRequest
00:00:06.375 HttpMethod: GET
00:00:06.376 URL: http://10.211.164.101/packages/jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz
00:00:06.376 Sending request to url: http://10.211.164.101/packages/jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz
00:00:06.378 Response Code: HTTP/1.1 200 OK
00:00:06.379 Success: Status code 200 is in the accepted range: 200,404
00:00:06.379 Saving response body to /var/jenkins/workspace/autotest-per-patch/jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz
00:00:06.524 [Pipeline] sh
00:00:06.805 + tar --no-same-owner -xf jbp_308e970df89ed396a3f9dcf22fba8891259694e4.tar.gz
00:00:06.825 [Pipeline] httpRequest
00:00:06.842 [Pipeline] echo
00:00:06.843 Sorcerer 10.211.164.101 is alive
00:00:06.851 [Pipeline] httpRequest
00:00:06.855 HttpMethod: GET
00:00:06.856 URL: http://10.211.164.101/packages/spdk_5b9fb8c025fd9f911fa2997242ef12cd504ca6c0.tar.gz
00:00:06.856 Sending request to url: http://10.211.164.101/packages/spdk_5b9fb8c025fd9f911fa2997242ef12cd504ca6c0.tar.gz
00:00:06.859 Response Code: HTTP/1.1 404 Not Found
00:00:06.859 Success: Status code 404 is in the accepted range: 200,404
00:00:06.860 Saving response body to /var/jenkins/workspace/autotest-per-patch/spdk_5b9fb8c025fd9f911fa2997242ef12cd504ca6c0.tar.gz
00:00:06.867 [Pipeline] sh
00:00:07.151 + rm -f spdk_5b9fb8c025fd9f911fa2997242ef12cd504ca6c0.tar.gz
00:00:07.167 [Pipeline] retry
00:00:07.169 [Pipeline] {
00:00:07.189 [Pipeline] checkout
00:00:07.196 The recommended git tool is: NONE
00:00:07.207 using credential 00000000-0000-0000-0000-000000000002
00:00:07.213 Cloning the remote Git repository
00:00:07.216 Honoring refspec on initial clone
00:00:07.219 Cloning repository https://review.spdk.io/gerrit/a/spdk/spdk
00:00:07.219 > git init /var/jenkins/workspace/autotest-per-patch/spdk # timeout=10
00:00:07.225 Using reference repository: /var/ci_repos/spdk_multi
00:00:07.225 Fetching upstream changes from https://review.spdk.io/gerrit/a/spdk/spdk
00:00:07.225 > git --version # timeout=10
00:00:07.226 > git --version # 'git version 2.42.0'
00:00:07.226 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:07.228 Setting http proxy: proxy-dmz.intel.com:911
00:00:07.228 > git fetch --tags --force --progress -- https://review.spdk.io/gerrit/a/spdk/spdk refs/changes/60/23060/28 +refs/heads/master:refs/remotes/origin/master # timeout=10
00:00:12.353 Avoid second fetch
00:00:12.362 Checking out Revision 5b9fb8c025fd9f911fa2997242ef12cd504ca6c0 (FETCH_HEAD)
00:00:12.541 Commit message: "test/setup: add configuration script for dsa devices"
00:00:12.545 First time build. Skipping changelog.
00:00:12.344 > git config remote.origin.url https://review.spdk.io/gerrit/a/spdk/spdk # timeout=10
00:00:12.346 > git config --add remote.origin.fetch refs/changes/60/23060/28 # timeout=10
00:00:12.347 > git config --add remote.origin.fetch +refs/heads/master:refs/remotes/origin/master # timeout=10
00:00:12.354 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:12.359 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:12.363 > git config core.sparsecheckout # timeout=10
00:00:12.365 > git checkout -f 5b9fb8c025fd9f911fa2997242ef12cd504ca6c0 # timeout=10
00:00:12.542 > git rev-list --no-walk 9e14f4d899df8d851a1080796316089b3e603e8c # timeout=10
00:00:12.548 > git remote # timeout=10
00:00:12.550 > git submodule init # timeout=10
00:00:12.574 > git submodule sync # timeout=10
00:00:12.598 > git config --get remote.origin.url # timeout=10
00:00:12.602 > git submodule init # timeout=10
00:00:12.624 > git config -f .gitmodules --get-regexp ^submodule\.(.+)\.url # timeout=10
00:00:12.625 > git config --get submodule.dpdk.url # timeout=10
00:00:12.627 > git remote # timeout=10
00:00:12.628 > git config --get remote.origin.url # timeout=10
00:00:12.630 > git config -f .gitmodules --get submodule.dpdk.path # timeout=10
00:00:12.631 > git config --get submodule.intel-ipsec-mb.url # timeout=10
00:00:12.633 > git remote # timeout=10
00:00:12.634 > git config --get remote.origin.url # timeout=10
00:00:12.635 > git config -f .gitmodules --get submodule.intel-ipsec-mb.path # timeout=10
00:00:12.637 > git config --get submodule.isa-l.url # timeout=10
00:00:12.638 > git remote # timeout=10
00:00:12.640 > git config --get remote.origin.url # timeout=10
00:00:12.641 > git config -f .gitmodules --get submodule.isa-l.path # timeout=10
00:00:12.642 > git config --get submodule.ocf.url # timeout=10
00:00:12.644 > git remote # timeout=10
00:00:12.645 > git config --get remote.origin.url # timeout=10
00:00:12.646 > git config -f .gitmodules --get submodule.ocf.path # timeout=10
00:00:12.648 > git config --get submodule.libvfio-user.url # timeout=10
00:00:12.649 > git remote # timeout=10
00:00:12.650 > git config --get remote.origin.url # timeout=10
00:00:12.652 > git config -f .gitmodules --get submodule.libvfio-user.path # timeout=10
00:00:12.653 > git config --get submodule.xnvme.url # timeout=10
00:00:12.654 > git remote # timeout=10
00:00:12.656 > git config --get remote.origin.url # timeout=10
00:00:12.657 > git config -f .gitmodules --get submodule.xnvme.path # timeout=10
00:00:12.659 > git config --get submodule.isa-l-crypto.url # timeout=10
00:00:12.660 > git remote # timeout=10
00:00:12.662 > git config --get remote.origin.url # timeout=10
00:00:12.663 > git config -f .gitmodules --get submodule.isa-l-crypto.path # timeout=10
00:00:12.665 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:12.665 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:12.665 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:12.665 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:12.666 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:12.666 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:12.666 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:12.667 Setting http proxy: proxy-dmz.intel.com:911
00:00:12.667 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi dpdk # timeout=10
00:00:12.668 Setting http proxy: proxy-dmz.intel.com:911
00:00:12.668 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi xnvme # timeout=10
00:00:12.669 Setting http proxy: proxy-dmz.intel.com:911
00:00:12.669 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi intel-ipsec-mb # timeout=10
00:00:12.669 Setting http proxy: proxy-dmz.intel.com:911
00:00:12.669 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi libvfio-user # timeout=10
00:00:12.670 Setting http proxy: proxy-dmz.intel.com:911
00:00:12.670 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi ocf # timeout=10
00:00:12.671 Setting http proxy: proxy-dmz.intel.com:911
00:00:12.671 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi isa-l-crypto # timeout=10
00:00:12.671 Setting http proxy: proxy-dmz.intel.com:911
00:00:12.671 > git submodule update --init --recursive --reference /var/ci_repos/spdk_multi isa-l # timeout=10
00:00:36.153 [Pipeline] dir
00:00:36.154 Running in /var/jenkins/workspace/autotest-per-patch/spdk
00:00:36.155 [Pipeline] {
00:00:36.167 [Pipeline] sh
00:00:36.440 ++ nproc
00:00:36.440 + threads=4
00:00:36.440 + git repack -a -d --threads=4
00:00:40.617 + git submodule foreach git repack -a -d --threads=4
00:00:40.617 Entering 'dpdk'
00:00:43.925 Entering 'intel-ipsec-mb'
00:00:44.182 Entering 'isa-l'
00:00:44.449 Entering 'isa-l-crypto'
00:00:44.449 Entering 'libvfio-user'
00:00:44.707 Entering 'ocf'
00:00:44.707 Entering 'xnvme'
00:00:44.980 + find .git -type f -name alternates -print -delete
00:00:44.980 .git/objects/info/alternates
00:00:44.980 .git/modules/xnvme/objects/info/alternates
00:00:44.980 .git/modules/dpdk/objects/info/alternates
00:00:44.980 .git/modules/ocf/objects/info/alternates
00:00:44.980 .git/modules/isa-l-crypto/objects/info/alternates
00:00:44.980 .git/modules/libvfio-user/objects/info/alternates
00:00:44.980 .git/modules/isa-l/objects/info/alternates
00:00:44.980 .git/modules/intel-ipsec-mb/objects/info/alternates
00:00:44.992 [Pipeline] }
00:00:45.015 [Pipeline] // dir
00:00:45.020 [Pipeline] }
00:00:45.041 [Pipeline] // retry
00:00:45.050 [Pipeline] sh
00:00:45.325 + hash pigz
00:00:45.325 + tar -cf spdk_5b9fb8c025fd9f911fa2997242ef12cd504ca6c0.tar.gz -I pigz spdk
00:00:48.601 [Pipeline] httpRequest
00:00:48.607 HttpMethod: PUT
00:00:48.607 URL: http://10.211.164.101/cgi-bin/sorcerer.py?group=packages&filename=spdk_5b9fb8c025fd9f911fa2997242ef12cd504ca6c0.tar.gz
00:00:48.607 Sending request to url: http://10.211.164.101/cgi-bin/sorcerer.py?group=packages&filename=spdk_5b9fb8c025fd9f911fa2997242ef12cd504ca6c0.tar.gz
00:00:52.363 Response Code: HTTP/1.1 200 OK
00:00:52.368 Success: Status code 200 is in the accepted range: 200
00:00:52.372 [Pipeline] echo
00:00:52.374
00:00:52.374 Locking
00:00:52.374 Waited 0s for lock
00:00:52.374 Everything Fine. Saved: /storage/packages/spdk_5b9fb8c025fd9f911fa2997242ef12cd504ca6c0.tar.gz
00:00:52.374
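What the prologue above is doing, in short: the job asks the Sorcerer cache (10.211.164.101) for a prebuilt spdk_<sha>.tar.gz, gets a 404, clones the Gerrit change against the local /var/ci_repos/spdk_multi reference repository, repacks the clone and its submodules so they no longer depend on that reference, and uploads the resulting tarball so later runs get a cache hit. A minimal sketch of the same flow follows, assuming the GET/PUT endpoints shown in the log; curl stands in for the pipeline's httpRequest step, and the variable names are illustrative only.

#!/usr/bin/env bash
# Sketch of the Sorcerer package-cache flow inferred from the log above.
set -euo pipefail

sorcerer=http://10.211.164.101
sha=5b9fb8c025fd9f911fa2997242ef12cd504ca6c0
pkg="spdk_${sha}.tar.gz"

# 1) Try the cache first; a 404 simply means this run has to build the package.
if curl -fsSo "$pkg" "${sorcerer}/packages/${pkg}"; then
    tar --no-same-owner -xf "$pkg"
    exit 0
fi
rm -f "$pkg"

# 2) Cache miss: clone against a local reference repo to keep the fetch cheap,
#    then check out the Gerrit change ref under test.
git clone --reference /var/ci_repos/spdk_multi https://review.spdk.io/gerrit/a/spdk/spdk spdk
git -C spdk fetch origin refs/changes/60/23060/28
git -C spdk checkout -f FETCH_HEAD
git -C spdk submodule update --init --recursive --reference /var/ci_repos/spdk_multi

# 3) Make the clone self-contained before archiving: repack copies the borrowed
#    objects into the local object store, then the alternates files are removed.
threads=$(nproc)
git -C spdk repack -a -d --threads="$threads"
git -C spdk submodule foreach git repack -a -d --threads="$threads"
find spdk/.git -type f -name alternates -print -delete

# 4) Compress with pigz and push the result back for the next run to reuse.
tar -cf "$pkg" -I pigz spdk
curl -fsS -T "$pkg" "${sorcerer}/cgi-bin/sorcerer.py?group=packages&filename=${pkg}"

Repacking before deleting .git/objects/info/alternates is what makes the archive usable on machines that do not have /var/ci_repos/spdk_multi.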
00:00:52.377 [Pipeline] sh
00:00:52.664 + git -C spdk log --oneline -n5
00:00:52.664 5b9fb8c02 test/setup: add configuration script for dsa devices
00:00:52.664 719d03c6a sock/uring: only register net impl if supported
00:00:52.664 e64f085ad vbdev_lvol_ut: unify usage of dummy base bdev
00:00:52.664 9937c0160 lib/rdma: bind TRACE_BDEV_IO_START/DONE to OBJECT_NVMF_RDMA_IO
00:00:52.664 6c7c1f57e accel: add sequence outstanding stat
00:00:52.683 [Pipeline] setCustomBuildProperty
00:00:52.692 [Pipeline] setCustomBuildProperty
00:00:52.701 [Pipeline] catchError
00:00:52.703 [Pipeline] {
00:00:52.723 [Pipeline] sh
00:00:53.010 + git -C spdk describe --tags --abbrev=0 origin/master
00:00:53.026 [Pipeline] sh
00:00:53.310 + git -C spdk describe --tags --abbrev=0 --exclude=LTS HEAD
00:00:53.325 [Pipeline] echo
00:00:53.327 Branch: master
00:00:53.330 [Pipeline] fileExists
00:00:53.343 [Pipeline] readJSON
00:00:53.356 [Pipeline] }
00:00:53.375 [Pipeline] // catchError
00:00:53.385 [Pipeline] sh
00:00:53.668 + /var/jenkins/workspace/autotest-per-patch/jbp/jenkins/jjb-config/jobs/scripts/get-pkgdep-jobs.sh /var/jenkins/workspace/autotest-per-patch/spdk
00:00:53.686 [Pipeline] }
00:00:53.713 [Pipeline] // stage
00:00:53.731 [Pipeline] catchError
00:00:53.732 [Pipeline] {
00:00:53.752 [Pipeline] stage
00:00:53.754 [Pipeline] { (Pre tests)
00:00:53.791 [Pipeline] parallel
00:00:53.803 [Pipeline] { (Branch: check-format-docker-autotest)
00:00:53.805 [Pipeline] { (Branch: check-so-deps-docker-autotest)
00:00:53.806 [Pipeline] { (Branch: doc-docker-autotest)
00:00:53.808 [Pipeline] { (Branch: build-files-docker-autotest)
00:00:53.830 [Pipeline] retry
00:00:53.832 [Pipeline] {
00:00:53.834 [Pipeline] retry
00:00:53.835 [Pipeline] {
00:00:53.838 [Pipeline] retry
00:00:53.839 [Pipeline] {
00:00:53.842 [Pipeline] retry
00:00:53.843 [Pipeline] {
00:00:53.858 [Pipeline] build
00:00:53.860 Scheduling project: check-format-docker-autotest
00:00:53.865 [Pipeline] build
00:00:53.867 Scheduling project: check-so-deps-docker-autotest
00:00:53.871 [Pipeline] build
00:00:53.872 Scheduling project: doc-docker-autotest
00:00:53.877 [Pipeline] build
00:00:53.878 Scheduling project: build-files-docker-autotest
00:01:00.060 Starting building: doc-docker-autotest #26706
00:01:00.063 Starting building: check-format-docker-autotest #26512
00:01:00.066 Starting building: build-files-docker-autotest #26491
00:01:00.069 Starting building: check-so-deps-docker-autotest #26523
00:01:33.704 Build doc-docker-autotest #26706 completed: SUCCESS
00:01:33.706 [Pipeline] }
00:01:33.742 [Pipeline] // retry
00:01:33.748 [Pipeline] }
00:02:05.213 Build check-format-docker-autotest #26512 completed: FAILURE
00:02:05.229 [Pipeline] echo
00:02:05.230 No retry patterns found.
00:02:05.232 [Pipeline] }
00:02:05.265 [Pipeline] // retry
00:02:05.272 [Pipeline] error
00:02:05.278 [Pipeline] }
00:02:05.282 Failed in branch check-format-docker-autotest
00:03:45.018 Build build-files-docker-autotest #26491 completed: SUCCESS
00:03:45.021 [Pipeline] }
00:03:45.060 [Pipeline] // retry
00:03:45.069 [Pipeline] }
00:04:05.524 Build check-so-deps-docker-autotest #26523 completed: SUCCESS
00:04:05.526 [Pipeline] }
00:04:05.550 [Pipeline] // retry
00:04:05.553 [Pipeline] }
00:04:05.587 [Pipeline] // parallel
00:04:05.592 [Pipeline] }
00:04:05.618 [Pipeline] // stage
00:04:05.624 [Pipeline] }
00:04:05.627 ERROR: Build check-format-docker-autotest #26512 failed
00:04:05.627 Setting overall build result to FAILURE
00:04:05.644 [Pipeline] // catchError
00:04:05.652 [Pipeline] catchError
00:04:05.653 [Pipeline] {
00:04:05.668 [Pipeline] stage
00:04:05.670 [Pipeline] { (Tests)
00:04:05.685 [Pipeline] unstable
00:04:05.688 WARNING: Previous stages failed
00:04:05.689 [Pipeline] }
00:04:05.710 [Pipeline] // stage
00:04:05.715 [Pipeline] }
00:04:05.734 [Pipeline] // catchError
00:04:05.741 [Pipeline] stage
00:04:05.742 [Pipeline] { (Autorun Post and Coverage)
00:04:05.756 [Pipeline] setCustomBuildProperty
00:04:05.770 [Pipeline] dir
00:04:05.771 Running in /var/jenkins/workspace/autotest-per-patch/doc-docker-autotest_26706
00:04:05.772 [Pipeline] {
00:04:05.789 [Pipeline] copyArtifacts
00:04:06.105 Copied 5 artifacts from "doc-docker-autotest" build number 26706
00:04:06.108 [Pipeline] writeFile
00:04:06.128 [Pipeline] }
00:04:06.153 [Pipeline] // dir
00:04:06.167 [Pipeline] dir
00:04:06.168 Running in /var/jenkins/workspace/autotest-per-patch/check-format-docker-autotest_26512
00:04:06.169 [Pipeline] {
00:04:06.188 [Pipeline] copyArtifacts
00:04:06.230 Copied 4 artifacts from "check-format-docker-autotest" build number 26512
00:04:06.233 [Pipeline] writeFile
00:04:06.257 [Pipeline] }
00:04:06.287 [Pipeline] // dir
00:04:06.370 [Pipeline] dir
00:04:06.370 Running in /var/jenkins/workspace/autotest-per-patch/build-files-docker-autotest_26491
00:04:06.371 [Pipeline] {
00:04:06.390 [Pipeline] copyArtifacts
00:04:06.443 Copied 4 artifacts from "build-files-docker-autotest" build number 26491
00:04:06.447 [Pipeline] writeFile
00:04:06.495 [Pipeline] }
00:04:06.529 [Pipeline] // dir
00:04:06.592 [Pipeline] dir
00:04:06.592 Running in /var/jenkins/workspace/autotest-per-patch/check-so-deps-docker-autotest_26523
00:04:06.594 [Pipeline] {
00:04:06.610 [Pipeline] copyArtifacts
00:04:06.660 Copied 4 artifacts from "check-so-deps-docker-autotest" build number 26523
00:04:06.664 [Pipeline] writeFile
00:04:06.691 [Pipeline] }
00:04:06.721 [Pipeline] // dir
00:04:06.731 [Pipeline] catchError
00:04:06.732 [Pipeline] {
00:04:06.748 [Pipeline] sh
00:04:07.031 + jbp/jenkins/jjb-config/jobs/scripts/post_gen_coverage.sh
00:04:07.031 + shopt -s globstar nullglob
00:04:07.031 + echo 'Start stage post_gen_coverage.sh'
00:04:07.031 Start stage post_gen_coverage.sh
00:04:07.031 + cd /var/jenkins/workspace/autotest-per-patch
00:04:07.031 + rm -rf /var/jenkins/workspace/autotest-per-patch/spdk/doc
00:04:07.031 + trap 'compress_coverage_and_docs; remove_partial_coverage_files && echo '\''End stage post_gen_coverage.sh'\''' EXIT
00:04:07.031 + move_artifacts
00:04:07.031 + local out_dirs
00:04:07.031 + out_dirs=(./**/output/)
00:04:07.031 + for dir in "${out_dirs[@]}"
00:04:07.031 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:07.031 + [[ -f ./build-files-docker-autotest_26491/output//doc.tar.xz ]]
00:04:07.031 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:07.031 + [[ -f ./build-files-docker-autotest_26491/output//ut_coverage.tar.xz ]]
00:04:07.031 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:07.031 + [[ -f ./build-files-docker-autotest_26491/output//llvm.tar.xz ]]
00:04:07.031 + mv ./build-files-docker-autotest_26491/output//build-repo-manifest.txt ./build-files-docker-autotest_26491/output//power.tar.xz ./build-files-docker-autotest_26491/output//test_completions.txt ./build-files-docker-autotest_26491/output//timing.txt ./build-files-docker-autotest_26491/output//..
00:04:07.031 + rmdir ./build-files-docker-autotest_26491/output/
00:04:07.031 + for dir in "${out_dirs[@]}"
00:04:07.031 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:07.031 + [[ -f ./check-format-docker-autotest_26512/output//doc.tar.xz ]]
00:04:07.031 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:07.031 + [[ -f ./check-format-docker-autotest_26512/output//ut_coverage.tar.xz ]]
00:04:07.031 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:07.031 + [[ -f ./check-format-docker-autotest_26512/output//llvm.tar.xz ]]
00:04:07.031 + mv ./check-format-docker-autotest_26512/output//build-repo-manifest.txt ./check-format-docker-autotest_26512/output//power.tar.xz ./check-format-docker-autotest_26512/output//test_completions.txt ./check-format-docker-autotest_26512/output//timing.txt ./check-format-docker-autotest_26512/output//..
00:04:07.031 + rmdir ./check-format-docker-autotest_26512/output/
00:04:07.031 + for dir in "${out_dirs[@]}"
00:04:07.031 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:07.031 + [[ -f ./check-so-deps-docker-autotest_26523/output//doc.tar.xz ]]
00:04:07.031 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:07.031 + [[ -f ./check-so-deps-docker-autotest_26523/output//ut_coverage.tar.xz ]]
00:04:07.031 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:07.031 + [[ -f ./check-so-deps-docker-autotest_26523/output//llvm.tar.xz ]]
00:04:07.031 + mv ./check-so-deps-docker-autotest_26523/output//build-repo-manifest.txt ./check-so-deps-docker-autotest_26523/output//power.tar.xz ./check-so-deps-docker-autotest_26523/output//test_completions.txt ./check-so-deps-docker-autotest_26523/output//timing.txt ./check-so-deps-docker-autotest_26523/output//..
00:04:07.031 + rmdir ./check-so-deps-docker-autotest_26523/output/
00:04:07.031 + for dir in "${out_dirs[@]}"
00:04:07.031 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:07.031 + [[ -f ./doc-docker-autotest_26706/output//doc.tar.xz ]]
00:04:07.031 + tar -C ./doc-docker-autotest_26706/output/ -xf ./doc-docker-autotest_26706/output//doc.tar.xz
00:04:07.289 + rm ./doc-docker-autotest_26706/output//doc.tar.xz
00:04:07.289 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:07.289 + [[ -f ./doc-docker-autotest_26706/output//ut_coverage.tar.xz ]]
00:04:07.289 + for archive in "${dir}"/{doc,ut_coverage,llvm}.tar.xz
00:04:07.289 + [[ -f ./doc-docker-autotest_26706/output//llvm.tar.xz ]]
00:04:07.289 + mv ./doc-docker-autotest_26706/output//build-repo-manifest.txt ./doc-docker-autotest_26706/output//doc ./doc-docker-autotest_26706/output//power.tar.xz ./doc-docker-autotest_26706/output//test_completions.txt ./doc-docker-autotest_26706/output//timing.txt ./doc-docker-autotest_26706/output//..
00:04:07.289 + rmdir ./doc-docker-autotest_26706/output/
00:04:07.289 + unpack_cov_files
00:04:07.289 + local info_files
00:04:07.289 + info_files=(*/cov_*.info.xz)
00:04:07.289 + printf '%s\n'
00:04:07.289 + xargs -P0 -r -n1 xz -d
00:04:07.289 + fix_downstream_job_paths
00:04:07.290 + sed -i -e 's#^SF:/.\+/spdk/#SF:/var/jenkins/workspace/autotest-per-patch/spdk/#g'
00:04:07.290 sed: no input files
00:04:07.290 + compress_coverage_and_docs
00:04:07.290 + echo 'Start compress coverage and docs'
00:04:07.290 Start compress coverage and docs
00:04:07.290 + tar -C coverage -czf coverage_autotest-per-patch_126111.tar.gz ./ --remove-files
00:04:07.290 tar: coverage: Cannot open: No such file or directory
00:04:07.290 tar: Error is not recoverable: exiting now
00:04:07.306 [Pipeline] }
00:04:07.310 ERROR: script returned exit code 2
00:04:07.337 [Pipeline] // catchError
00:04:07.346 [Pipeline] catchError
00:04:07.348 [Pipeline] {
00:04:07.365 [Pipeline] dir
00:04:07.366 Running in /var/jenkins/workspace/autotest-per-patch/post_process
00:04:07.367 [Pipeline] {
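The "script returned exit code 2" error in the trace above is a knock-on effect rather than a coverage bug: none of the four pre-test sub-builds shipped a ut_coverage.tar.xz or llvm.tar.xz archive, so no coverage/ directory was ever created, sed was given no cov_*.info files ("sed: no input files"), and tar -C coverage aborted with exit code 2 inside the EXIT trap. Below is a hedged sketch of a guarded compress_coverage_and_docs for that empty case; the function name comes from the trace, but the body is inferred, not the project's actual script, and the JOB_NAME/BUILD_NUMBER archive suffix is an assumption standing in for the "autotest-per-patch_126111" seen in the log.

# Sketch only: compress_coverage_and_docs as inferred from the set -x trace,
# with a guard so a run without coverage data does not fail the stage.
compress_coverage_and_docs() {
    echo 'Start compress coverage and docs'
    if [[ -d coverage ]]; then
        # JOB_NAME and BUILD_NUMBER are standard Jenkins environment variables,
        # used here as a stand-in for the archive name seen in the trace.
        tar -C coverage -czf "coverage_${JOB_NAME:-job}_${BUILD_NUMBER:-0}.tar.gz" ./ --remove-files
    else
        echo 'No coverage data collected, skipping coverage archive'
    fi
    # Doc handling is likewise assumed from the function name; the traced run
    # aborted before reaching it.
    if [[ -d doc ]]; then
        tar -czf doc.tar.gz doc --remove-files
    fi
}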