From 56aad83307e46983a397236bd0959e634207f83e Mon Sep 17 00:00:00 2001
From: fanquake
Date: Thu, 3 Oct 2024 10:28:06 +0100
Subject: [PATCH] ci: set a ctest timeout of 1200 (20 minutes)

This should be long enough (with headroom) for our longest running
tests, which, even under MSAN, TSAN, Valgrind, etc., max out at about
800s. For example, under Valgrind I see the longest runtimes as:
```bash
135/136 Test   #8: bench_sanity_check_high_priority .....   Passed  371.19 sec
136/136 Test #122: coinselector_tests ...................   Passed  343.39 sec
```

In the CI, `tests` under TSAN:
```bash
tests ................................   Passed  795.20 sec
```

and MSAN:
```bash
tests ................................   Passed  658.48 sec
```

This will also prevent the current issue we are seeing of `ctest`
running until it reaches the CI timeout (see #30969). However, we still
need to figure out what underlying issue is causing the tests to
(sometimes) run for so long; in the meantime, this will stop `ctest`
from wasting our CI CPU.
---
 ci/test/03_test_script.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/ci/test/03_test_script.sh b/ci/test/03_test_script.sh
index 1c1b5fa545b..9cc0e8e864a 100755
--- a/ci/test/03_test_script.sh
+++ b/ci/test/03_test_script.sh
@@ -146,7 +146,7 @@ if [ "$RUN_CHECK_DEPS" = "true" ]; then
 fi
 
 if [ "$RUN_UNIT_TESTS" = "true" ]; then
-  DIR_UNIT_TEST_DATA="${DIR_UNIT_TEST_DATA}" LD_LIBRARY_PATH="${DEPENDS_DIR}/${HOST}/lib" CTEST_OUTPUT_ON_FAILURE=ON ctest "${MAKEJOBS}"
+  DIR_UNIT_TEST_DATA="${DIR_UNIT_TEST_DATA}" LD_LIBRARY_PATH="${DEPENDS_DIR}/${HOST}/lib" CTEST_OUTPUT_ON_FAILURE=ON ctest "${MAKEJOBS}" --timeout $((TEST_RUNNER_TIMEOUT_FACTOR * 30 ))
 fi
 
 if [ "$RUN_UNIT_TESTS_SEQUENTIAL" = "true" ]; then
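
Note on the arithmetic: a minimal sketch of how the per-test limit in the changed line works out, assuming a default `TEST_RUNNER_TIMEOUT_FACTOR` of 40 (an assumption for illustration; the real default is set by the CI environment scripts, not by this patch). With that factor, the multiplier of 30 yields the 1200 second / 20 minute ceiling named in the subject:

```bash
# Sketch only: illustrates the timeout arithmetic used by the patched line.
# ASSUMPTION: TEST_RUNNER_TIMEOUT_FACTOR defaults to 40 in the CI env scripts.
TEST_RUNNER_TIMEOUT_FACTOR="${TEST_RUNNER_TIMEOUT_FACTOR:-40}"

# 40 * 30 = 1200 seconds, i.e. a 20 minute ceiling per test.
echo "ctest per-test timeout: $((TEST_RUNNER_TIMEOUT_FACTOR * 30)) seconds"

# The patched invocation then looks like (values from the diff above):
# ctest "${MAKEJOBS}" --timeout $((TEST_RUNNER_TIMEOUT_FACTOR * 30))
```

Since ctest's `--timeout` applies per test rather than to the whole run, slow but still progressing suites such as the 795s TSAN `tests` run stay well under the ceiling, while a test that hangs is killed after 20 minutes instead of running into the CI-level timeout.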