Hi Maxim,

thank you very much for the review, and apologies for the late
response; I have kept postponing it.

Maxim Cournoyer writes:

> Hi Tomas,
>
> Tomas Volf <~@wolfsden.cz> writes:
>
>> test_ssl does sometimes hang (at least when executed under faketime).
>> It is somewhat unlikely to happen, and (on my machine) required a
>> build with --rounds=32 to reproduce it.
>
> It'd be nice if upstream was made aware of this problem.  Perhaps they
> could come up with a fix for good.

My experience with reporting bugs to this specific upstream is not the
best, and I admit I expect the answer to "it hangs under faketime"
would be "do not run it under faketime".

>
>> The workaround is to set a somewhat lower timeout of 240s (expected
>> test duration * 5, rounded up to whole minutes) and to retry a few
>> times on failure.  In this way, --rounds=64 finished successfully
>> (on my machine).
>>
>> At the same time, remove the timeout from the other tests, since it
>> is not necessary (they do not hang), and one of them runs for ~270s
>> (almost half the original timeout), so it could pose a problem on a
>> slow or overloaded machine.
>
> This means the tests may take up to 20 minutes, which is a bit too
> much to my taste.

During validation of the v2, the run time of the check phase was:

--8<---------------cut here---------------start------------->8---
phase `check' succeeded after 811.0 seconds
--8<---------------cut here---------------end--------------->8---

That seems fine?  We already have software that takes a similar time
(or even longer) to build.

>
> [...]
>
>> -          ;; Note: The test_ssl test times out in the ci.
>> -          ;; Temporarily disable it until that is resolved.
>> -          ;; (invoke "faketime" "2022-10-24"
>> -          ;;         "ctest"
>> -          ;;         "-R" "^test_ssl$"
>> -          ;;         "-j" jobs
>> -          ;;         "--timeout" timeout
>> -          ;;         "--output-on-failure")
>> -          )))))))))
>> +          (invoke "faketime" "2022-10-24"
>> +                  "ctest"
>> +                  "-R" "^test_ssl$"
>> +                  "-j" jobs
>> +                  ;; test_ssl sometimes hangs (at least when run
>> +                  ;; under faketime), therefore set a time limit
>> +                  ;; and retry a few times on failure.
>> +                  "--timeout" "240"
>> +                  "--repeat" "until-pass:5"
>> +                  "--output-on-failure"))))))))))
>
> I think that a test sometimes hanging is a good reason to leave it
> disabled, report it to upstream, and reference the issue.  The test
> can be re-enabled when the issue is resolved and part of a new
> release.
>
> So in concrete terms, what I'd rather see here is a report of the
> problems (requirement on faketime + propensity to hang) to upstream,
> and an updated comment cross-referencing it (with the test kept
> commented-out/disabled in the mean time).
>
> Does that make sense?

As mentioned above, I do not expect upstream to care about this
specific issue (faketime).

However, what I was successful in was convincing upstream to vastly
increase the lifetime of the bundled SSL certificates used for
testing.  That means that the new version (2.0.11) will run the tests
fine until roughly the end of the year 2297.  Honestly, I consider
that an acceptable deadline to find a better long-term solution (for
example, we could explore the use of the time namespace for the build
environment).

So I have submitted a v2 which updates to the new version and removes
all the special casing for test_ssl, since it should not be required
for more than a quarter of a millennium.

I hope this approach is acceptable.

Have a nice day,
Tomas

--
There are only two hard things in Computer Science: cache
invalidation, naming things and off-by-one errors.
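[Editorial note: the retry strategy encoded in the ctest flags from the
patch ("--timeout 240" plus "--repeat until-pass:5") can be sketched in
plain POSIX shell outside a Guix build.  `run_test` below is a
hypothetical stand-in for `ctest -R '^test_ssl$'`; here it is rigged to
fail twice before passing, mimicking an occasionally flaky test.]

```shell
#!/bin/sh
# Sketch of the retry-until-pass strategy, assuming a flaky test that
# fails on its first two runs.  A real setup would also wrap each
# attempt with a time limit (as ctest's --timeout 240 does) to cut
# short a hang; the stand-in here returns immediately, so no limit is
# needed.
attempts=0
run_test() {
  attempts=$((attempts + 1))
  [ "$attempts" -ge 3 ]   # pass on the third attempt
}

status=1
for i in 1 2 3 4 5; do    # at most 5 tries, like --repeat until-pass:5
  if run_test; then
    status=0
    break
  fi
done
echo "attempts=$attempts status=$status"
```

With this stand-in the loop prints "attempts=3 status=0": two failures,
then a pass within the five allowed tries, which is what ctest's
`--repeat until-pass:5` does internally for each failing test.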