While putting together #1513, I discovered that cargo test would report random tests failing.
The common factor seems to be that they are all functional tests, and the errors I've observed are either "timeout waiting for port", "Address already in use", or "ConnectionRefused", which very much suggests the tests are racing with the server they depend on and with each other.
Here are some examples, each from a different run:
failures:
---- auth_file_accepts::case_1 stdout ----
[miniserve stdout] miniserve v0.31.0
[miniserve stderr] Sun, 03 Aug 2025 04:06:19 +0000 [ERROR] Failed to bind server to [::]:33675
[miniserve stderr] Sun, 03 Aug 2025 04:06:19 +0000 [ERROR] caused by: Address already in use (os error 98)
[miniserve stderr] Error: Failed to bind server to [::]:33675
[miniserve stderr] caused by: Address already in use (os error 98)
thread 'auth_file_accepts::case_1' panicked at tests/fixtures/mod.rs:179:13:
timeout waiting for port 33675
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
failures:
auth_file_accepts::case_1
test result: FAILED. 5 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out; finished in 2.03s
---- bind_ipv4_ipv6::case_2 stdout ----
[miniserve stdout] miniserve v0.31.0
[miniserve stdout] Sun, 03 Aug 2025 04:09:32 +0000 [INFO] starting 12 workers
[miniserve stdout] Bound to [::]:43859
[miniserve stdout] Serving path /tmp/.tmpfnYGhM
[miniserve stdout] Available at (non-exhaustive list):
[miniserve stdout] http://[::1]:43859
[miniserve stdout]
[miniserve stdout] Sun, 03 Aug 2025 04:09:32 +0000 [INFO] Actix runtime found; starting in Actix runtime
[miniserve stdout] Sun, 03 Aug 2025 04:09:32 +0000 [INFO] starting service: "actix-web-service-[::]:43859", workers: 12, listening on: [::]:43859
thread 'bind_ipv4_ipv6::case_2' panicked at tests/fixtures/mod.rs:179:13:
timeout waiting for port 43859
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
failures:
bind_ipv4_ipv6::case_2
test result: FAILED. 12 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out; finished in 2.04s
...and with my PR (#1513), which added a minuscule delay to program startup by performing two network requests (one for IPv4, one for IPv6) to a "What is my public IP address?" service, validate_printed_urls started failing with this output:
failures:
---- validate_printed_urls::case_2 stdout ----
Error: reqwest::Error { kind: Request, url: "http://[::1]:35471/", source: hyper_util::client::legacy::Error(Connect, ConnectError("tcp connect error", [::1]:35471, Os { code: 111, kind: ConnectionRefused, message: "Connection refused" })) }
---- validate_printed_urls::case_5 stdout ----
Error: reqwest::Error { kind: Request, url: "http://127.0.0.1:38693/", source: hyper_util::client::legacy::Error(Connect, ConnectError("tcp connect error", 127.0.0.1:38693, Os { code: 111, kind: ConnectionRefused, message: "Connection refused" })) }
---- validate_printed_urls::case_3 stdout ----
Error: reqwest::Error { kind: Request, url: "http://127.0.0.1:35529/", source: hyper_util::client::legacy::Error(Connect, ConnectError("tcp connect error", 127.0.0.1:35529, Os { code: 111, kind: ConnectionRefused, message: "Connection refused" })) }
---- validate_printed_urls::case_4 stdout ----
Error: reqwest::Error { kind: Request, url: "http://127.0.0.1:33055/", source: hyper_util::client::legacy::Error(Connect, ConnectError("tcp connect error", 127.0.0.1:33055, Os { code: 111, kind: ConnectionRefused, message: "Connection refused" })) }
---- validate_printed_urls::case_1 stdout ----
Error: reqwest::Error { kind: Request, url: "http://127.0.0.1:41519/", source: hyper_util::client::legacy::Error(Connect, ConnectError("tcp connect error", 127.0.0.1:41519, Os { code: 111, kind: ConnectionRefused, message: "Connection refused" })) }
---- validate_printed_urls::case_6 stdout ----
Error: reqwest::Error { kind: Request, url: "http://127.0.0.1:43281/7f86b7", source: hyper_util::client::legacy::Error(Connect, ConnectError("tcp connect error", 127.0.0.1:43281, Os { code: 111, kind: ConnectionRefused, message: "Connection refused" })) }
---- validate_printed_urls::case_7 stdout ----
Error: reqwest::Error { kind: Request, url: "http://127.0.0.1:41225/prefix", source: hyper_util::client::legacy::Error(Connect, ConnectError("tcp connect error", 127.0.0.1:41225, Os { code: 111, kind: ConnectionRefused, message: "Connection refused" })) }
---- bind_ipv4_ipv6::case_2 stdout ----
[miniserve stdout] miniserve v0.31.0
[miniserve stdout] Sun, 03 Aug 2025 03:53:46 +0000 [INFO] starting 12 workers
[miniserve stdout] Bound to [::]:33347
[miniserve stdout] Serving path /tmp/.tmph1gmL1
[miniserve stdout] Available at (non-exhaustive list):
[miniserve stdout] http://[::1]:33347
[miniserve stdout]
[miniserve stdout] Depending on firewall/NAT settings, may be publicly accessible at:
[miniserve stdout] http://[HAND-REDACTED BECAUSE MY ACTUAL PUBLIC IP]:33347
[miniserve stdout]
[miniserve stdout] Sun, 03 Aug 2025 03:53:46 +0000 [INFO] Actix runtime found; starting in Actix runtime
[miniserve stdout] Sun, 03 Aug 2025 03:53:46 +0000 [INFO] starting service: "actix-web-service-[::]:33347", workers: 12, listening on: [::]:33347
thread 'bind_ipv4_ipv6::case_2' panicked at tests/fixtures/mod.rs:179:13:
timeout waiting for port 33347
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
failures:
bind_ipv4_ipv6::case_2
validate_printed_urls::case_1
validate_printed_urls::case_2
validate_printed_urls::case_3
validate_printed_urls::case_4
validate_printed_urls::case_5
validate_printed_urls::case_6
validate_printed_urls::case_7
test result: FAILED. 5 passed; 8 failed; 0 ignored; 0 measured; 0 filtered out; finished in 2.03s