Description
This is a follow-up to #128 (comment).
My current setup runs
- Coraza 3.3.3
- on Caddy 2.10.0,
with Caddy built from scratch using xcaddy in an Ubuntu 24.04 environment.
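For reference, the build looked roughly like this (a sketch; the plugin module path is from memory, so treat it as an assumption):

```sh
# Sketch of the xcaddy build; the coraza-caddy module path is assumed, not verified here.
xcaddy build v2.10.0 \
  --with github.com/corazawaf/coraza-caddy/v2
```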
Under "some certain circumstances", I have the issue that the Coraza firewall prematurely closes the client connection, logging the error message
WARN http.handlers.reverse_proxy aborting with incomplete response {"upstream": "localhost:3001",
"duration": 0.000945512, "request": {"remote_ip": "::1", "remote_port": "49520", "client_ip": "::1", "proto": "HTTP/1.1",
"method": "GET", "host": "localhost:3000", "uri": "/", "headers": {"User-Agent": ["curl/8.5.0"],
"Accept": ["*/*"], "X-Forwarded-For": ["::1"], "X-Forwarded-Proto": ["http"], "X-Forwarded-Host": ["localhost:3000"],
"Via": ["1.1 Caddy"]}}, "error": "short write"}
on requests, which are "a little bit larger than the usual rest".
I was able to reduce the case to a minimal reproducible state, which goes like this:
Configuration of Caddy:

{
    order coraza_waf first
}

:3000 {
    coraza_waf {
        directives `
            Include crs/coraza.conf
            Include crs/crs-setup.conf
            Include crs/@owasp_crs/*.conf
        `
    }

    log stdout_accesslog {
        output stdout
        format console
    }

    reverse_proxy localhost:3001
}
Configuration of Coraza:

SecRuleEngine DetectionOnly
SecRequestBodyAccess On
SecRequestBodyLimit 131072
SecRequestBodyInMemoryLimit 131072
SecRequestBodyLimitAction ProcessPartial
SecResponseBodyAccess On
SecResponseBodyMimeType application/json text/json
SecResponseBodyLimit 2048
SecResponseBodyLimitAction ProcessPartial
Given that SecRuleEngine DetectionOnly, SecRequestBodyLimitAction ProcessPartial, and SecResponseBodyLimitAction ProcessPartial are set, the WAF is expected never to break a connection prematurely (i.e. it should never issue a 502): protection is disabled and the WAF runs in report-only mode.
To host a backend on port 3001, I used various files with

cat filenamegoeshere | nc -l 3001

to inject controlled responses (and to rule out any odd handling by a real backend that might confuse Coraza).
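Since nc -l serves a single connection and then exits, I restarted it between runs; a small loop does the same (a sketch; some netcat variants want -p before the port number):

```sh
# Re-serve the canned response for every incoming connection.
# Some netcat variants require `nc -l -p 3001` instead of `nc -l 3001`.
while true; do
  cat redacted.log.reduced4k | nc -l 3001
done
```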
To trigger the request, I used

curl -v --raw http://localhost:3000/

(--raw disables curl's internal decoding of content and transfer encodings, which likewise rules out any odd client-side handling that might confuse Caddy).
Depending on the file, the test either completed properly or produced the error message above. My aim was to find two publishable, minimally different files where the first fails and the second still goes through.
I have packaged the two files into redacted.log.reduced4k.zip:
- redacted.log.reduced4k causes the error to occur.
- redacted.log.reduced4k.stillworking does not cause the error to occur.
These two files differ by exactly one byte: the second chunk of redacted.log.reduced4k.stillworking is exactly one byte shorter (it is missing the second-to-last byte of the chunk). Any response I found that is longer than redacted.log.reduced4k (e.g. one containing more chunks) also triggers the error.
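For context, the files are raw HTTP/1.1 responses using chunked transfer encoding, so "one byte shorter" affects both the chunk body and its hexadecimal size line; schematically (illustrative shape only, not the actual file contents):

```
HTTP/1.1 200 OK
Transfer-Encoding: chunked

<size-in-hex>
<first chunk data>
<size-in-hex>
<second chunk data; one byte shorter in the .stillworking file, so its size line differs too>
0

```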
The combined response body size is 11,556 bytes.
The error is reproducible 100% of the time, so it is debuggable.
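Putting it together, one full reproduction run looks like this (a sketch assuming Caddy is already running with the configuration above and the files sit in the current directory):

```sh
# Serve the canned response once in the background, then fire the request.
cat redacted.log.reduced4k | nc -l 3001 &
sleep 1                                # give nc a moment to bind the port
curl -v --raw http://localhost:3000/   # Caddy's log should now show "short write"
```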
Originally, I could only reproduce the issue with much larger files; I managed to shrink the responses significantly by decreasing the value of SecResponseBodyLimit. However, that no longer holds: decreasing the limit any further does not change the failure behavior.
As I am not very deep into either Coraza or Golang in general, could one of you have a look at this bug?
Thanks!
PS: Before there is an "I-am-important-security" uproar: the response body might look like sensitive data, but it isn't; it is purely invented/generated data with no real-world counterpart.