Fix proxy panic when origin returns compressed HTML/CSS responses #180
base: feature/auction-integrations
Conversation
I think in the context of this we also need to think about the possibility of running out of memory when using take_body_str() for larger responses. That will kill the request and return a 503.
🤔 We have functions to deal with proxying compressed content from publisher domains. See publisher.rs
I think it's best to try to use the existing functions and refactor to be consistent for proxying any content from either the creative or publisher domain.
I agree on being consistent and I'm totally fine using the existing functions. I'm just not sure I've ever really considered a super large pub domain response (I don't see where creative could overflow the memory allocation). Mostly I wanted to clarify that Fastly Compute has a 128MB memory limit by default (it can be increased), and while there's not much HTML in a single request that would hit this, there may be buffering happening elsewhere trying to hold / pull the response together?
aram356 left a comment:
Good enough for now if you address the changes. I think there is still an opportunity to refactor, so please create another ticket to consolidate the proxying logic from publisher and creative.
impl StreamProcessor for CreativeCssProcessor {
    fn process_chunk(&mut self, chunk: &[u8], is_last: bool) -> Result<Vec<u8>, io::Error> {
        self.buffer.extend_from_slice(chunk);
🔧 Let's do a size check:
if self.buffer.len() + chunk.len() > MAX_REWRITABLE_BODY_SIZE
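A minimal sketch of that guard, assuming a crate-level MAX_REWRITABLE_BODY_SIZE constant; the value and the error kind here are placeholders, not the PR's actual code:

use std::io;

const MAX_REWRITABLE_BODY_SIZE: usize = 2 * 1024 * 1024; // assumed 2 MiB cap

struct CreativeCssProcessor {
    buffer: Vec<u8>,
}

impl CreativeCssProcessor {
    fn process_chunk(&mut self, chunk: &[u8], _is_last: bool) -> Result<Vec<u8>, io::Error> {
        // Fail before copying, so an oversized origin body cannot grow the
        // buffer past the limit and exhaust the Compute guest's memory.
        if self.buffer.len() + chunk.len() > MAX_REWRITABLE_BODY_SIZE {
            return Err(io::Error::new(
                io::ErrorKind::InvalidData,
                "rewritable body exceeds MAX_REWRITABLE_BODY_SIZE",
            ));
        }
        self.buffer.extend_from_slice(chunk);
        Ok(Vec::new()) // this sketch emits output only on the final chunk
    }
}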
/// Create a new HTML processor with the given settings.
pub fn new(settings: &Settings) -> Self {
    Self {
        settings: settings.clone(),
🔧 Clone is expensive. Do we need all settings?
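Two possible shapes for avoiding the deep clone, sketched under the assumption that Settings is the crate's config struct; the field names are hypothetical:

use std::sync::Arc;

// Stand-in for the crate's existing config struct.
pub struct Settings {
    pub first_party_host: String, // hypothetical field
    pub rewrite_css: bool,        // hypothetical field
}

// Option A: share one allocation instead of deep-cloning per processor.
pub struct HtmlProcessor {
    settings: Arc<Settings>,
}

// Option B: copy out only the fields the rewriter actually reads.
pub struct HtmlProcessorSlim {
    first_party_host: String,
    rewrite_css: bool,
}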
let config = PipelineConfig {
    input_compression: compression,
    output_compression: compression, // Preserve compression
    chunk_size: 8192,
⛏️ Should be const
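For example (the name is a suggestion, not the PR's):

// Hypothetical name; any module-level const avoids the magic number.
const PIPELINE_CHUNK_SIZE: usize = 8192;

// ...then at the call site:
// chunk_size: PIPELINE_CHUNK_SIZE,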
    ));
}

if ct.contains("text/css") {
⛏️ This code inside the if statement and the code starting on line 180 look similar
let ct = ct_raw.to_ascii_lowercase();
let compression = Compression::from_content_encoding(&content_encoding);

if ct.contains("text/html") {
⛏️ This code inside the if statement and the code starting on line 206 look similar
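One hedged sketch of the consolidation both comments point at, assuming the two branches differ only in which rewriter they build (RewriteKind and run_pipeline are hypothetical names):

enum RewriteKind { Html, Css }

fn rewrite_kind(ct: &str) -> Option<RewriteKind> {
    // Single place that decides whether a content type is rewritable.
    if ct.contains("text/html") {
        Some(RewriteKind::Html)
    } else if ct.contains("text/css") {
        Some(RewriteKind::Css)
    } else {
        None
    }
}

// Both former branches then collapse to one call site:
// if let Some(kind) = rewrite_kind(&ct) {
//     run_pipeline(kind, compression, resp)?;
// }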
Summary
Fixes a pre-existing bug where the first-party proxy panics when the origin server returns compressed (gzip, brotli, zstd) HTML or CSS content. This bug surfaced when mocktioneer was deployed to our Coolify instance, which serves responses with Content-Encoding: zstd or Content-Encoding: br when browsers send Accept-Encoding: gzip, deflate, br, zstd.
Problem
The proxy forwards the client's Accept-Encoding header to the origin server. When a browser requests content, it sends:
Accept-Encoding: gzip, deflate, br, zstd
The origin respects this and returns compressed content. However, finalize_proxied_response calls take_body_str() directly on the response body, which assumes the bytes are valid UTF-8; on a gzip/brotli/zstd-encoded body they are not, and the call panics.
This worked in curl testing (which doesn't send Accept-Encoding by default) but failed 100% of the time in browsers.
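To make the failure mode concrete, here is a minimal sketch using the fastly crate; the content-encoding handling is illustrative, not the PR's exact code:

use fastly::Response;

// Buggy pattern: take_body_str() panics when the body is not valid UTF-8,
// which is almost guaranteed for gzip/br/zstd-encoded bytes.
fn finalize_buggy(mut resp: Response) -> String {
    resp.take_body_str()
}

// Safer sketch: only treat the body as text when it is unencoded, and hand
// encoded bytes off to a decompression path instead.
fn finalize_safer(mut resp: Response) -> Result<String, Vec<u8>> {
    let encoding = resp.get_header_str("content-encoding").map(str::to_owned);
    match encoding.as_deref() {
        None | Some("identity") => Ok(resp.take_body_str()),
        Some(_) => Err(resp.take_body_bytes()), // route to the pipeline
    }
}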
Solution
Rewritable responses now go through the streaming pipeline shown in the diff excerpts above: the body is decompressed according to its Content-Encoding, rewritten via the StreamProcessor implementations, and re-compressed with the original encoding (PipelineConfig preserves input_compression as output_compression), so compressed bytes are never handed to take_body_str().
Future Improvements
@aram356 should we add zstd support and replace the panicking functions?