Conversation

@mholt
Copy link
Member

@mholt mholt commented Feb 8, 2021

Adds 4 placeholders; one is actually outside reverse_proxy, though:

{http.request.duration} is how long it has been since the server decoded the HTTP request (headers).
{http.reverse_proxy.upstream.latency} is how long it took the proxy's upstream to write the response header.
{http.reverse_proxy.upstream.duration} is the total time spent proxying to the upstream, including writing the response body to the client.
{http.reverse_proxy.duration} is the total time spent proxying, including selecting an upstream and any retries.

Obviously, most of these are only useful at the end of a request, such as when writing response headers or logs.

See also: https://caddy.community/t/any-equivalent-of-request-time-and-upstream-header-time-from-nginx/11418

Closes #4012

@mholt mholt added this to the v2.4.0 milestone Feb 8, 2021
@mholt mholt self-assigned this Feb 8, 2021
@mholt mholt added the under review 🧐 Review is pending before merging label Feb 8, 2021
@francislavoie
Copy link
Member

Should add these to the godocs, right?

@mholt
Copy link
Member Author

mholt commented Feb 8, 2021

Yeah, good point. Doing that now.

@MarioIshac
Copy link

MarioIshac commented Feb 12, 2021

Alright, this is looking great. Given this caddy.json, I've successfully received {http.request.duration} and {http.reverse_proxy.upstream.latency} as headers. And, as expected, {http.reverse_proxy.upstream.duration} and {http.reverse_proxy.duration} are not available at header time because Caddy has already sent the response by then.

Note that "buffer_requests": true was required to make {http.reverse_proxy.upstream.latency} independent of the client's upload time. When Caddy doesn't buffer a request (the default), it streams the request bytes to the upstream, meaning the upstream is contacted (and the {http.reverse_proxy.upstream.latency} clock starts) before the client has finished sending the request. In my case, this made the upstream latency look large (roughly equal to the request duration, in fact) even though it was my client's fault for being a slow uploader.
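For reference, a minimal caddy.json sketch along the lines of what is described above (the listen address, upstream dial address, and header names are hypothetical; "deferred": true asks the headers handler to set the response headers at write time, after the proxy has run, rather than when the route is first matched):

```json
{
  "apps": {
    "http": {
      "servers": {
        "srv0": {
          "listen": [":8080"],
          "routes": [
            {
              "handle": [
                {
                  "handler": "headers",
                  "response": {
                    "deferred": true,
                    "set": {
                      "X-Request-Duration": ["{http.request.duration}"],
                      "X-Upstream-Latency": ["{http.reverse_proxy.upstream.latency}"]
                    }
                  }
                },
                {
                  "handler": "reverse_proxy",
                  "buffer_requests": true,
                  "upstreams": [{ "dial": "localhost:9000" }]
                }
              ]
            }
          ]
        }
      }
    }
  }
}
```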

What if the {http.reverse_proxy.upstream.*} clocks started after the last byte of the request was sent upstream, instead of the first? Would that make them independent of buffer_requests? I don't have a strong opinion here, because I'm not sure what the implications would be.

@mholt
Copy link
Member Author

mholt commented Feb 12, 2021

@MarioIshac Great, thanks for trying it out.

What if {http.reverse_proxy.upstream.*} started going after the last byte of the request finished being sent upstream, instead of the first byte?

I'm not sure that's a reasonable assumption we can make: AFAIK there's no reason a backend can't start sending the response before it has finished reading the request, and in fact I'm pretty sure quite a few servers work this way (and this has been the cause of some bugs in the Go standard library in the past).

In my case, this caused the upstream latency to look big (and actually roughly equal the request duration) even though it was my client's fault for being slow uploader.

I think that's actually good information from these placeholders -- the small difference between the two makes it easy to see that the client's upload accounts for the bulk of the request time. But this deduction requires knowing how your backend works: a backend will either wait to read the entire body before writing a response, or it will start writing a response before the body has been fully read.

@francislavoie
Copy link
Member

I figure we can merge this?

@mholt
Copy link
Member Author

mholt commented Feb 15, 2021

Is the implementation of {http.reverse_proxy.upstream.latency} satisfactory though?

@MarioIshac
Copy link

I'm currently using the build artifact from this PR in prod, and it's been working well. Let me know if there's anything more I can do to help get this merged, though I don't have the expertise to comment on the implementation itself.

@francislavoie
Copy link
Member

I think the only open question was whether we agree on how .latency behaves, but if Matt's answer is satisfactory, I think we can merge it.

Thanks!

@MarioIshac
Copy link

Matt's answer makes sense to me, especially after I looked into what types of servers would send a response back before the request is complete (sending "unauthorized" instead of waiting for a file upload to finish, for example). Because this is an implementation detail of the backend, the backend itself can report the time it received the last byte of the request, if desired, outside of Caddy.

@francislavoie
Copy link
Member

I think this might resolve some of the concerns in #2268 as well @mholt?

@mholt
Copy link
Member Author

mholt commented Feb 22, 2021

@francislavoie Hmm, yes I believe it does. Between these 4 placeholders, they should be able to get the info they need. Thanks for linking that up.

I'll go ahead and merge this then. Thank you both!

@mholt mholt removed the under review 🧐 Review is pending before merging label Feb 22, 2021
@mholt mholt merged commit 6722ae3 into master Feb 22, 2021
@mholt mholt deleted the time-placeholders branch February 22, 2021 18:57
Successfully merging this pull request may close these issues.

Feature Request: Placeholders (at header time) for request time & upstream time.