Description
Is there an existing issue for this?
- I have checked for existing issues https://github.com/getsentry/sentry-javascript/issues
- I have reviewed the documentation https://docs.sentry.io/
- I am using the latest SDK release https://github.com/getsentry/sentry-javascript/releases
How do you use Sentry?
Sentry Saas (sentry.io)
Which SDK are you using?
@sentry/aws-serverless
SDK Version
10.15.0
Framework Version
No response
Link to Sentry event
No response
Reproduction Example/SDK Setup
https://github.com/callumgare/sentry-lambda-http-proxy-issue/tree/http-proxy-issue (note: this is not the default branch, so after cloning you will also need to check out the "http-proxy-issue" branch)
Steps to Reproduce
Deploy the stack in the reproduction repo above using the instructions in the README.
There should be 2 Lambda URLs printed in the console. Both Lambdas are identical except that one loads "@sentry/aws-serverless" from Sentry's AWS Lambda layer and the other from npm. Send a request to each and observe whether the resulting error is received by Sentry.
Expected Result
Both Lambdas should result in Sentry receiving an error.
Actual Result
The Lambda that is set up to load "@sentry/aws-serverless" from the Sentry AWS Lambda layer does not successfully send the thrown error to Sentry. However, the other Lambda, which uses "@sentry/aws-serverless" installed via npm instead of via a Lambda layer, does send the error.
In the CloudWatch logs for the layer-based Lambda, the following is output:
Sentry Logger [log]: [https-proxy-agent] Creating `net.Socket`: %o { ALPNProtocols: [ 'http/1.1' ], host: '10.0.138.88', port: 3128 }
Sentry Logger [log]: [https-proxy-agent:parse-proxy-response] got proxy server response: %o %o HTTP/1.1 403 Forbidden {
server: 'squid/6.13',
'mime-version': '1.0',
date: 'Mon, 29 Sep 2025 03:23:10 GMT',
'content-type': 'text/html;charset=utf-8',
'content-length': '3490',
'x-squid-error': 'ERR_ACCESS_DENIED 0',
vary: 'Accept-Language',
'content-language': 'en',
'cache-status': 'ip-10-0-138-88.ap-southeast-2.compute.internal',
via: '1.1 ip-10-0-138-88.ap-southeast-2.compute.internal (squid/6.13)',
connection: 'close'
}
Sentry Logger [log]: [https-proxy-agent] Replaying proxy buffer for failed request
But in the logs for the npm-based Lambda, this is output instead:
Sentry Logger [log]: [https-proxy-agent] Creating `net.Socket`: %o { ALPNProtocols: [ 'http/1.1' ], host: '10.0.138.88', port: 3128 }
Sentry Logger [log]: [https-proxy-agent:parse-proxy-response] got proxy server response: %o %o HTTP/1.1 200 Connection established
{}
Sentry Logger [log]: [https-proxy-agent] Upgrading socket connection to TLS
Additional Context
I believe the issue is connected to the fact that when "@sentry/aws-serverless" is loaded from the Sentry AWS Lambda layer, getSDKSource() returns "aws-lambda-layer", whereas when it is loaded from the npm-installed module, getSDKSource() returns "npm".
When getSDKSource() returns "aws-lambda-layer", useLayerExtension is set to true, which results in the SDK's tunnel being set to "http://localhost:9000/envelope". That would be fine, except that because an HTTP proxy is configured, Sentry sends the request to the HTTP proxy to be forwarded to "http://localhost:9000/envelope". If that proxy runs on a different machine (which is likely, since HTTP proxies are normally deployed at the edge of a network to control what traffic goes in and out), then "localhost" refers to the machine the proxy is running on, and the request never reaches the tunnel created by Sentry.
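For illustration, here is a minimal sketch of what the layer build effectively ends up with, assuming the redirect is applied via the standard tunnel init option (a simplification, not the SDK's actual source):

```ts
import * as Sentry from '@sentry/aws-serverless';

// Simplified sketch of the layer build's effective behaviour (illustrative
// only). Because getSDKSource() reports 'aws-lambda-layer', the SDK points
// the transport at the Lambda extension's local tunnel endpoint:
Sentry.init({
  dsn: process.env.SENTRY_DSN,
  tunnel: 'http://localhost:9000/envelope', // effectively set by the layer build
});

// With HTTPS_PROXY/HTTP_PROXY also set, the transport hands this URL to the
// proxy agent, so it is the proxy host, not the Lambda sandbox, that tries to
// reach localhost:9000. Nothing is listening there, and in this setup Squid
// rejects the request outright, which matches the 403 seen in the logs above.
```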
Possible Fixes
Ideally the request would be proxied after it passes through the tunnel rather than before, but another option might be to simply not enable the Lambda extension by default when an HTTP proxy is used.
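As a rough sketch of the second option, the SDK could skip the extension tunnel whenever the standard proxy environment variables are set. Everything below is hypothetical; the helper name and placement are not the SDK's current code:

```ts
import * as Sentry from '@sentry/aws-serverless';

// Hypothetical sketch: skip the layer-extension tunnel whenever an HTTP(S)
// proxy is configured, so envelopes go straight to Sentry through the proxy
// instead of being routed to a localhost tunnel the proxy host cannot reach.

// Stand-in for the SDK's existing getSDKSource() === 'aws-lambda-layer' check.
const isLambdaLayerBuild = (): boolean => true;

const proxyConfigured = Boolean(
  process.env.HTTPS_PROXY ||
    process.env.HTTP_PROXY ||
    process.env.https_proxy ||
    process.env.http_proxy,
);

const useLayerExtension = isLambdaLayerBuild() && !proxyConfigured;

Sentry.init({
  dsn: process.env.SENTRY_DSN,
  // Only point the transport at the extension tunnel when no proxy is in the way.
  ...(useLayerExtension ? { tunnel: 'http://localhost:9000/envelope' } : {}),
});
```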