/**
 * This is a simple AWS Lambda function that will look for a given file on S3 and return it,
 * passing along all of the headers of the S3 file. To make this available via a URL, use
 * API Gateway with an AWS Lambda Proxy Integration.
 *
 * Set the S3_REGION and S3_BUCKET environment variables in AWS Lambda.
 * Make sure the Lambda function is passed an event with `{ pathParameters: { proxy: 'path/to/file.jpg' } }` set.
 */
var AWS = require('aws-sdk');

exports.handler = function( event, context, callback ) {
  var region = process.env.S3_REGION;
  var bucket = process.env.S3_BUCKET;
  var key = decodeURI( event.pathParameters.proxy );

  // Basic server response
  /*
  var response = {
    statusCode: 200,
    headers: {
      'Content-Type': 'text/plain',
    },
    body: 'Hello world!',
  };
  callback( null, response );
  */

  // Fetch from S3
  var s3 = new AWS.S3( { region: region } );
  return s3.makeUnauthenticatedRequest(
    'getObject',
    { Bucket: bucket, Key: key },
    function( err, data ) {
      if ( err ) {
        // Pass the error back to Lambda instead of silently swallowing it
        return callback( err );
      }
      // API Gateway can only return binary payloads (like images) as base64-encoded text
      var isBase64Encoded = data.ContentType.indexOf( 'image/' ) > -1;
      var encoding = isBase64Encoded ? 'base64' : 'utf8';
      var resp = {
        statusCode: 200,
        headers: {
          'Content-Type': data.ContentType,
        },
        // Buffer.from() replaces the deprecated new Buffer() constructor
        body: Buffer.from( data.Body ).toString( encoding ),
        isBase64Encoded: isBase64Encoded,
      };
      callback( null, resp );
    }
  );
};
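The response-building step above can be sketched in isolation. This is a minimal, hypothetical helper (the `buildResponse` name is not from the gist) showing how the proxy response for API Gateway is shaped: image content types get a base64-encoded body with `isBase64Encoded` set, everything else is returned as plain text.

```javascript
// Sketch: build an API Gateway Lambda Proxy Integration response
// from an S3 object's content type and body buffer.
function buildResponse(contentType, bodyBuffer) {
  // Binary payloads must cross API Gateway as base64 text
  var isBase64Encoded = contentType.indexOf('image/') > -1;
  return {
    statusCode: 200,
    headers: {
      'Content-Type': contentType,
    },
    body: bodyBuffer.toString(isBase64Encoded ? 'base64' : 'utf8'),
    isBase64Encoded: isBase64Encoded,
  };
}
```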
@apotox You can try the above, but you probably don't need the base64 encoding, as API Gateway only works with text. Trying to serve binary files through it would always return corrupted responses. There are also limits on the payload size API Gateway can handle. I was using this to serve images and kept running into these issues.
To make Lambda@Edge work, you can try using Lambda to modify the HTML, then store it back in S3 and send the new S3 URL back as the response. That worked for us in https://github.com/spiritedmedia/tachyon-edge
Good luck figuring out your issue.
Okay, thank you.
My static website has a lot of pages (more than 28k), and I want something like Snippet Injection (like Netlify offers).
I just used API Gateway to handle and proxy requests to a Lambda function, then used that function to send a request to the S3 web server.
> const request = require("request");
>
> exports.handler = (event, context, callback) => {
>   var options = {
>     url: 'http://MY-S3-BUCKET.eu-central-1.amazonaws.com' + event.path,
>     headers: {
>       ...(event.headers),
>       // The Host header must be a bare hostname, without the http:// scheme
>       'Host': 'MY-S3-BUCKET.eu-central-1.amazonaws.com'
>     }
>   };
>   request(options, (err, res, body) => {
>     if (err) {
>       callback(null, {
>         statusCode: 404,
>         body: "not found!"
>       });
>     } else {
>       const response = {
>         statusCode: 200,
>         headers: {
>           "Content-Type": res.headers['content-type']
>         },
>         body: body,
>       };
>       callback(null, response);
>     }
>   });
> };
When I get a response, I will insert a script into the body.
Hope it will work 👍
@apotox I was wondering if you managed to get Lambda to act as a proxy to serve S3 static website content? I tried your code above, but I keep getting the 404 "not found!" body.
Hi, how can I use this to serve a static website from an S3 bucket? I want to use Lambda to add content to each 'text/html' S3 object before serving it to the browser.
I tried Lambda@Edge with CloudFront, but unfortunately it does not provide a way to change the response body.
🆘