var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only -- use IAM roles or environment variables in production
AWS.config.update({ accessKeyId: '...', secretAccessKey: '...' });

// Read in the file and upload it to S3
fs.readFile('del.txt', function (err, data) {
    if (err) { throw err; }
    var base64data = Buffer.from(data, 'binary');
    var s3 = new AWS.S3();
    s3.putObject({
        Bucket: 'banners-adxs',
        Key: 'del2.txt',
        Body: base64data,
        ACL: 'public-read'
    }, function (err, resp) {
        if (err) { throw err; }
        console.log(resp);
        console.log('Successfully uploaded package.');
    });
});
@evolross You can either open up an http request and download it as a file first, or pipe it.

Honestly, I think it's to your benefit to just do this with the bare node https api via pipes. If you want something quick and dirty for downloads, @root/request will get the job done. Or axios. Or whatever else people are using these days. They don't much matter. I rolled my own to be lightweight, request-compatible, and to have 0 dependencies.
Thanks @solderjs, using a fileStream worked perfectly for me.

TypeScript snippet:
async putFileInBucket(awsS3: AWS.S3, bucketName: string, fileName: string, fileExtension: string, fileStream: fs.ReadStream): Promise<void> {
    await awsS3.putObject({
        Body: fileStream,
        Bucket: bucketName,
        Key: `${fileName}.${fileExtension}`
    }).promise()
        .catch((err: AWSError) => {
            throw new AWSException(err, "problem uploading file")
        })
}

const fileStream = fs.createReadStream(path.join(__dirname, "sample.wav"))
await this.putFileInBucket(awsS3, bucketName, "myFile", "wav", fileStream)
Hi team,
I need to download PDF files from a website, zip them, and upload the zip file to an S3 bucket, using a Lambda function with Node.js. Can someone help me with this?
it works for me :)
uploadContentFromFilePath = (fileName) => {
    const fileContent = fs.createReadStream(fileName);
    return new Promise(function (resolve, reject) {
        fileContent.once('error', reject);
        s3.upload(
            {
                Bucket: 'test-bucket',
                Key: `${fileName}_${Date.now()}`,
                ContentType: 'application/pdf',
                ACL: 'public-read',
                Body: fileContent
            },
            function (err, result) {
                if (err) {
                    reject(err);
                    return;
                }
                resolve(result.Location);
            }
        );
    });
}
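The callback-to-Promise wrapping used above is a general pattern; here is a sketch of the generic form, exercised against a stand-in for `s3.upload` (the `fakeUpload` function and its URL are hypothetical, invented for illustration):

```javascript
// Generic form of the Promise-wrapping pattern used above: turn a
// Node-style callback API into one that can be awaited.
function promisify(fn) {
  return (...args) =>
    new Promise((resolve, reject) => {
      fn(...args, (err, result) => (err ? reject(err) : resolve(result)));
    });
}

// A hypothetical stand-in playing the role of s3.upload.
function fakeUpload(params, callback) {
  setImmediate(() => callback(null, { Location: `https://example.com/${params.Key}` }));
}

let uploadedLocation; // filled in when the promise resolves
promisify(fakeUpload)({ Key: 'del2.txt' }).then((result) => {
  uploadedLocation = result.Location;
  console.log(uploadedLocation); // https://example.com/del2.txt
});
```

Note that the aws-sdk v2 already offers this built in: `s3.upload(params).promise()` returns a Promise directly.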
If you get the error 'putObject is not defined', you can write it like this:
var s3 = new AWS.S3();
s3.putObject({
    Bucket: 'xxx',
    Key: 'xxx',
    Body: 'what you want to upload'
}, function (err) {
    if (err) { throw err; }
    console.log('Successfully uploaded package.');
});
Thank you, everybody, for your comments 🙌
@coolaj86 Thank you for your snippet, I just wonder how to use it with a file input form?
Thanks
By far the best simple and straightforward code/implementation and brief explanation I've found. Been stuck on this for 2-3 days lol Thanks guys - peace and love
How can I create an S3 bucket and get its ID?
What about uploading via a URL? I'm trying to figure out how to pass an image URL from the web and upload that image to S3. Many free image API libraries require this (e.g. pixabay).