How to upload a string to S3 as gzip in Node.js
I have a Node.js program that reads an array of strings from Redis.
I need to upload these strings to my bucket in AWS S3, gzip-compressed, without creating a gzip file locally before uploading.
Basically, I want to stream the strings read from Redis to an S3 bucket as gzip.
In addition, I'd like to know the efficient and recommended way to do this so that each file stored in S3 is at most 64 MB; if there is additional data, an additional file should be created (also capped at 64 MB).
The example below shows how I read from the Redis key the values to be stored in S3 as gzip, limited to 64 MB:
client.lrange(key, 0, -1, (error, arrayStringValue) => {
  if (error) return reject(error); // check the error before using the result
  if (arrayStringValue == null || !arrayStringValue.length) {
    client.del(key);
    return console.log("removing key, no values");
  }
  console.log("finish iterating");
  impressionsRecorded = true;
  // compress and upload
  uploadToS3(arrayStringValue, "bucket/key", null);
});
Basically, what I'm missing is the right implementation of the uploadToS3 method.
node.js amazon-s3 upload gzip
Is AWS.S3.putObject() from the JavaScript SDK not "the right implementation"...? Not sure what you are looking for, if not that.
– Michael - sqlbot
Jan 19 at 20:24
Does it support uploading in gzip format?
– Mickey Hovel
Jan 20 at 4:56
Well, yes and no. You still need to do the gzipping, but since the SDK accepts var params = { Body: <Binary String>, ... }, uploading the (binary) gzip data isn't a problem. You would also probably want to set ContentEncoding: 'gzip' so clients understand what to do with the data. (This doesn't cause the content to be gzipped; it just sets Content-Encoding: gzip in the response headers when the object is fetched, which most clients will use to automatically/transparently decompress the payload.)
– Michael - sqlbot
Jan 20 at 14:22
I'd appreciate it if you answered with a code sample; I will test and confirm the answer :)
– Mickey Hovel
Jan 20 at 14:31
edited Jan 19 at 18:46
Mickey Hovel
asked Jan 19 at 18:34