How to upload a string to S3 as gzip in Node.js
I have a Node.js program that reads an array of strings from Redis.
I need to upload these strings to my bucket in AWS S3, in gzip format, without creating a gzip file locally before uploading.

Essentially, I want to stream the strings read from Redis to the S3 bucket, gzip-compressed.

In addition to the above, I'd like to know the efficient and recommended way to do this so that each file stored in S3 is at most 64 MB; if there is additional data, an additional file should be created (also limited to 64 MB).
The example below shows how I read from the Redis key the value to be stored in S3 as gzip, limited to 64 MB:

client.lrange(key, 0, -1, (error, arrayStringValue) => {
    if (error) return reject(error);
    if (arrayStringValue == null || !arrayStringValue.length) {
        client.del(key);
        return console.log("removing key, no values");
    }
    console.log("finish iterating");
    impressionsRecorded = true;
    // compress and upload
    uploadToS3(arrayStringValue, "bucket/key", null);
});

Basically, what I'm missing is the right implementation of the uploadToS3 method.
  • Is AWS.S3.putObject() from the JavaScript SDK not "the right implementation"? Not sure what you are looking for, if not that.

    – Michael - sqlbot
    Jan 19 at 20:24

  • Does it support uploading in gzip format?

    – Mickey Hovel
    Jan 20 at 4:56

  • Well, yes and no. You still need to do the gzipping yourself, but since putObject accepts Body: <Binary String>, uploading the (binary) gzip data isn't a problem. You would also probably want to set ContentEncoding: 'gzip' so clients understand what to do with the data. (This doesn't cause the content to be gzipped; it just sets Content-Encoding: gzip in the response headers when the object is fetched, which most clients will use to automatically/transparently decompress the payload.)

    – Michael - sqlbot
    Jan 20 at 14:22

  • I'd appreciate it if you answer with a code sample; I will test and confirm the answer :)

    – Mickey Hovel
    Jan 20 at 14:31
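On the 64 MB part of the question, one simple approach is to partition the Redis values into batches by byte size before compressing, and upload each batch as its own object. This is a sketch with illustrative names; note the budget here bounds the uncompressed size, so the gzipped objects will come out smaller than 64 MB.

```javascript
// Sketch: partition the strings into batches whose combined UTF-8 size
// stays under a byte budget, so each uploaded object is bounded.
const MAX_BYTES = 64 * 1024 * 1024; // 64 MB, per the question

function chunkStrings(values, maxBytes = MAX_BYTES) {
  const chunks = [];
  let current = [];
  let size = 0;
  for (const value of values) {
    const len = Buffer.byteLength(value, 'utf8') + 1; // +1 for the '\n' separator
    if (size + len > maxBytes && current.length > 0) {
      chunks.push(current); // close the full batch, start a new one
      current = [];
      size = 0;
    }
    current.push(value);
    size += len;
  }
  if (current.length > 0) chunks.push(current);
  return chunks;
}

// Each batch would then be gzipped and uploaded under its own key,
// e.g. key + '-part-' + index (naming scheme is illustrative).
```

A single string larger than the budget still becomes its own oversized batch here; splitting inside a string would need extra logic if that case can occur.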
node.js amazon-s3 upload gzip






asked Jan 19 at 18:34
edited Jan 19 at 18:46
Mickey Hovel










