Redirect output of console to a file on AWS S3

Say I have a website that returns JSON data when I send a GET request using curl. I want to redirect the output of curl to AWS S3, so that a new file is created on S3 for it.



Currently I am able to redirect the output and store it locally:



curl -s -X GET 'http://website_that_returns_json.com' > folder_to_save/$(date +"%Y-%m-%d_%H-%M.json")


I have the AWS CLI and s3cmd installed. How would I redirect the output of curl to create a new file on AWS S3?



Assume:




  1. AWS S3 access key and secret key are already set.

  2. Location to store the file: mybucket/$(date +"%Y-%m-%d_%H-%M.json")

amazon-s3 aws-cli

asked Feb 16 '17 at 20:40

Spandan Brahmbhatt

  • One way I can think of is to save the file locally and then use aws s3 cp local_copy s3_path. But is there a more efficient way (one that avoids saving an intermediate file) to do this?

    – Spandan Brahmbhatt
    Feb 16 '17 at 20:47

2 Answers

7

The AWS Command-Line Interface (CLI) has the ability to stream data to/from Amazon S3:




The following cp command uploads a local file stream from standard input to a specified bucket and key:




aws s3 cp - s3://mybucket/stream.txt


So, you could use:



curl xxx | aws s3 cp - s3://mybucket/object.txt
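
Applied to the question's example (the URL and bucket name are the asker's own placeholders), a minimal sketch would be:

curl -s 'http://website_that_returns_json.com' | aws s3 cp - "s3://mybucket/$(date +"%Y-%m-%d_%H-%M.json")"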


However, it's probably safer to save the file locally and then copy it to Amazon S3.
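
A sketch of that two-step approach, using the same placeholder URL and bucket, could be:

tmp="$(mktemp)"                                            # temporary local file
curl -s 'http://website_that_returns_json.com' -o "$tmp"   # save the response locally first
aws s3 cp "$tmp" "s3://mybucket/$(date +"%Y-%m-%d_%H-%M.json")"
rm "$tmp"                                                  # clean up the local copy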

answered Feb 16 '17 at 21:50
John Rotenstein

  • Sidenote: It is also possible to use Amazon's CLI with other S3-compatible services using the --endpoint-url <server URL> flag. Also, it appears that this tool properly streams the file to the server without caching the whole file in memory first, which is important for large files.

    – Zero3
    Aug 12 '18 at 21:11
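
    For example (hypothetical endpoint URL), combining that flag with the streaming upload from the answer might look like:

    curl -s 'http://website_that_returns_json.com' | aws s3 cp - s3://mybucket/object.json --endpoint-url 'https://s3.example-compatible.com'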

  • It works in reverse too! You can use - as the output file to read a file. I use it to view gzipped SQL backups like this: aws s3 cp "s3://bucket_where_I_keep_db_backups/Sat Oct 13 23:53:27 UTC 2018.sql.gz" - | gzip -d | less

    – Eric Seastrand
    Oct 14 '18 at 0:00

0

If you'd like to run the command on a remote instance, use aws ssm send-command.



Then, to redirect the output of that command to S3, you can use the --output-s3-bucket-name parameter.



Here is a Bash script that runs a PowerShell script on the remote instance and uploads its output to an S3 bucket:



# Target instance, destination bucket, and key prefix (placeholders)
instanceId="i-xyz"
bucketName="bucket_to_save"
bucketDir="folder_to_save"
# PowerShell command whose output will be captured
command="(Invoke-WebRequest -UseBasicParsing -Uri http://example.com).Content"
# Send the command; SSM uploads its stdout/stderr under the given bucket/prefix
cmdId=$(aws ssm send-command --instance-ids "$instanceId" --document-name "AWS-RunPowerShellScript" --query "Command.CommandId" --output text --output-s3-bucket-name "$bucketName" --output-s3-key-prefix "$bucketDir" --parameters commands="'${command}'")
# Poll until the invocation is no longer in progress
while [ "$(aws ssm list-command-invocations --command-id "$cmdId" --query "CommandInvocations[0].Status" --output text)" == "InProgress" ]; do sleep 1; done
# Look up the S3 key prefix where SSM stored the output
outputPath=$(aws ssm list-command-invocations --command-id "$cmdId" --details --query "CommandInvocations[0].CommandPlugins[0].OutputS3KeyPrefix" --output text)
echo "Command output uploaded at: s3://${bucketName}/${outputPath}"
aws s3 ls "s3://${bucketName}/${outputPath}"


To print the uploaded stdout and stderr files from S3, run:



aws s3 ls s3://${bucketName}/${outputPath}/stderr.txt && aws s3 cp --quiet s3://${bucketName}/${outputPath}/stderr.txt /dev/stderr
aws s3 cp --quiet s3://${bucketName}/${outputPath}/stdout.txt /dev/stdout

answered Feb 21 '18 at 13:08
kenorb