Reading background:

Streaming-s3 is not uploading files to AWS correctly

Source: Internet

I am using Node.js to upload files to AWS, and I found that the file size there is not correct: I am getting only 2.1 KB.

Here is my code:

// streaming-s3 and the app config are required elsewhere in the project, e.g.:
// var streamingS3 = require('streaming-s3');
// var config = require('./config');
var uploadFile = function (fileReadStream, awsHeader, cb) {

    //set options for the streaming module
    var options = {
        concurrentParts: 2,
        waitTime: 20000,
        retries: 2,
        maxPartSize: 10 * 1024 * 1024
    };
    //call stream function to upload the file to s3
    var uploader = new streamingS3(fileReadStream, config.aws.accessKey, config.aws.secretKey, awsHeader, options);
    //start uploading
    uploader.begin(); // required when no callback is passed to the constructor

    // handle these functions
    uploader.on('data', function (bytesRead) {
        //console.log(bytesRead, ' bytes read.');
    });

    uploader.on('part', function (number) {
        //console.log('Part ', number, ' uploaded.');
    });

    // All parts uploaded, but upload not yet acknowledged.
    uploader.on('uploaded', function (stats) {
        //console.log('Upload stats: ', stats);
    });

    uploader.on('finished', function (response, stats) {
        console.log(response);
        cb(null, response);
    });

    uploader.on('error', function (err) {
        console.log('Upload error: ', err);
        cb(err);
    });
};

The file name does appear in my AWS bucket, but when I try to open the file it fails.

I am trying to upload this file from the URL: https://arxiv.org/pdf/1701.00003.pdf

1 solution

#1

There is no longer any need for an external module to upload streams to S3. The aws-sdk now provides a method, s3.upload, which can upload an arbitrarily sized buffer, blob, or stream. You can check the documentation here.

The code I used:

const aws = require('aws-sdk');
const s3 = new aws.S3({
    credentials:{
        accessKeyId: "ACCESS_KEY",
        secretAccessKey: "SECRET_ACCESS_KEY"
    }
});
const fs = require('fs');
const got = require('got');

//fs stream test
s3.upload({
    Bucket: "BUCKET_NAME",
    Key: "FILE_NAME",
    ContentType: 'text/plain',
    Body: fs.createReadStream('SOME_FILE')
})
.on("httpUploadProgress", progress => console.log(progress))
.send((err, resp) => {
    if (err) return console.error(err);
    console.log(resp);
});


//http stream test
s3.upload({
    Bucket: "BUCKET_NAME",
    Key: "FILE_NAME",
    ContentType: 'application/pdf',
    Body: got.stream('https://arxiv.org/pdf/1701.00003.pdf')
})
.on("httpUploadProgress", progress => console.log(progress))
.send((err, resp) => {
    if (err) return console.error(err);
    console.log(resp);
});

To prove my point even further, I tried the code with the PDF you posted in your question, and here is the link to my test bucket showing that the PDF works as expected.

