How to Stream Logs from AWS Lambda to S3 (Node.js Example)

Barathirajan M
Technology Specialist
April 16, 2019

I hope we are all aware of AWS Lambda, a compute platform that runs code on demand. It supports many runtimes, including Go, Python, Node.js, Java, C#, and Ruby.

Recently I was using it to run some memory-consuming jobs, and I got stuck when a job failed and I had no clue why. So I decided to write logs, with the assumption that they would reveal the reason for the failure.

Since a Lambda function doesn't provide any persistent storage mechanism for the generated logs, accessing them was also quite difficult. Hence, we planned to stream the logs to AWS S3.

In this article, we will briefly cover how to stream the logs generated while running code in these execution environments to AWS S3 using Node.js.

Stream Logs from AWS Lambda to S3

AWS Lambda provides access to a "/tmp" storage location of 512 MB, which is available only during execution.

So, the basic idea here is to write the logs into this "/tmp" folder using a standard logger such as Winston.

Once the execution has completed, we can upload the log files to S3, provided the Lambda function has been assigned an execution role with the permissions required to access S3.

Steps to Follow:

1. Install the Winston logger and winston-daily-rotate-file.

Command: npm i winston
Command: npm i winston-daily-rotate-file

2. Configure Winston so that it writes logs to the "/tmp" folder.

const path = require('path');
const winston = require('winston');
const DailyRotateFile = require('winston-daily-rotate-file');

const logger = winston.createLogger({
    transports: [
        // Write rotating log files into Lambda's ephemeral /tmp storage
        new DailyRotateFile({
            filename: path.join('/tmp', 'fileName.log'),
            datePattern: 'YYYY-MM-DD-HH-mm-ss',
            level: 'info',
            maxSize: '2m',
            maxFiles: '14d'
        })
    ],
    exitOnError: false
});

logger.info("This is Sample info log");
logger.error("This is Sample error log");

After configuring the logger as shown above, the logs will be written to files under the "/tmp" folder of the AWS Lambda execution environment, named according to the configured filename and date pattern.

3. Install the AWS SDK for accessing S3.

Command: npm i aws-sdk

4. Ensure that the Lambda function is assigned an execution role with S3 access.
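For reference, a policy along these lines attached to the function's execution role should be enough for the upload performed later in this step. This is just a minimal sketch; the bucket name mirrors the 'my-stream-lambda' bucket used in the code below, so adjust it to your own bucket.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": "arn:aws:s3:::my-stream-lambda/*"
        }
    ]
}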

At the end of the Lambda function's execution, or when you terminate the execution internally, read the files from "/tmp" and upload them to S3.

Please be aware that these log files are ephemeral; they will not survive once the execution environment is recycled, so upload them before the invocation ends.

const fs = require('fs');
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// Reads every file in /tmp and uploads it to S3. Resolves once all
// uploads have finished, or rejects on the first error.
function uploadLogFiles() {
    return new Promise((resolve, reject) => {
        fs.readdir("/tmp", function (err, filenames) {
            if (err) {
                console.log("ERROR IN READING LOG FILES FROM /tmp FOLDER");
                console.log(err);
                return reject(err);
            }
            console.log(filenames);
            let pending = filenames.length;
            if (pending === 0) {
                return resolve("NO LOG FILES FOUND");
            }
            filenames.forEach(function (filename) {
                fs.readFile("/tmp/" + filename, 'utf-8', function (err, content) {
                    if (err) {
                        console.log("ERROR IN READING INDIVIDUAL FILES");
                        console.log(err);
                        return reject(err);
                    }
                    const params = {
                        Bucket: 'my-stream-lambda',
                        Key: filename,
                        Body: content
                    };
                    s3.upload(params, function (s3Err, data) {
                        if (s3Err) {
                            console.log("ERROR IN UPLOADING LOGS TO S3");
                            console.log(s3Err);
                            return reject(s3Err);
                        }
                        console.log(`File uploaded successfully at ${data.Location}`);
                        if (--pending === 0) {
                            resolve("LOGS UPLOADED SUCCESSFULLY");
                        }
                    });
                });
            });
        });
    });
}

Note that we don't need AWS account credentials to access S3 here, because the S3 execution role was already assigned to the Lambda function when it was created.
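To tie everything together, here is a minimal sketch of a handler that uses the logger from step 2 and the uploadLogFiles() function above. The handler shape and the placeholder work inside the try block are just for illustration; adapt them to your own job.

exports.handler = async (event) => {
    try {
        logger.info("Job started");
        // ... the actual memory-consuming work goes here ...
        logger.info("Job finished");
    } catch (err) {
        // Log the failure reason so it ends up in the /tmp log file
        logger.error(err.message);
    } finally {
        // Push whatever was written to /tmp before the invocation ends
        await uploadLogFiles();
    }
    return "DONE";
};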
