Amazon's Elastic Transcoder service makes video transcoding easier and more cost-effective. In this post, we will see how to use Elastic Transcoder to transcode videos uploaded to an Amazon S3 bucket. We will also use AWS Lambda to invoke Elastic Transcoder whenever a new object is added to the S3 bucket.

Create S3 Buckets

In this example, we will use two S3 buckets: an input bucket and an output bucket. The object (the video file to be transcoded) will be uploaded to the input bucket, and the output bucket will hold the transcoded videos. For this example, the input bucket is named input.videos-to-transcode and the output bucket output.transcoded-videos. Both buckets are created in the us-east-1 region (N. Virginia).
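If you prefer the command line, the same buckets can be created with the AWS CLI. The bucket names below are the ones from this example; substitute your own unique names:

```shell
# Create the input and output buckets in us-east-1
aws s3 mb s3://input.videos-to-transcode --region us-east-1
aws s3 mb s3://output.transcoded-videos --region us-east-1

# Verify both buckets exist
aws s3 ls
```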

Note: Since bucket names must be globally unique, you will have to choose bucket names different from the ones used in this example.

Elastic Transcoder

In Elastic Transcoder, pipelines are the queues to which transcoding jobs are added. Elastic Transcoder processes jobs in the order in which they are placed in the pipeline. Let's configure a pipeline using the Elastic Transcoder console.

Create a pipeline

In the Elastic Transcoder console, choose Create a new Pipeline.

Configure the pipeline by providing the input and output bucket details as below.

You will now have the pipeline created with the status Active. Make a note of the pipeline ID.
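The same pipeline can also be created from the AWS CLI. The role ARN below is a placeholder for the Elastic Transcoder service role in your account; the command prints the new pipeline, including its ID:

```shell
aws elastictranscoder create-pipeline \
  --name videos-to-transcode-pipeline \
  --input-bucket input.videos-to-transcode \
  --output-bucket output.transcoded-videos \
  --role arn:aws:iam::123456789012:role/Elastic_Transcoder_Default_Role
```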

AWS Lambda Function

To summarize what we have done so far: we have created the input and output S3 buckets for the video files, and we have configured the Elastic Transcoder pipeline. Now we need to connect these two pieces, the S3 buckets and Elastic Transcoder, so that whenever we upload an object (a video file) to the input bucket, it gets transcoded. How can we do that?

This is where AWS Lambda comes to our rescue. Lambda acts as the glue between the S3 bucket and Elastic Transcoder. We configure an event for new object creation in the input S3 bucket, which triggers the Lambda function. The Lambda function then creates an Elastic Transcoder job and adds it to the pipeline we created. Elastic Transcoder processes the job and places the transcoded videos in the output S3 bucket.
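A minimal sketch of such a handler, using the AWS SDK for Java (v1), is shown below. The pipeline ID is a placeholder for the ID you noted earlier, and the preset ID shown is the system preset "Generic 720p"; the class name and output key naming are illustrative choices, not fixed by the service:

```java
import com.amazonaws.services.elastictranscoder.AmazonElasticTranscoder;
import com.amazonaws.services.elastictranscoder.AmazonElasticTranscoderClientBuilder;
import com.amazonaws.services.elastictranscoder.model.CreateJobOutput;
import com.amazonaws.services.elastictranscoder.model.CreateJobRequest;
import com.amazonaws.services.elastictranscoder.model.JobInput;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;

public class TranscodeHandler implements RequestHandler<S3Event, String> {

    private static final String PIPELINE_ID = "1234567890123-abcdef"; // placeholder: your pipeline ID
    private static final String PRESET_ID = "1351620000001-000010";   // system preset "Generic 720p"

    private final AmazonElasticTranscoder transcoder =
            AmazonElasticTranscoderClientBuilder.defaultClient();

    @Override
    public String handleRequest(S3Event event, Context context) {
        // The S3 event carries the key of the newly uploaded object
        String key = event.getRecords().get(0).getS3().getObject().getKey();

        JobInput input = new JobInput().withKey(key);

        // Name the output after the input, minus its extension
        CreateJobOutput output = new CreateJobOutput()
                .withKey(key.replaceAll("\\.\\w+$", "") + "-720p.mp4")
                .withPresetId(PRESET_ID);

        CreateJobRequest request = new CreateJobRequest()
                .withPipelineId(PIPELINE_ID)
                .withInput(input)
                .withOutputs(output);

        String jobId = transcoder.createJob(request).getJob().getId();
        context.getLogger().log("Created transcoder job " + jobId);
        return jobId;
    }
}
```

Because the pipeline already knows the input and output buckets, the job only needs the object key, the preset, and an output key.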

Creating an IAM Role (Execution Role) for Lambda

Before creating the Lambda function, let's create an IAM role that will be associated with the Lambda function when it executes.
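A minimal policy for this role would allow the function to write CloudWatch logs and create Elastic Transcoder jobs. A sketch of such a policy (tighten the resources for production use):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:*:*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "elastictranscoder:CreateJob",
        "elastictranscoder:ReadJob"
      ],
      "Resource": "*"
    }
  ]
}
```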

Step 3: Compile and build the project using the Maven package goal. After a successful build, run the Maven build again using the package shade:shade goals. In the target folder of the project, you should now have the jar videotranscoder-4.0.0.jar.
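The shade:shade goal assumes the maven-shade-plugin is declared in the project's pom.xml, so that the jar bundles the AWS SDK dependencies Lambda needs. A typical declaration looks like this (the version shown is illustrative):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```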

Deploy the Lambda Function

Now that we have written the Lambda function, we need to deploy it so that the function can be triggered when certain events occur.

Test the Lambda Function

Now it's finally time to test the video transcoding. Upload a sample video file into the input S3 bucket; you should find the transcoded videos in the output S3 bucket.
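From the CLI, the test amounts to copying a file in and listing the output bucket once the job completes (the file name here is illustrative):

```shell
# Upload a sample video to trigger the pipeline
aws s3 cp sample.mp4 s3://input.videos-to-transcode/

# After the job finishes, the transcoded output should appear here
aws s3 ls s3://output.transcoded-videos/
```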

Input Video File

Transcoded Output Video Files

Conclusion and What’s Next?

Congratulations! You have now performed video transcoding using Elastic Transcoder and AWS Lambda. We have seen how a Lambda function can be leveraged to achieve transcoding without configuring any servers ourselves. This opens the way to explore serverless architectures further.

We can now have a thick client talk to a NoSQL database directly, or call an API Gateway endpoint that invokes a Lambda function, which in turn performs the database operations. There are plenty of options for designing applications the serverless way in the cloud.

Let’s go serverless!

That's it for this post. If you face any errors in the example explained here, check the CloudWatch logs, which capture the error details from the Lambda invocations.

If you have any questions or issues, please share them in the comments section.

Vignesh M

Java developer, AWS Certified Solutions Architect Associate, and cloud technology enthusiast, currently working for Hexaware Technologies. He believes that knowledge increases by sharing, not by saving.
