Month: September 2017

In this post, we are going to learn how to send messages to AWS SQS from Spark worker machines using AssumeRole. We will cover sending data from Spark executors, using STS AssumeRole to communicate with the AWS SQS service.
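The STS AssumeRole flow the excerpt describes can be sketched in Python with boto3. The role ARN, queue URL, and session name below are placeholders, not values from the post, and boto3 is imported lazily so the serialization helper works without the AWS SDK installed:

```python
import json


def to_message_body(payload):
    """Serialize a payload dict into an SQS message body string."""
    return json.dumps(payload)


def sqs_client_with_assumed_role(role_arn, session_name="spark-sqs-writer"):
    """Assume the given IAM role via STS and build an SQS client from the
    temporary credentials STS returns."""
    import boto3  # lazy import: keeps the pure helper above usable without boto3

    sts = boto3.client("sts")
    creds = sts.assume_role(
        RoleArn=role_arn, RoleSessionName=session_name
    )["Credentials"]
    return boto3.client(
        "sqs",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )


def send_record(queue_url, role_arn, payload):
    """Send one record to SQS from inside a Spark executor."""
    sqs = sqs_client_with_assumed_role(role_arn)
    return sqs.send_message(QueueUrl=queue_url, MessageBody=to_message_body(payload))
```

In a Spark job this would typically run inside `foreachPartition`, so each executor assumes the role and sends its partition's records.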

Introduction In this post, we are going to learn how to safely launch an Apache Spark job using an AWS Lambda function and an SNS topic. We are going to use AssumeRole in our Lambda function. This way it is much safer…
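As a rough sketch of the SNS-to-Lambda handoff: SNS delivers a published message to Lambda as a JSON string under `Records[0].Sns.Message`. The `job_name` field and the omitted launch step are assumptions, not details from the post:

```python
import json


def extract_job_request(event):
    """Pull the Spark job request out of an SNS-triggered Lambda event.
    SNS delivers the published message as a JSON string under
    Records[0].Sns.Message."""
    return json.loads(event["Records"][0]["Sns"]["Message"])


def lambda_handler(event, context):
    """Entry point for the SNS-triggered Lambda."""
    request = extract_job_request(event)
    # The post's approach would assume an IAM role here and launch the
    # Spark job with the temporary credentials; the launch call depends
    # on the cluster setup and is omitted in this sketch.
    return {"job": request.get("job_name"), "status": "accepted"}
```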

In this post, we are going to learn how to execute EC2 run commands from AWS Lambda. Using a Lambda function as an upper layer to run commands can be useful for validating a command (deciding whether it should be run) or…
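A hedged sketch of the validate-then-run pattern, using SSM's `send_command` with the built-in `AWS-RunShellScript` document; the whitelist contents and instance IDs are illustrative only:

```python
# Example whitelist: the "upper layer" only forwards commands it recognizes.
ALLOWED_COMMANDS = {"uptime", "df -h", "systemctl status httpd"}


def validate_command(command):
    """Return True only for whitelisted commands; this is the validation
    step the Lambda performs before running anything on EC2."""
    return command in ALLOWED_COMMANDS


def run_on_instances(instance_ids, command):
    """Validate the command, then dispatch it to the instances via SSM."""
    if not validate_command(command):
        raise ValueError(f"command not allowed: {command}")
    import boto3  # lazy import: keeps validate_command usable without boto3

    ssm = boto3.client("ssm")
    # AWS-RunShellScript is the built-in SSM document for shell commands.
    return ssm.send_command(
        InstanceIds=instance_ids,
        DocumentName="AWS-RunShellScript",
        Parameters={"commands": [command]},
    )
```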

Introduction In this post, we will learn how to control system services using AWS Run Command. We will see how to control the Apache service with an example. The same process can be applied to other services like MySQL, Tomcat, etc. In small IT…
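In practice the Run Command approach boils down to sending one shell command per service action. A small helper (systemd syntax assumed; the post's instances may use a different init system) might look like:

```python
# Actions we allow Run Command to perform on a service.
SERVICE_ACTIONS = {"start", "stop", "restart", "status"}


def service_command(service, action):
    """Build the shell command Run Command will execute to control a
    system service (systemd assumed)."""
    if action not in SERVICE_ACTIONS:
        raise ValueError(f"unsupported action: {action}")
    return f"sudo systemctl {action} {service}"
```

The result would then be passed as `Parameters={"commands": [service_command("httpd", "restart")]}` to `ssm.send_command` with the `AWS-RunShellScript` document.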

Introduction This post is useful for those who are new to AWS Lambda and want a quick configuration and testing walkthrough. In this post, we will cover both approaches (implementing RequestHandler.handleRequest, and a custom handler) to create a Lambda function. Follow this link to…
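The post covers the Java `RequestHandler` interface; for a quick smoke test, the analogous minimal handler in Python (module and function names here are illustrative, not from the post) is just a function taking `event` and `context`:

```python
import json


def lambda_handler(event, context):
    """Minimal Lambda entry point: echo the input back.
    Deployed by setting the handler string to '<module>.lambda_handler'."""
    return {
        "statusCode": 200,
        "body": json.dumps({"received": event}),
    }
```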

In this post, we will learn how to safely manage AWS security in Apache Spark. Apache Spark provides various filesystem clients (s3, s3n, s3a) for reading from and writing to Amazon S3. Because the s3 block filesystem is deprecated and…
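For the s3a client, temporary STS credentials are typically wired in through `fs.s3a.*` properties. A small helper returning those settings (property and provider names are from Hadoop's s3a connector; the credential values are placeholders) could look like:

```python
def s3a_session_conf(access_key, secret_key, session_token):
    """Spark/Hadoop configuration for the s3a client using temporary
    (STS) credentials rather than long-lived keys."""
    return {
        "spark.hadoop.fs.s3a.access.key": access_key,
        "spark.hadoop.fs.s3a.secret.key": secret_key,
        "spark.hadoop.fs.s3a.session.token": session_token,
        # Provider class that tells s3a to use the session token above.
        "spark.hadoop.fs.s3a.aws.credentials.provider":
            "org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider",
    }
```

These pairs could then be applied with `SparkConf().setAll(s3a_session_conf(...).items())` or passed as `--conf` flags to `spark-submit`.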