These days I’m involved with Amazon’s AWS, and since I am migrating my backends to Lumen I’m going to play a little bit with AWS and Lumen. Today I want to create a simple Lumen server to handle SNS notifications: one endpoint to listen to SNS and another one to emit notifications. I also want to register logs within CloudWatch.

He starts with the Lumen backend, creating a simple application that hooks in the AWS and logging service providers and defines the two routes ("push" and "read"). The post also includes the code for both the AWS and logging service providers and the SnsController. The end result (available on GitHub) then sends a basic SNS message, and the response is captured and sent to CloudWatch.
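To make that concrete, here's a minimal sketch of what the "push" side could look like using the AWS PHP SDK's SnsClient. The topic ARN, region, and controller wiring are placeholders rather than the post's actual code, which lives in the GitHub repository:

```php
<?php
// Hypothetical SnsController excerpt for the "push" route: publishes a
// message to an SNS topic. The ARN below is a placeholder.

use Aws\Sns\SnsClient;
use Illuminate\Http\Request;

class SnsController extends Controller
{
    // Assumes the AWS service provider has bound SnsClient in the container.
    public function push(Request $request, SnsClient $sns)
    {
        $result = $sns->publish([
            'TopicArn' => 'arn:aws:sns:eu-west-1:123456789012:my-topic',
            'Message'  => $request->input('message', 'Hello from Lumen'),
        ]);

        return response()->json(['messageId' => $result['MessageId']]);
    }
}
```

The "read" endpoint would be the mirror image: SNS POSTs the notification payload to it, and the handler ships the body out through the logging service provider.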

Today I want to create a UI5/OpenUI5 boilerplate that plays with Lumen backends. Simple, isn’t it? We only need to create a Lumen API server and connect our OpenUI5 application with this API server. But today I also want to create a login: the typical user/password input form. I don’t want to build it from scratch (a user database, OAuth provider or something like that). Since these days I’m involved with Amazon AWS projects, I want to try Amazon Cognito.

He then walks through the Cognito service and what it has to offer, including user management and authentication handling. He starts with the OpenUI5 side, creating the basic application and login handling via the Cognito JavaScript SDK. He then modifies this with some basic user handling and creates the view for the login form. He also includes functionality for password resets and the code required to inject the JWT into every request post-authentication.
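The post's login code is all JavaScript, but the Lumen backend still has to verify the Cognito-issued JWT on each request. As a rough PHP-side counterpart (not from the post), here's a sketch of a middleware doing that verification, assuming the firebase/php-jwt library; the pool URL and claim name are placeholders:

```php
<?php
// Hypothetical Lumen middleware that verifies a Cognito JWT against the
// user pool's published JWKS. Pool ID and region are placeholders.

use Firebase\JWT\JWT;
use Firebase\JWT\JWK;

class CognitoAuthMiddleware
{
    public function handle($request, \Closure $next)
    {
        $token = $request->bearerToken();
        if (!$token) {
            return response()->json(['error' => 'missing token'], 401);
        }

        // In practice the JWKS should be cached, not fetched on every request.
        $issuer = 'https://cognito-idp.eu-west-1.amazonaws.com/eu-west-1_EXAMPLE';
        $jwks   = json_decode(file_get_contents($issuer . '/.well-known/jwks.json'), true);

        try {
            $claims = JWT::decode($token, JWK::parseKeySet($jwks));
        } catch (\Exception $e) {
            return response()->json(['error' => 'invalid token'], 401);
        }

        // The claim name depends on whether this is an ID or access token.
        $request->attributes->set('username', $claims->{'cognito:username'} ?? null);

        return $next($request);
    }
}
```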

Michael Dyrynda has a tutorial posted to his site starting off a new series showing how to create the functionality in your application to upload files to Amazon S3 from the browser. The tutorial is designed for those that don't already have something in their framework that allows for this upload handling.

I recently took on a freelance project that involved having to upload media files. This is a trivially simple task to accomplish if you're using something like Laravel, using out-of-the-box support for S3 storage.

In this particular case, however, I was dealing with files potentially multiple gigabytes in size. Although simpler to implement, I didn't want to have users of the site upload the file to my application - and thus server - before having my server re-upload the file to S3.

In his case, he needed something that would allow for the upload of very large files without having to pass them through the backend server to get there. He starts by walking you through the setup on the S3 side, creating an IAM policy for the upload and a form that posts directly to the bucket. The form includes a "key" value that contains the filename for the end result. He also shows some of the other options that can be included, like a policy that specifies a redirect location and a signature to verify the upload. He then shows the code required to make it work, creating an upload route and a main form page that generates the signature and policy information for the form based on configuration options.
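For a sense of what that signing code involves, here's a sketch using the AWS PHP SDK's PostObjectV4 helper, which generates the policy, signature, and hidden form fields in one go; the bucket name and key prefix are placeholders, and the post builds these values from its own configuration:

```php
<?php
// Sketch: generate the attributes and hidden inputs for a form that
// uploads straight from the browser to S3. Names are placeholders.

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\PostObjectV4;

$client = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// The "key" input: ${filename} is expanded by S3 to the uploaded file's name.
$formInputs = ['key' => 'uploads/${filename}'];

// Policy conditions restricting what the signed form may do.
$options = [
    ['bucket' => 'my-upload-bucket'],
    ['starts-with', '$key', 'uploads/'],
];

$postObject = new PostObjectV4($client, 'my-upload-bucket', $formInputs, $options, '+1 hours');

$attributes = $postObject->getFormAttributes(); // action URL, method, enctype
$inputs     = $postObject->getFormInputs();     // policy, signature, credential fields
```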

Phillip Shipley has a post to his site for the AWS and Cloudflare users out there (or those that want to use these together) about an easy way to automatically deploy static sites.

Managing web servers can be a lot of work. Especially when it comes to configuring and maintaining SSL certs, server and software updates, etc. Let’s Encrypt has made the SSL part a lot easier, but it is still work and to me feels like overkill for something as simple as a static single-page application. Especially when there are dead simple solutions like Amazon S3 that can be used to host and scale a static website without any server configuration or maintenance. Adding CloudFront with a free SSL certificate from Amazon’s Certificate Manager service makes SSL painless too.

[...] In this article I’ll cover how to use Codeship’s continuous integration and deployment service to build/test your app, deploy it to S3, and then clear cached versions of it from CloudFront and Cloudflare.

He then shows how to use the Codeship service to do the actual deployment, broken down into a few steps:

Step 1: Setting up project in Codeship

Step 2: Configure Tests

Step 3: Configure Environment Variables

Step 4: Configure Deployment

Each step includes both screenshots and configuration examples you'll need to get the workflow set up and running for your site.
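Codeship handles the S3 sync itself, but the CloudFront cache-clearing step can also be scripted with the AWS PHP SDK, along these lines (the distribution ID is a placeholder, and the post wires this into the pipeline rather than using this exact code):

```php
<?php
// Sketch: invalidate cached objects in a CloudFront distribution after a
// deploy. The distribution ID is a placeholder.

require 'vendor/autoload.php';

use Aws\CloudFront\CloudFrontClient;

$cloudFront = new CloudFrontClient([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

$cloudFront->createInvalidation([
    'DistributionId'    => 'E1EXAMPLE',
    'InvalidationBatch' => [
        'CallerReference' => 'deploy-' . time(), // must be unique per request
        'Paths' => [
            'Quantity' => 1,
            'Items'    => ['/*'], // everything; narrow this for larger sites
        ],
    ],
]);
```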

Sometimes when you’re building a project there are parts of the architecture that exist on production that don’t exist on your development machine. Those missing parts (like proprietary software that’s specific to your hosting provider) can sometimes mean unwelcome surprises when you deploy to production.

Recently as part of my work on Mergebot, I decided to address this. My local machine was missing the AWS Elastic Beanstalk Worker Environment SQS daemon (known as SQSD). AWS isn’t open source so there’s, unfortunately, no official way to replicate it. So I decided to build a small PHP command line (CLI) app to attempt to replicate its functionality. In this article, I’m going to cover some of the aspects of creating a command line app in PHP and explain how I implemented them for my replica SQSD CLI.

He starts off with a brief overview of the Laravel queue worker and how it compares to the SQSD functionality. He then starts in on the code to create the daemon (outside of a framework), adding the while loop that keeps it running as a daemon and using the SQSD Worker class as a base. The post ends with some instructions on packaging up the command line tool using the phar functionality already included in the PHP language.
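At its core, SQSD long-polls a queue and POSTs each message to a local HTTP worker, deleting it on success. A stripped-down sketch of that loop, with the queue URL and worker endpoint as placeholders, might look like this:

```php
<?php
// Sketch of an SQSD-style loop: receive from SQS, forward to a local
// worker over HTTP, delete on success. URLs are placeholders.

require 'vendor/autoload.php';

use Aws\Sqs\SqsClient;

$sqs      = new SqsClient(['version' => 'latest', 'region' => 'us-east-1']);
$queueUrl = 'https://sqs.us-east-1.amazonaws.com/123456789012/worker-queue';

while (true) {
    $result = $sqs->receiveMessage([
        'QueueUrl'        => $queueUrl,
        'WaitTimeSeconds' => 20, // long polling
    ]);

    foreach ($result['Messages'] ?? [] as $message) {
        // Forward the message body to the local worker, as SQSD does.
        $ch = curl_init('http://localhost/worker/queue');
        curl_setopt_array($ch, [
            CURLOPT_POST           => true,
            CURLOPT_POSTFIELDS     => $message['Body'],
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
        ]);
        curl_exec($ch);
        $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_close($ch);

        // Only acknowledge the message if the worker handled it.
        if ($status === 200) {
            $sqs->deleteMessage([
                'QueueUrl'      => $queueUrl,
                'ReceiptHandle' => $message['ReceiptHandle'],
            ]);
        }
    }
}
```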

Logging and information debugging can be approached from a multitude of different angles. Whether you use an application framework or are coding from scratch, it’s always comforting to have familiar components and tools across different projects. In our examples today, I am going to enable Amazon CloudWatch Logs logging with a PHP application. To accomplish this, I wanted to use an existing solution that is both already popular and well used, and that is standards compliant. For these reasons, we are going to use the open source log library, PHP Monolog (https://github.com/Seldaek/monolog).

They start by introducing the Monolog library for those not familiar with it and how it relates to the PSR-3 standard. The ultimate goal of their implementation is to allow the logs to be shipped to CloudWatch and to implement some alerting around them. The tutorial then kicks in, showing how to use Composer to install Monolog and an add-on to interface with CloudWatch. Code is provided to set up the initial logger and to have it log messages to different places. They then move over to CloudWatch and define a filter on the JSON data to find successful logins to your application. They also show how to use this same functionality in a Laravel application, contained in a test route.
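A minimal version of that wiring could look like the following. The handler package is assumed here to be maxbanton/cwh, and the group and stream names are placeholders, so treat this as a sketch rather than the article's exact code:

```php
<?php
// Sketch: ship Monolog records to CloudWatch Logs as JSON.
// Log group/stream names are placeholders.

require 'vendor/autoload.php';

use Aws\CloudWatchLogs\CloudWatchLogsClient;
use Maxbanton\Cwh\Handler\CloudWatch;
use Monolog\Logger;
use Monolog\Formatter\JsonFormatter;

$client = new CloudWatchLogsClient([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// Arguments: client, log group, log stream, retention in days.
$handler = new CloudWatch($client, 'php-app-logs', 'app-stream', 14);
$handler->setFormatter(new JsonFormatter()); // JSON records make metric filters simple

$logger = new Logger('app');
$logger->pushHandler($handler);

// A record like this is what a CloudWatch filter for successful logins would match.
$logger->info('User successfully logged in', ['username' => 'jdoe']);
```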

The Amazon Web Services blog has posted the second part of their series covering the automated deployment of encrypted web services with the AWS SDK. In this new tutorial (part two, part one is here) they continue with the deployment of services: AWS Elastic Beanstalk, Amazon Route 53 and Amazon CloudFront.

In the first post of this series, we focused on how to use Amazon Route 53 for domain registration and use AWS Certificate Manager (ACM) to create SSL certificates. With our newly registered domain available for use, we can proceed to deploy and configure the services we need to host the www.dev-null.link website across an encrypted connection. Once complete, the infrastructure configuration will reflect the diagrams [included in the post].

The tutorial then walks you through each of the services you need to deploy and shares the code (using the AWS PHP SDK) to show how to automate the process. There are also a few screenshots included of various page results and admin UIs to help you be sure you're in the right place.
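To give a taste of the SDK-driven approach, here's a hedged sketch of one of those steps, creating an Elastic Beanstalk application and environment; the names and solution stack string are placeholders, and the post's real scripts go on to cover the Route 53 records and CloudFront distribution as well:

```php
<?php
// Sketch: create an Elastic Beanstalk application and environment via the
// SDK. Names and the solution stack string are placeholders.

require 'vendor/autoload.php';

use Aws\ElasticBeanstalk\ElasticBeanstalkClient;

$eb = new ElasticBeanstalkClient(['version' => 'latest', 'region' => 'us-east-1']);

$eb->createApplication(['ApplicationName' => 'dev-null-link']);

$eb->createEnvironment([
    'ApplicationName'   => 'dev-null-link',
    'EnvironmentName'   => 'dev-null-link-prod',
    'SolutionStackName' => '64bit Amazon Linux 2016.09 v2.3.1 running PHP 7.0',
    'CNAMEPrefix'       => 'dev-null-link',
]);
```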

The TutsPlus.com site has continued their "Programming with Yii2" series with this new tutorial covering the use of the Amazon S3 service for storing files remotely in your application.

In today's tutorial, I'll walk you through the basics of browsing, uploading and downloading files to and from Amazon's cloud-based S3 storage service. Essentially, I've created a simple storage model and controller as examples which you can extend for your needs.

He starts with a brief introduction to the S3 service (including a video from Amazon themselves) and what kinds of things it could be used for. He helps you get started via the AWS web GUI, creating an S3 "bucket" and viewing its contents. He shows how to get the credentials you'll need to connect to the bucket and how to define them in the ini configuration file. The tutorial then shows how to use this AWS extension for Yii2 to connect to and work with the S3 bucket you've created. This includes browsing the content, uploading new files and downloading current ones.
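Whatever extension sits on top, those three operations reduce to a few calls on the underlying AWS PHP SDK; here's a rough sketch with placeholder bucket and key names:

```php
<?php
// Sketch of the three basic S3 operations: list (browse), upload, download.
// Bucket name and keys are placeholders.

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3     = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);
$bucket = 'my-yii2-files';

// Browse: list the bucket's contents.
$objects = $s3->listObjects(['Bucket' => $bucket]);
foreach ($objects['Contents'] ?? [] as $object) {
    echo $object['Key'], ' (', $object['Size'], " bytes)\n";
}

// Upload a new file from the local filesystem.
$s3->putObject([
    'Bucket'     => $bucket,
    'Key'        => 'docs/report.pdf',
    'SourceFile' => '/tmp/report.pdf',
]);

// Download an existing file to disk.
$s3->getObject([
    'Bucket' => $bucket,
    'Key'    => 'docs/report.pdf',
    'SaveAs' => '/tmp/report-copy.pdf',
]);
```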

The main highlight of re:Invent is always the keynotes and the new services and features announcements they make during the keynotes. One of the new services caught my attention, and I decided to give it a try. That service is AWS CodeBuild.

CodeBuild is designed to be used as part of the AWS CodePipeline, but it may also be used by itself. [...] Out of the box, CodeBuild provides some managed images that you may use to build your projects. These include environments for Android, Java, Python, Ruby, Golang, and Node.js. PHP is missing from this list, but since you’re able to use other images, I decided to see how easy it is to get up and running on CodeBuild with a PHP project. I chose to try out my ramsey/uuid library for a simple test.

He walks you through the creation of a new CodeBuild instance (complete with screenshots of the UI) and how to configure your project, explaining each of the settings as he goes. He includes the full build command he's using for the library: running tests, a lint check, and codesniffer checks for formatting. He shows how to get the project to build and what the UI will show when the build is successful (all green).

Handling file uploads sucks. Code-wise it's a fairly simple task: the files get sent along with a POST request and are available server-side in the $_FILES superglobal. Your framework of choice may even have a convenient way of dealing with these files, probably based on Symfony's UploadedFile class. Unfortunately, it's not that simple.

[...] For most situations using S3 is a no-brainer, but the majority of developers transfer their users' uploads to S3 after they have received them on the server side. This doesn't have to be the case: your user's web browser can send the file directly to an S3 bucket. You don't even have to open the bucket up to the public. Signed upload URLs with an expiry will allow temporary access to upload a single object.

He points out two advantages of this method: that you don't have to handle the upload part of file uploads and that it gives the user more control. He shares a video of the end result (a simple file upload frontend) and the code that you'll need to use the AWS PHP SDK to make it all work together. There are some configuration changes that'll need to be made on the S3 bucket side (like for CORS), but the code itself to make the connection is relatively simple. He does a great job of explaining every step of the way and includes the JavaScript needed for the frontend as well.
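The server-side half of that flow is small; here's a sketch of generating a time-limited signed upload URL with the AWS PHP SDK, with the bucket and key as placeholders:

```php
<?php
// Sketch: create a presigned PUT URL that lets the browser upload a single
// object directly to S3. Bucket and key are placeholders.

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

$command = $s3->getCommand('PutObject', [
    'Bucket' => 'my-upload-bucket',
    'Key'    => 'uploads/' . uniqid() . '.jpg',
]);

// The signature expires after 20 minutes, so the bucket itself never has
// to be opened up to the public.
$request = $s3->createPresignedRequest($command, '+20 minutes');

echo (string) $request->getUri(); // hand this URL to the frontend JavaScript
```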