Category: NodeJS

In my previous post in this series I covered linking GitHub and Docker Hub, and configuring the environment so that a Docker image build is triggered whenever the GitHub repository is updated. In this final post of the series I will take you through the steps to pull the image from Docker Hub into OCCS in order to run the application. Note that the image built on Docker Hub in my example is only the web tier containing my Node.js project (APIs and SwaggerUI). The MongoDB component of my OCCS stack is pulled directly from Docker Hub when the stack containing the Web Tier and Database Tier services is deployed to OCCS. Continue reading “Exploring GitHub Docker Hub and OCCS Part 4”

In my previous post I detailed how I Dockerised the MedRec app. In this post I will show how I added MongoDB and defined a stack using Docker-Compose.

Add a MongoDB layer using Docker-Compose

“Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a Compose file to configure your application’s services. Then, using a single command, you create and start all the services from your configuration.”

A single command to create and start all the services in a configuration sounded pretty good to me, so I was definitely keen to explore docker-compose.

Add a docker-compose.yml file

Having proved that my web application starts up, I now need to address the persistence layer. The Dockerfile above contains the steps to create the required runtime platform for my Node app, and installs the Node application and its package dependencies (as specified in the package.json file) by running npm install. However, if I tried to do a GET or a PUT, my app would fail because it won’t find a MongoDB inside my container. I therefore still need a MongoDB somewhere in my environment to hold my application data. Continue reading “Exploring GitHub, DockerHub and OCCS Part 2”
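The full post walks through the file in detail, but a minimal docker-compose.yml along these lines covers both tiers; the service names, port and the MONGO_URL environment variable are illustrative and assume the Node app reads its MongoDB connection string from the environment:

version: "2"
services:
  web:
    build: .                                      # build the web tier from the Dockerfile above
    ports:
      - "3000:3000"                               # port is illustrative
    environment:
      - MONGO_URL=mongodb://mongodb:27017/medrec  # hostname resolves to the mongodb service below
    depends_on:
      - mongodb
  mongodb:
    image: mongo:3.4                              # official MongoDB image pulled from Docker Hub

With a file like this in place, a single docker-compose up -d creates and starts both services, which is exactly the behaviour the Compose documentation describes.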

In my previous post in this series I provided an Introduction describing the high-level steps I planned to take.
In this post I will walk through the detailed steps to Dockerise the MedRec application.

Dockerise the MedRec APIs

Git clone the project repository

I used my Windows Surface Pro 4 with Oracle VM VirtualBox installed to host my development VM. I managed to source a VirtualBox image that already had Ubuntu 16.04 and Docker installed, which helped get me started. In my development environment on my laptop, I created a directory under my home directory named gitprojects.
I then cd into that directory:

cd gitprojects

Continue reading “Exploring GitHub Docker Hub and OCCS Part 1”
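As a rough sketch of that setup (the full post covers the exact steps), the commands below create the directory, clone the project into it and run the app locally; the repository URL is only a placeholder, not the real MedRec repository:

mkdir ~/gitprojects
cd ~/gitprojects
# clone the Node.js MedRec project (URL is illustrative)
git clone https://github.com/your-org/medrec-api.git
cd medrec-api
# install the package.json dependencies and start the app locally
npm install
npm start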

As part of our MedRec API playpen initiative we had already developed some REST APIs using Node.js, leveraging MongoDB running on Oracle IaaS as the persistence layer. This post describes what I did to Dockerise the MedRec API application and eventually run it on the Oracle Container Cloud Service (OCCS).

The APIs have already been made available for interested parties to interact with via SwaggerUI; of course, developers could write their own code (or use any REST client) to interact with them. As a team we used a combination of the Oracle Developer Cloud Service (Git repository, Issue Tracker, Build Server etc.) and the public GitHub to provide public access to our project code. As the source code for the Node.js project containing the APIs was pushed to GitHub, I simply did a clone of the Node application code in order to download and run it locally (MedRec API tutorials available here).

The application ran well enough on my local laptop running Ubuntu 16.04; however, I really wanted to be able to run the app and MongoDB as a Docker image/stack on my laptop. Once I had the application successfully “Dockerised”, I planned to deploy my application stack to the Oracle Container Cloud Service (OCCS). I also wanted to explore the GitHub / Docker Hub integration to build my image on Docker Hub. With the application image available on Docker Hub, I could then pull my image from that source in order to run it on OCCS.

A good blog can really bring you up to speed quickly and help you overcome the inertia of getting started, and I would like to acknowledge the help that Mauricio Payetta’s blog provided me.

In this blog, I am going to show you how to get started with the Loopback framework to easily auto-build REST APIs in NodeJS, with a persistence layer backed by a variety of options, including relational and non-relational databases, e.g. an in-memory DB, MongoDB, MySQL, Cassandra, Oracle, etc.

In terms of API design and development, Loopback allows you to work “top-down” or “bottom-up”. I am going to cover both approaches in this blog.

First, we are going to create the API model definition in place as we build the REST APIs; this exercise will give us a Swagger-based API definition. Alternatively, we are going to start from an existing Swagger definition and use it to implement NodeJS REST APIs pointing to a persistence layer of choice (in-memory DB, MongoDB, MySQL, DB2, Oracle, etc.). I personally prefer the “API First / Top Down” approach, as it gives me the option to properly design and test my APIs first and then simply move to the implementation phase, but this ultimately depends on the situation, preferences and requirements.
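As a rough sketch of both approaches using the LoopBack CLI (the application, datasource, model and file names below are illustrative, and the exact generators available depend on the CLI version installed):

npm install -g loopback-cli      # install the LoopBack command-line tools

# bottom-up: scaffold an app, then add datasources and models interactively
lb app medrec-api                # create a new LoopBack application
cd medrec-api
lb datasource mongoDS            # e.g. choose the MongoDB connector and connection details
lb model Patient                 # define properties; REST endpoints are generated for the model

# top-down: generate models and endpoints from an existing Swagger 2.0 definition
lb swagger ./medrec-api.yaml     # path to the Swagger spec is illustrative

Either way, the generated application exposes the REST endpoints together with the built-in API Explorer, which you can use to exercise the APIs as you go.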

How to use Thinxtra devices and the Sigfox network with the Oracle IoT Cloud Service.

Introduction

There are lots of activities happening today in the world of IoT (Internet of Things). The market is growing at a staggering pace. Oracle, of course, is providing services in this area, mainly to support our many great SaaS applications. Almost every application can benefit from data coming from devices on machines, automobiles, medical devices, human wearables and the like. However, there are several issues people face.

The Oracle IoT Cloud Service is designed to help with these issues, but it is often overwhelming to get a given device’s data into the IoT cloud initially. A case in point is the wonderful devices from Thinxtra, which use the Sigfox network.

Nowadays, with the adoption of serverless architectures, microservices are becoming a great way to break a problem down into smaller pieces. One common situation is multiple backend services running on technologies like NodeJS, Python, Go, etc. that need to be accessible via HTTPS. It is possible to enable SSL over HTTPS on these internal microservices directly, but a cleaner approach is to use a reverse proxy that front-ends these microservices and provides a single HTTPS access channel, allowing simple internal routing.

In this blog, I show how simple it is to create this front end with Nginx, leveraging “Let’s Encrypt” to generate trusted certificates for it, with strong security policies, so that our website can score an A+ on SSL tests conducted by third-party organizations.
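A minimal sketch of such an Nginx front end is shown below; the domain name, upstream ports and routing paths are illustrative, and the certificate locations assume the defaults that certbot creates for Let’s Encrypt certificates:

server {
    listen 443 ssl;
    server_name api.example.com;                 # illustrative domain

    # certificates issued by Let's Encrypt (default certbot locations)
    ssl_certificate     /etc/letsencrypt/live/api.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/api.example.com/privkey.pem;
    ssl_protocols       TLSv1.2;                 # restrict to a strong protocol

    # single HTTPS entry point, routed internally by path
    location /orders/ {
        proxy_pass http://127.0.0.1:3000/;       # e.g. a NodeJS microservice
    }
    location /payments/ {
        proxy_pass http://127.0.0.1:5000/;       # e.g. a Python or Go microservice
    }
}

Redirecting plain HTTP to this server block and adding an HSTS header are the kinds of extra policies that typically make the difference between an A and an A+ rating.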

