Dockerize and orchestrate your fullstack MEAN app easily

It's worth knowing how easy it is to create a fullstack app and deploy it all on a single host using docker compose orchestration.

For my side projects, I'm using a very basic hosting provider: a dedicated server running Ubuntu Server for a few bucks a month.

I like using it to test side projects and, for convenience, I dockerize all the components and use docker compose to orchestrate all the containers.

The main goal here is to deploy your app quickly for testing and experimenting in the field, rather than dealing with distributed services.

Once you have extracted the value of your experiment and see traction for your app, switching to another architecture is of course worthwhile.

In this article, we will cover:

1. how to create and dockerize a MEAN app, and
2. how to orchestrate all the services using docker-compose.

Even though I will add some code snippets for each element of the technical stack, the focus will mainly be on the Docker part.

This means that you should be familiar with MySQL, Angular and Node.js, and have all dependencies already installed (node, npm, angular-cli).

Pre-requisite

Depending on your operating system, you will need to install the Docker daemon and docker-compose.

Create a MEAN app

MEAN usually stands for MongoDB, Express, Angular and Node.js, but in this article I prefer using MySQL instead of MongoDB.

Let's first structure our project with the following directory architecture:

```
mean-docker/
├── backend/
├── database/
└── frontend/
```

As you can see, I created a main directory called mean-docker and 3 sub-directories, one for each component of the project.

In this manner, each sub-directory will have its own Dockerfile.

Let’s see what they look like.

1. database Dockerfile

In the database folder, create a file called Dockerfile and simply add the content below:

```
# Create mysql server based on the official image from the dockerhub
FROM mysql:5

# Add a database & root password
ENV MYSQL_DATABASE=mean
ENV MYSQL_ROOT_PASSWORD=somePassword

# Optionally, run initial scripts for creating tables
COPY ./sql-scripts/ /docker-entrypoint-initdb.d/
```

Once built and loaded as a container (which we will cover later on), this will create a MySQL server with the root password you defined.

Also, you can create a directory called sql-scripts if you need to create initial tables and prefilled values at launch time.

Here are some examples:

```
# CreateTable.sql
CREATE TABLE jobs (title varchar(25), description varchar(50));

# InsertData.sql
INSERT INTO jobs (title, description) VALUES ('dev', 'awesome job');
```

2.1 backend architecture

We will of course use Node.js as the JavaScript runtime for our backend service.

We will rely on Express as the web application framework, and on the mysql driver for querying our MySQL database.

Here is what backend/package.json looks like:

```
{
  "name": "backend",
  "version": "0.0.0",
  "private": true,
  "scripts": {
    "start": "node app.js"
  },
  "dependencies": {
    "body-parser": "~1.18.3",
    "cors": "^2.8.5",
    "express": "~4.16.4",
    "mysql": "^2.16.0"
  }
}
```

Going into the backend folder and running npm install will install all the dependencies listed in package.json.

As you can see in the package.json file, if we run npm start, Node.js will load the file called app.js, which contains our API as well as our code to interact with MySQL.

Here is the content of app.js:

```javascript
// app.js
const express = require('express');
const path = require('path');
const http = require('http');
const bodyParser = require('body-parser');

// api.js for the routes
const api = require('./api');

const app = express();

// body parsing middleware
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: false }));

// all routes are falling back into api.js
app.use('/', api);

// HTTP port setting
const port = process.env.PORT || '3000';
app.set('port', port);

// HTTP server creation
const server = http.createServer(app);

// listening to all incoming requests on the set port
server.listen(port, () => console.log(`backend running on port:${port}`));
```

To structure our backend properly, we have separated the server configuration/loading (app.js) from the API logic (api.js).

Below is a very simple example of api.js that interacts with our MySQL database to retrieve and expose data:

```javascript
// api.js
const express = require('express');
const router = express.Router();
const mysql = require('mysql');
const cors = require('cors');

var con = mysql.createConnection({
  host: "database",
  user: "root",
  port: '3306',
  password: "somePassword",
  database: "mean",
  charset: 'utf8'
});

var corsOptions = {
  origin: '*',
  optionsSuccessStatus: 200
}

// initial connection
con.connect(function(err) {
  if (err) console.log(err);
});

// our simple GET /jobs API
router.get('/jobs', cors(corsOptions), (req, res) => {
  con.query("SELECT * FROM jobs", function (err, result, fields) {
    if (err) return res.send(err); // return early so we don't send twice
    res.send(result);
    console.log(result);
  });
});

module.exports = router;
```

What is important here is that we named our database host 'database'.

This link is in fact described later on in our docker-compose.yml file, in which we will link the database service to the backend service so that the backend can reach the database under that hostname.

As you can see, for example purposes, we have a single route, GET /jobs, that returns all the jobs stored in our MySQL service.
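As a side note, the host and password hardcoded in api.js could instead be read from environment variables, which docker-compose is able to inject into a container. A minimal sketch, assuming hypothetical DB_* variable names that are not part of the original code:

```typescript
// Hedged sketch: build the mysql connection options from environment
// variables, falling back to the hardcoded values used in this article.
// The DB_* names are assumptions, not part of the original code.
interface DbOptions {
  host: string;
  user: string;
  port: string;
  password: string;
  database: string;
  charset: string;
}

function dbOptions(env: Record<string, string | undefined>): DbOptions {
  return {
    host: env.DB_HOST || 'database', // matches the docker-compose service name
    user: env.DB_USER || 'root',
    port: env.DB_PORT || '3306',
    password: env.DB_PASSWORD || 'somePassword',
    database: env.DB_NAME || 'mean',
    charset: 'utf8',
  };
}
```

The resulting object can be passed straight to mysql.createConnection, so switching credentials no longer requires editing api.js.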

2.2 backend Dockerfile

Now that we have built our backend, let's dockerize it by first creating a Dockerfile:

```
# Create image based on the official Node image from the dockerhub
FROM node

# Create a directory where our app will be placed
RUN mkdir -p /usr/src/app

# Change directory so that our commands run inside this new directory
WORKDIR /usr/src/app

# Copy dependency definitions
COPY package.json /usr/src/app

# Install dependencies
RUN npm install

# Get all the code needed to run the app
COPY . /usr/src/app

# Expose the port the app runs in
EXPOSE 3000

# Serve the app
CMD ["npm", "start"]
```

Later on, when we create our backend docker image, we don't want to include the local node_modules folder in the image (as it will be recreated when docker runs npm install), so we create a simple .dockerignore file containing:

```
node_modules/
```

Our final backend folder content should look like:

```
backend/
├── .dockerignore
├── api.js
├── app.js
├── Dockerfile
└── package.json
```

Let's now create an Angular frontend and dockerize it.

3.1 frontend architecture

Go back to the root mean-docker folder and simply run the Angular CLI command to create an Angular project:

```
mean-docker$ ng new frontend
```

This will create the whole Angular structure.

Let’s focus simply on how to query our backend.

To do so, we will create a JobRepositoryService with a method that queries the GET /jobs API and asynchronously returns an array of Job using RxJS:

```typescript
import { Injectable } from '@angular/core';
import { HttpClient, HttpHeaders } from '@angular/common/http';
import { Observable } from "rxjs/index";
import { Job } from "./models/Job";

@Injectable({
  providedIn: 'root'
})
export class JobRepositoryService {

  constructor(private http: HttpClient) { }

  getJobs(): Observable<Job[]> {
    return this.http.get<Job[]>("http://localhost:3000/jobs");
  }
}
```

Then, in any of our Angular components, we can inject this JobRepositoryService and subscribe to the getJobs Observable:

```typescript
getJobs(): void {
  this.jobRepository.getJobs()
    .subscribe(result => {
      this.jobs = result;
      console.log("jobs " + JSON.stringify(result));
    });
}
```
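The service imports a Job model from ./models/Job, which this article does not show. Going by the columns created in CreateTable.sql (title, description), a minimal sketch of what models/Job.ts could look like:

```typescript
// Hedged sketch of models/Job.ts: a plain model mirroring the columns
// created by CreateTable.sql (title varchar(25), description varchar(50)).
export class Job {
  constructor(
    public title: string,
    public description: string
  ) {}
}
```

Since HttpClient simply casts the JSON rows returned by GET /jobs, an interface with the same two fields would work equally well.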

3.2 frontend Dockerfile

The frontend Dockerfile looks very much like the backend one, as they both use a Node image:

```
# Create image based on the official Node image from dockerhub
FROM node

# Create a directory where our app will be placed
RUN mkdir -p /usr/src/app

# Change directory so that our commands run inside this new directory
WORKDIR /usr/src/app

# Copy dependency definitions
COPY package.json /usr/src/app

# Install dependencies
RUN npm install

# Get all the code needed to run the app
COPY . /usr/src/app

# Expose the port the app runs in
EXPOSE 4200

# Serve the app
CMD ["npm", "start"]
```

The only difference here is that we expose our frontend on port 4200.

Do not forget to also create the .dockerignore file with the following content, for the same reasons:

```
node_modules/
```

4. Docker compose

Now that all our services are expressed with their own Dockerfile, we can describe the orchestration by declaring a simple docker-compose.yml file located in the root mean-docker folder:

```yaml
version: '2' # specify docker-compose version

# Define the services/containers to be run
services:

  frontend: # name of the first service
    build: frontend # specify the directory of the Dockerfile
    ports:
      - "4200:4200" # specify port forwarding
    container_name: front-container
    restart: always

  backend: # name of the second service
    build: backend # specify the directory of the Dockerfile
    ports:
      - "3000:3000" # specify port forwarding
    container_name: back-container
    restart: always
    links:
      - database # link this service to the database service

  database: # name of the third service
    build: database # specify the directory of the Dockerfile
    container_name: database-container
    restart: always
```

We have 3 distinct services in our MEAN stack.

Each of them has its own Dockerfile, so in the build node we point to the folder where the Dockerfile is located. We also define the port forwarding. And most importantly, we create a link between the backend and the database containers by specifying the links node.

Now, to build the images and launch the containers, we just have to run the following command:

```
$ docker-compose up --build -d
```

And to check the status of the containers, we have the following command:

```
$ docker container ps
```

Your app is now running and you can access your frontend at localhost:4200. If you encounter an error, you can try to adjust the start script in frontend/package.json as follows:

```
"start": "ng serve --host=0.0.0.0 --disable-host-check",
```

Conclusion

As you can see, creating a dockerized MEAN stack is very easy.

You simply need to structure your project in such a way that each of your services has its own Dockerfile.

Then, by using docker-compose, you can run your multi-container docker app and create links between services.

I'm personally using this orchestration approach to quickly create MVPs, in order to focus on the learning from the MVP experiment rather than on scalability at this early stage.
