Going live with NunDB

I needed a realtime database for two personal projects I was working on: a time tracker (not public yet) and http://ebelle.vilarika.com.br. The natural decision would have been to use something like Firebase, but I was looking for a lightweight open-source alternative. It turns out I did not find one as simple as I wished, so I decided to build one. My needs were simple: I need my frontend to be updated whenever the data changes in the backend (and I do not want to implement a new WebSocket layer for each of my applications). Here I will not give many details about how I built NunDB. You can read about that [not public yet]; instead I will focus on how I went to production with it for my project Ebelle.

First of all, I needed it to be easy to deploy and use in different projects. Next, it would need to be observable: I would like to see how CPU, memory, and network behave in a real-life situation.

Deploy

My applications are all deployed with Docker, so it was natural for me to go in that direction (also because it makes it simple to move to k8s in the future). Initially, I created a simple Dockerfile that could build and deploy the application.

FROM rust:1.40 as builder
WORKDIR /usr/src/nun-db
COPY . .
RUN cargo install --path .

A direct and straightforward Docker image that compiles the source code, but it resulted in a 1GB image. So I added a second stage based on a slim Debian base image, and the final version of the Dockerfile looks like the following.

FROM rust:1.40 as builder
WORKDIR /usr/src/nun-db
COPY . .
RUN cargo install --path .

FROM debian:buster-slim
RUN apt-get update
RUN apt-get install -y libssl-dev
COPY --from=builder /usr/local/cargo/bin/nun-db /usr/local/bin/nun-db
CMD ["sh" , "-c", "nun-db -u ${NUN_USER} -p ${NUN_PWD} start"]

With that, my final version resulted in a Docker image of 35MB. I still want it to be smaller, but that is good enough to continue.
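If you want to try the image locally before wiring it into docker-compose, a minimal build-and-run could look like the sketch below; the user and password values are placeholders, and the ports are the WebSocket, HTTP, and socket ports used in the compose file in the next section.

docker build -t mateusfreira/nun-db .
docker run -d \
  -e NUN_USER=<user> \
  -e NUN_PWD=<password> \
  -p 3012:3012 -p 3013:3013 -p 3014:3014 \
  mateusfreira/nun-db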

Adding it to docker-compose.yml

version: '3'
services:
...
 nun-db:
   image: "mateusfreira/nun-db"
   ports:
     - "3012:3012" # WebSocket
     - "3013:3013" # HTTP
     - "3014:3014" # Socket
   environment:
     - NUN_DBS_DIR=/nun_data
     - RUST_BACKTRACE=1
     - NUN_USER=*****
     - NUN_PWD=*****
...

Setting up Nginx

server {
    server_name nun-db.vilarika.com.br;
    location / {
        proxy_pass http://<server-ip-here>:3012;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        # Needed so Nginx forwards the WebSocket upgrade handshake
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_read_timeout 600; # As this is a WebSocket service, this needs to be big
    }
    # If you do not know Certbot, you should visit https://certbot.eff.org and learn about it
    listen [::]:443 ssl; # managed by Certbot
    listen 443 ssl; # managed by Certbot
    # ....
}
server {
    if ($host = nun-db.vilarika.com.br) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    listen 80;
    listen [::]:80;
    server_name nun-db.vilarika.com.br;
    return 404; # managed by Certbot
}

Front end connection to NunDB

I have a React application built with Redux, so I decided to go the easy route and connect NunDB-js to the Redux store. The important thing is that I connect to the NunDB organization database when the user logs in to the application. For that, in the middleware I watch for the login event to start NunDB and then start watching the 'lastTreatment' key. For me, that is the most important event: it triggers my watcher every time a treatment changes, and all I do is dispatch upateTreatmentsAction.

// nundb.js
import NunDb from 'nun-db';
import {
  LOGIN_SUCCESS_ACTION,
} from './redux/store/ducks/login';
import {
  upateTreatmentsAction,
} from './redux/store/ducks/calendar';

const dbStore = {
  nun: null,
};

const dbMiddleware = (store) => (next) => (action) => {
  next(action);
  const company = action.payload ? action.payload.company : action.company;
  if (action.type === LOGIN_SUCCESS_ACTION) {
    const {
      id,
      nunDbPwd,
    } = company;

    const nun = new NunDb('wss://nun-db.vilarika.com.br', `org-${id}`, nunDbPwd);
    nun.watch('lastTreatment', (event) => {
      store.dispatch(upateTreatmentsAction(event.value));
    });
    dbStore.nun = nun;
  }
};

export {
  dbMiddleware,
  dbStore,
};
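For completeness, here is a minimal sketch of how the middleware can be registered when the Redux store is created. This assumes a standard Redux setup; the rootReducer import path is hypothetical, and only dbMiddleware comes from the file above.

// store.js (sketch)
import { createStore, applyMiddleware } from 'redux';
import rootReducer from './redux/store/ducks'; // hypothetical root reducer
import { dbMiddleware } from './nundb';

// dbMiddleware sees every dispatched action, so it can react to
// LOGIN_SUCCESS_ACTION and open the NunDB connection for the company.
const store = createStore(rootReducer, applyMiddleware(dbMiddleware));

export default store;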

Server-side changes

My server side is written in Scala, so the code may look different for you, but it is simple. I split the changes into three steps.

1) Add the util class to talk to NunDB.

package code
package util

import code.model._
import dispatch._
import net.liftweb._
import json._
import common._
import http.js._
import JE._
import net.liftweb.util.Helpers
import net.liftweb.json._

import java.util.Date


object NunDbUtil extends net.liftweb.common.Logger {
  val authData = "auth ****** *****;" // Omitted for obvious reasons
  val host = "nun-db" // Internal Docker container name
  lazy val apiv1 = :/(host, 3013) // Port 3013 is the HTTP port of NunDB

  /**
   * Replicates any org we create in our Postgres database to NunDB.
   */
  def createCompany(company: Company) = {
    info("Will create the org " + company.id.is.toString + " in nun-db")
    /**
     * The HTTP interface of NunDB today uses a semicolon-separated pattern; that is how we can
     * execute multiple commands with one single request (NunDB does not use any kind of HTTP session).
     */
    val jsonToSend = authData + " create-db org-" + company.id.is.toString + " " + company.nunDbPwd.is + "; use-db org-" + company.id.is.toString + " " + company.nunDbPwd.is + "; snapshot;"
    val http = new Http
    val request = NunDbUtil.apiv1 / ""
    val ret = http(request <<? Map() << jsonToSend as_str)
    info("nun-db request result: " + ret.toString)
  }

  def updateTreatment(company: Company, id: Long) = {
    try {
      val jsonObj = Treatment.findByKey(id) match {
        case Full(treatment) => {
          compact(JsonAST.render(Treatment.encodeAsJson(treatment))) // Convert to JSON
        }
        case _ => {
          info("No treatment to update")
          "{ \"noTreatment\": true }"
        }
      }
      /**
       * Same semicolon-separated pattern: authenticate, select the org database,
       * and set the lastTreatment key in a single request.
       */
      val jsonToSend = authData + " use-db org-" + company.id.is.toString + " " + company.nunDbPwd.is + "; set lastTreatment { \"id\": " + System.currentTimeMillis() + ", \"value\" : " + jsonObj + "}"
      val http = new Http
      val request = NunDbUtil.apiv1 / ""
      val ret = http(request <<? Map() << jsonToSend as_str)
      info("nun-db request result: " + ret.toString)
    } catch {
      case e: Exception => {
        error("Error in nun-db request", e)
      }
    }
  }
}
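To make the semicolon-separated protocol concrete, the update request above is roughly equivalent to the following curl call. This is a sketch with placeholder credentials and org id, and it assumes a plain POST body, which is what the dispatch `<<` operator sends.

curl -X POST http://nun-db:3013/ \
  --data 'auth <user> <pwd>; use-db org-1 <org-pwd>; set lastTreatment { "id": 1592000000000, "value": { "noTreatment": true } }'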

2) Add to the database layer the code to replicate changes to NunDB.

2.1) The model: I decided to model the relationship between the app database and NunDB so that, for each company in the database, I create a database in NunDB. That way they are kept separate and isolated, each protected by its own credentials. Therefore I added createCompany to be called by the company creation actor, as in the sketch below.
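A minimal sketch of what that call looks like. The CompanyCreateActor and CompanyCreated names are hypothetical (the real company creation flow is not shown here); only NunDbUtil.createCompany comes from the util class above.

package code
package actors

import net.liftweb.actor._
import code.model._
import code.util._

// Hypothetical message and actor names, shown only to illustrate the call site
case class CompanyCreated(company: Company)

object CompanyCreateActor extends LiftActor {
  protected def messageHandler = {
    // When a company is created in Postgres, replicate it as a new NunDB database
    case CompanyCreated(company) => NunDbUtil.createCompany(company)
    case _ =>
  }
}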

2.2) Add the update call to the model whose changes I wanted to replicate to NunDB.


class Treatment extends UserEvent
  with LogicalDelete[Treatment]
  with PerCompany
  // ... lots of others here... I really liked using this multi-extends
  with WithCustomer with net.liftweb.common.Logger {
  // ...
  def save() = {
    // ...
    NunDbActor ! TreatmentUpdate(this.id, this.company.obj.get) // Send to an actor so we do not block the operation
    super.save()
  }
  // ...
}


I decided to use an actor to make sure this does not impact performance, since this update will run many times a second. The actor itself is quite simple: it just calls the method updateTreatment.


package code
package actors

import net.liftweb._
import util._
import actor._
import code.model._
import code.util._

object NunDbActor extends LiftActor with net.liftweb.common.Logger {

  def treat(treatmentUpdate: TreatmentUpdate) {
    NunDbUtil.updateTreatment(treatmentUpdate.company, treatmentUpdate.id)
  }

  protected def messageHandler = {
    case a: TreatmentUpdate => treat(a)
    case _ =>
  }
}

case class TreatmentUpdate(id: Long, company: Company)


See it in action

Here is a small demo of it working with the calendar feature of my SaaS product.

Nun-db in action

Ready to rock

Now my app is working and it is out there. Stay tuned for the next steps of NunDB.

PS

This is the first post in a series about NunDB and my journey developing and using it.

Written on June 14, 2020