How to Break a Monolith into Microservices

August 18, 2020

Converting an Application from Monolithic to Microservices

Hello friends, how are you? We are at the end of our microservices article series. This article is especially important because we are going to discuss how to convert an existing monolithic application to microservices. This topic is in huge demand because there are many existing monolithic applications and everyone wants to benefit from microservices. So let us begin.

Understanding the Microservices Ecosystem

The conversion won't make much sense unless you are equipped with knowledge of the microservices ecosystem. We are about to convert a monolithic application to microservices, but why? The ecosystem is meant for encapsulating business capabilities: each service represents the objectives and responsibilities of the business in a particular domain. Each microservice exposes an API that developers discover and use, and each has an independent life cycle, so developers can independently build, test, and release it.

The microservice ecosystem has an organized structure that comprises autonomous, long-standing teams, each responsible for one or more services. So if you think that the 'micro' in microservices refers to the small size of each service, that is not quite right. The system as a whole could be huge, and the size of every service follows from its functionality. In some cases, services start out small but grow large later on.

Let Us Begin with the Conversion

First of all, understand that the cost of decomposing an existing system into microservices is high, and the process can be time-consuming because it may take many iterations. Developers and architects need to analyze the need for the conversion precisely, so be clear about the requirements before you start. Now let us begin:

Step – 1: Preparation for a Simple and Fairly Decoupled Capability

A few things are required to begin the conversion. First of all, you need on-demand access to a deployment environment. You have to build continuous delivery pipelines that can build, test, and deploy independently executable services, and you need the ability to secure, debug, and monitor a distributed architecture. You should also show operational readiness in areas such as:

1. A service mesh: a dedicated infrastructure layer that runs a fast, reliable, and secure network of microservices

2. Container orchestration systems that provide a higher level of deployment infrastructure abstraction

3. Continuous delivery systems, such as GoCD, that build, test, and deploy microservices as containers
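One concrete piece of this operational readiness is that every independently deployed service needs a machine-checkable readiness signal the pipeline and orchestrator can poll. As a minimal sketch (the function name, dependency names, and payload shape here are illustrative assumptions, not part of any specific platform), a readiness endpoint might aggregate dependency checks like this:

```python
import json

def readiness(dependency_checks):
    """Aggregate dependency checks into a single readiness report.

    dependency_checks maps a dependency name (e.g. "database") to a
    zero-argument callable returning True when that dependency is healthy.
    An orchestrator or delivery pipeline can poll this after each deploy.
    """
    results = {name: bool(check()) for name, check in dependency_checks.items()}
    status = "ready" if all(results.values()) else "degraded"
    return json.dumps({"status": status, "checks": results})

# Example: one healthy dependency, one failing one.
report = readiness({"database": lambda: True, "cache": lambda: False})
```

A real service would expose this behind an HTTP endpoint; the point is that readiness is computed per service, so each service can be deployed and verified on its own.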

Begin with the capabilities that are already or partly decoupled from the monolith. These capabilities are the ones that don’t require many changes and aren’t client facing. They might not even need a data store. This becomes a sort of trial before switching over to major tasks. The delivery team needs to optimize delivery approaches, upskill team members, and build the minimum infrastructure for delivering independently deployable secure services.

Decouple simple edge services first, then follow a different approach for the capabilities that are deeply embedded in the monolithic system. Do edge services first because the delivery teams need to prove that they can operate microservices properly; use the edge services to practice the operational prerequisites, and only then switch over to splitting the monolith.
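The edge-services-first idea is usually realized with a strangler-style routing layer in front of the monolith: already-extracted routes go to their new services, everything else still falls through to the monolith. A minimal sketch (the route prefixes and service names are invented for illustration):

```python
# Routes already extracted to new edge services; everything else still
# falls through to the monolith. Grow this table one capability at a time.
EXTRACTED_ROUTES = {
    "/recommendations": "recommendations-service",
    "/email-preferences": "notifications-service",
}

def route(path):
    """Return the backend that should handle a given request path."""
    for prefix, service in EXTRACTED_ROUTES.items():
        if path == prefix or path.startswith(prefix + "/"):
            return service
    return "monolith"
```

As each capability is decoupled, its prefix is added to the table, so traffic shifts gradually without any change on the client side.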

Step – 2: Early Splitting of Sticky Capabilities

At this stage, we assume that the delivery teams are confident enough to build microservices, but their capability to solve sticky problems is still limited, and they might need to go back to the monolith to decouple the next set of capabilities. This is because a capability within the monolith is often leaky: it is not well defined as a domain concept, and many of the monolith's other capabilities depend on it. So developers need to find these sticky capabilities, deconstruct them into well-defined domain concepts, and only then turn those concepts into separate services.

Before the team can decouple future capabilities, it needs to tackle deconstructing the notion of 'session'. Until then, the extracted capabilities will remain entangled with the monolith through the leaky session concept. Also, avoid creating a 'session' service outside the monolith: that would simply reproduce, outside the monolith's process, the tight coupling that already exists within it.

Developers can extract microservices from the sticky capability one after the other, incrementally. For example, refactor ‘client wish list’ first and extract it to a new service. Then, refactor ‘client payment mode’, extract it and so on.
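To make the 'client wish list' example concrete, here is a hypothetical sketch of what the extracted service might look like. The key point from the session discussion above is that the caller passes only the domain data the capability needs (a customer id), so the leaky session concept stays inside the monolith (the class, method names, and in-memory store are all invented for illustration):

```python
class WishListService:
    """Stand-alone wish-list capability with its own store.

    In a real deployment this store would be the service's own database,
    not the monolith's shared schema.
    """

    def __init__(self):
        self._lists = {}  # customer_id -> list of product ids

    def add(self, customer_id, product_id):
        """Add a product to a customer's wish list, ignoring duplicates."""
        items = self._lists.setdefault(customer_id, [])
        if product_id not in items:
            items.append(product_id)

    def items(self, customer_id):
        """Return a copy of the customer's wish list."""
        return list(self._lists.get(customer_id, []))

# The monolith's caller extracts the customer id from its own session and
# passes it explicitly; the new service never sees the session object.
service = WishListService()
service.add("customer-17", "product-99")
```

After this extraction is stable, the same pattern is repeated for 'client payment mode' and the other pieces of the sticky capability, one at a time.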

Step – 3: Vertical Decoupling and Releasing the Data Early

The goal is to successfully release decoupled capabilities independently of the monolith, and that goal should guide every decision developers make while decoupling. A monolithic system has tightly integrated layers that are released together, and their dependencies cannot easily be broken. A common decoupling attempt begins by extracting the user-facing components and providing developer-friendly APIs for the new UI while the data remains locked in one schema and storage. This approach provides quick wins, such as changing the UI frequently, but with core capabilities the delivery teams can only move as fast as the slowest part: the data store. Until you decouple the data, the architecture is not truly microservices, and keeping all data in the same store goes against the decentralized data management characteristic of microservices.

The strategy is to move capabilities out vertically: decouple a core capability together with its data and redirect all front-end applications to the new APIs. Having multiple applications reading from and writing to the shared data is a major hindrance to decoupling, so the delivery teams need a data migration strategy that suits their environment, depending on whether they can redirect and migrate all data readers and writers at the same time.
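When not all readers and writers can be switched at once, one common transition strategy is dual-writing: writes go to both the old and the new store while reads stay on the old one, and reads flip over once the new store has caught up. A rough sketch under those assumptions (the class and its interface are invented; here plain dicts stand in for the two stores):

```python
class DualWriteRepository:
    """Writes to both stores during migration; reads from the chosen one.

    old_store and new_store can be any mapping-like objects; the
    read_from_new flag is flipped once the new store is in sync.
    """

    def __init__(self, old_store, new_store, read_from_new=False):
        self.old_store = old_store
        self.new_store = new_store
        self.read_from_new = read_from_new

    def put(self, key, value):
        # During migration every write lands in both stores.
        self.old_store[key] = value
        self.new_store[key] = value

    def get(self, key):
        store = self.new_store if self.read_from_new else self.old_store
        return store.get(key)

old, new = {}, {}
repo = DualWriteRepository(old, new)
repo.put("order-1", {"total": 30})
```

Once reads are verified against the new store, the old store and the dual-write layer can be retired, completing the vertical decoupling of the data.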

Step – 4: Decouple What Is Important and Changes Frequently

Decoupling a capability from the monolith is difficult. For example, in an online retail application, extracting a capability involves carefully extracting the capability's data, logic, and user-facing components, and redirecting traffic to the new service. Because this is a non-trivial amount of work, developers should continuously evaluate the cost of decoupling against the benefits they get. Regarding changes, the decoupled parts of the code that undergo continuous change deliver value fastest. Delivery teams can analyze the code commit patterns to find out what has changed the most, then overlay that with the product roadmap and portfolio to understand the most desired capabilities. Developers should also talk with business and product managers to understand which capabilities differentiate the business. For example, the 'client offers' capability in an online shopping application undergoes continuous experimentation, which makes it a good candidate for decoupling.
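The commit-pattern analysis described above can start very simply: count how often files under each capability's directory change. A toy sketch (the directory names and sample history are invented; in practice the changed paths would come from something like `git log --name-only`):

```python
from collections import Counter

def change_hotspots(changed_paths, top_n=3):
    """Count changes per top-level directory to surface the most-modified
    capabilities, which are often the best decoupling candidates."""
    counts = Counter(path.split("/", 1)[0] for path in changed_paths)
    return counts.most_common(top_n)

# Invented sample of file paths touched by recent commits.
history = [
    "offers/discount.py", "offers/rules.py", "offers/discount.py",
    "catalog/search.py", "shipping/rates.py", "offers/banner.py",
]
hotspots = change_hotspots(history)
```

Here 'offers' dominates the change history, which matches the intuition in the text: the capabilities under constant experimentation are the ones worth decoupling first.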

Step – 5: Do Not Decouple Code; Decouple the Capability

Developers can extract a service either by extracting its code or by rewriting the capability. There is a common belief that monolith decomposition is a matter of reusing the current implementation as it is and simply extracting it into a separate service. This belief comes from our bias towards the code we design and write, and it can hold the decomposition effort back: the cost of extracting and reusing code is high, and technical managers should account for it. On the other side, the delivery teams have the option of retiring the old code. Rewriting helps in revisiting the business capability and initiating a conversation with the business, and it simplifies the legacy process as well.

So when is a capability a good candidate for extraction by code reuse? Rarely; in general, teams are happier rewriting the capability as a new service, and retiring the old code isn't a bad idea! A large amount of the code is boilerplate that deals with environmental dependencies, such as accessing application configuration at runtime or accessing data stores, and most of that boilerplate has to be rewritten anyway. Furthermore, the existing capabilities usually weren't built around clear domain concepts.

There is a high chance that the legacy code has gone through many changes, which means high code toxicity and low suitability for reuse. Code reuse is a good idea only when the capability is still relevant and maps to a clean domain concept; otherwise, opt for rewriting it.

Step – 6: Go Macro and Then Micro

Identifying domain boundaries in a legacy monolith is not an easy task. To begin with, you can apply domain-driven design techniques. Even so, it is quite rare to see a large monolith converted into really small services; trying to identify fine-grained service boundaries up front can lead to a large number of anemic services.

That approach also creates a high-friction environment and does not lead to independent release and execution of the services; it only leads to a broken distributed system that is difficult to debug. There are some heuristics for how 'micro' a microservice should be: first, the size of the team; second, the time it takes to rewrite the service; third, the behavior it must encapsulate; and so on. The right size also depends on how many services your operations teams can handle. Begin with larger services, and then break each service into multiple smaller services as the boundaries become clear.

Conclusion

This article covers the basic steps for converting a monolithic application into a microservices-based application. First, we looked at the microservices ecosystem; it is important to know the ecosystem because you should know where you are heading, right? Then we discussed the step-by-step process of converting to a microservices-based application. It seems to be a tedious process, and it certainly is! But how tedious depends on the type of application.

Overall, converting from a monolithic to a microservices-based application can be time-consuming and costly, but whether it is worth it depends entirely on your requirements. Know the scope of your application: how far will it go, and how much value will the conversion bring? Of course, when it fits those requirements, opting for a microservices-based application is a good idea and can help in many aspects. Regarding our microservices series, I would like to thank you all for following it. I have thoroughly enjoyed our discussions. We shall meet soon with more interesting discussions. Till then, take care!

Here is the link to the previous article of this series.

James Lee

James Lee is a passionate software wizard working at one of the top Silicon Valley-based startups specializing in big data analysis. In the past, he has worked at big companies such as Google and Amazon. In his day job, he works with big data technologies such as Cassandra and Elasticsearch, and he is an absolute Docker technology geek and IntelliJ IDEA lover with a strong focus on efficiency and simplicity.
