In this post we focus on modernisation of monolithic databases
Data stores within an organisation can grow in complexity over time, especially as engineers, database admins, and the business look to use the same data store in multiple ways for cost-effective solutions. We find that organisations optimise for cost by extending common data stores with new tables, relationships, attributes, etc. to serve new solution needs, and this growth brings inherent complexity. Interestingly, this explicit, measurable early cost optimisation later impacts agility and change, leading to lost revenue that is implicit and hard to measure – which is why we feel it before we see it!
While the single data store paradigm can serve many solution contexts, it can become a bottleneck, especially when the rest of the enterprise is rapidly delivering change. A change as small as altering a few attribute types on a few tables can cascade into massive programmes of work due to the hidden impact on various consumers. This is why many organisations want to move to more autonomous models with solution-context-specific data stores
The key to achieving this state is moving from a monolithic data store to a microservices architecture, and the way to do this is via techniques like Domain Driven Design (DDD), which starts with business- and business-capability-led change rather than a technology-led decomposition
Approaching Monolith Decomposition
When staring at a monolithic database decomposition problem, there are two approaches to unravelling the complexity. The first is driven by technologists close to the beating heart of the problem and is motivated by finding technical decomposition solutions (SQL vs NoSQL, partitioning by region, microservices over a single database presenting different services, etc.). This approach can render quick initial proofs of concept, given the team is close to the monolith, but it may struggle in the long run as it appears to be another “IT-led refactoring exercise”

The second approach to this decomposition problem is to start with business capabilities and the data needed to serve those capabilities in specific solution contexts. We take this information model by context and break the monolith along business needs. Technology implementation comes later!
Using DDD to lead Monolith Decomposition Programs
We can approach monolith decomposition using Domain Driven Design (DDD), which is a fantastic design technique for orienting software engineering to business domains. How do we do this? The steps are the same as those for strategic and tactical DDD
Some of the steps of this process are: discover and classify top-level business domains, find capabilities for each business domain, map business capabilities to business processes, and for each process capture the specific data requirements and actions (commands, queries). This process leads to a rich set of bounded contexts
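The discovery output above can be captured as simple records. Here is a minimal sketch in Python; the domain, capability, and command names are invented for illustration, not taken from any real programme:

```python
from dataclasses import dataclass, field

@dataclass
class BusinessProcess:
    name: str
    commands: list  # state-changing actions the process needs
    queries: list   # read-only questions the process asks

@dataclass
class BoundedContext:
    domain: str                # top-level business domain
    capability: str            # business capability within that domain
    processes: list = field(default_factory=list)

# Example discovery output for one capability (names are hypothetical)
onboarding = BoundedContext(
    domain="Sales",
    capability="Customer Onboarding",
    processes=[
        BusinessProcess(
            name="Register Customer",
            commands=["CreateCustomer", "VerifyIdentity"],
            queries=["FindCustomerByEmail"],
        )
    ],
)
```

Even a lightweight artefact like this makes the commands and queries per context explicit, which is exactly the information we need later when mapping back to the monolith.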
The key to this business-led decomposition is building context-specific domain models which capture what information management happens in a specific solution context
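To make "context-specific" concrete: the same business entity often carries different information in different contexts. A small hypothetical sketch, with all attributes invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class SalesCustomer:
    # Sales context cares about contact and lead information
    customer_id: str
    name: str
    contact_email: str
    lead_source: str

@dataclass
class BillingCustomer:
    # Billing context cares about invoicing information
    customer_id: str
    billing_address: str
    payment_terms_days: int

lead = SalesCustomer("c1", "Acme Ltd", "ops@acme.example", "referral")
payer = BillingCustomer("c1", "1 High St", 30)
```

In a monolithic database both views are typically flattened into one shared Customer table; the context-specific models show where the seams for decomposition actually are.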

This can look quite interesting as we look across business domains, where some generic domain services like Customer, Document, etc. emerge as repeating themes. Let's take note of these common services

Once we have a domain model to support business domains and the contexts within them, we can start to map it back to the monolith. This requires working with an ERD or database model to map the domain models onto. Notice how this is “cloning” and “mapping” rather than breaking an existing database

Mapping the domain model to the data model requires looking at the various contexts, domain entities, and attributes, and finding where they are managed in the monolithic database
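This mapping exercise can be recorded as a simple lookup from context and domain attribute to the monolith table and column that currently manages it. A sketch with invented table and column names:

```python
# (context, domain attribute) -> (monolith table, column); all names hypothetical
domain_to_monolith = {
    ("Sales", "Customer.contact_email"): ("CUST_MASTER", "EMAIL"),
    ("Billing", "Customer.billing_address"): ("CUST_MASTER", "ADDR_LINE_1"),
    ("Billing", "Customer.payment_terms_days"): ("CUST_MASTER", "PAY_TERMS"),
}

def tables_touched_by(context):
    """Which monolith tables does a given bounded context depend on?"""
    return {
        table
        for (ctx, _attr), (table, _col) in domain_to_monolith.items()
        if ctx == context
    }
```

Queries like `tables_touched_by("Billing")` reveal which tables a context depends on, and, read the other way, which contexts are coupled through a single shared table.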
What comes after domain model and mapping?
Our monolith decomposition journey has only just started now that we have a good design to build from. We can take this design and start implementing changes incrementally using patterns like the Strangler Fig. All change should be test driven, and we ought to be executing tests before starting to break things apart!
My last note on monolith decomposition: we need to follow Domain Driven Design (DDD) with domain-driven tests, which help us ensure consistency while we make fundamental changes
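One shape such a domain-driven test can take is an invariant check that the legacy and new stores agree on a migrated entity while migration is in flight. A hedged sketch, with dictionaries standing in for the two stores:

```python
def assert_customer_consistent(customer_id, legacy_store, new_store):
    """Fail loudly if the two stores disagree on a migrated customer."""
    legacy = legacy_store[customer_id]
    migrated = new_store[customer_id]
    assert legacy["name"] == migrated["name"], (
        f"Customer {customer_id} diverged during migration"
    )

# Consistent stores pass silently
legacy_store = {"c42": {"name": "Acme"}}
new_store = {"c42": {"name": "Acme"}}
assert_customer_consistent("c42", legacy_store, new_store)
```

Running checks like this continuously during the Strangler Fig rollout is what makes "test before you break things apart" actionable rather than aspirational.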
Summary
Monolith to microservices is a journey of software re-engineering motivated by accelerating change and giving autonomy to engineering and business teams. Monolithic databases can drive interesting band-aid solutions which add to complexity, slow down business agility, and lead to large, complex transformations
The key to escaping this cycle is to start top-down, from first principles, and lead with business capabilities along business lines. I like to ask: “What if this were a 3rd-party software which we consumed to run the business? How would that engineering team model its data store and integrate?”
I hope this post helps you on your journey of refactoring your monolithic database. I am keen to hear your stories in this space
Good article. I feel like a bit more details could be provided on the actual migration journey after the DDD model and mapping (including options of breaking up the database into smaller partitions)
Thank you Alex, great topic for the next post! I will say the migration journey starts with tests & lots of fun workshops to orient ourselves to the outside-in!