Single Entity, Multiple Masters: Managing Complexity with the Domain Services Integration Pattern

We often think of enterprise systems as the master systems for one or more enterprise data entities, with the lifecycle of each entity contained and managed by users within a single system in an encapsulated, convenient manner. However, there are scenarios where multiple systems master an entity. This is often because each system does part of the lifecycle well: business users like doing part of the work in one system, moving the entity to another system to complete the next part, then moving it on again for the rest, and so on.

We assume data mastering always lives in one system, but it can be spread across multiple systems

In this post we examine how entities mastered across multiple systems can drive complex integrations, and why simple cross-system synchronisation integrations or operational data store solutions do not resolve this complexity. We explore how state management, alongside a unified view of the data, can be the key.

We look at a concrete example: an “application” object with a rich lifecycle and state transitions, mastered across multiple systems of record.

“Application Management”: The canonical use case for an entity with a complex lifecycle

Applications are what we, as customers, submit for university places, insurance, loans and more, and then wait for a business and its people to process: to enrol us, pay us or provision something for us. An application is a request for something; it starts with a form and lives a rich life within the processing business or organisation.

Consider the example below, where an application moves from a “draft” state through “submitted” all the way to the “closed” or “cancelled” states. This hypothetical application transitions between states on actions such as submission, assignment, cancellation and payment, as indicated on the lines.

Sample application lifecycle and transition actions
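The lifecycle above can be sketched as a small state machine. The state names and transition actions below are illustrative stand-ins for the diagram, not any real system's API:

```python
# Minimal sketch of the hypothetical application lifecycle as a state
# machine. (state, action) pairs map to the next allowed state; anything
# else is an invalid transition.
ALLOWED_TRANSITIONS = {
    ("draft", "submit"): "submitted",
    ("submitted", "assign"): "in_review",
    ("in_review", "approve"): "approved",
    ("approved", "pay"): "closed",
    ("submitted", "cancel"): "cancelled",
    ("in_review", "cancel"): "cancelled",
}

class Application:
    def __init__(self):
        self.state = "draft"

    def apply_action(self, action: str) -> str:
        """Transition to the next state, or raise if the action is invalid here."""
        key = (self.state, action)
        if key not in ALLOWED_TRANSITIONS:
            raise ValueError(f"Cannot '{action}' while in state '{self.state}'")
        self.state = ALLOWED_TRANSITIONS[key]
        return self.state
```

When a single system masters the whole lifecycle, this table lives in one place; the trouble starts when different slices of it are enforced by different systems.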

Our simplest use case is when this lifecycle is managed in a single application, as shown below. Here a business user would simply log into the core system and manage the application.

Simple use case with one system mastering the “full lifecycle” of our application entity

Use Case: Multiple Masters managing the lifecycle of a single entity

In our case, we will imagine the entity is mastered across multiple systems, as shown below. Often the ask is to build enterprise integration to facilitate the automation internally …

Multiple master systems for Application object

…we are asked to build “cross-system synchronisations” to move the “application” entity as its state changes

Enterprise Integration ask

Challenge: Integration Complexity

The above ask often seems simple, until we need to expand to other use cases where other internal systems, digital channel applications like portals and mobile apps, or even partner systems (B2B or M&A) need the “application data”.

How do we serve new use cases and contexts?

These new integrations to multiple master systems are point-to-point and tactical – something we tend to avoid as good system integrators.

Using Point-to-point integrations increases integration complexity

First approach at a solution: Use a common service

The point-to-point problem is often remediated with a service-and-layer approach similar to SOA-style architecture. This ensures a single entry point and reduces the number of connection points; however, it lacks a consolidated view of the object, and complex state-based routing ends up baked into the enterprise, process or services layers.

This works, but it struggles in a few areas, such as performance and the ability to change independently – these solutions are simply not fast enough with all the routing, and serious regression testing is needed whenever things change.

Avoiding point-to-point and integration complexity with layers as in SOA style architectures

Second approach at a solution: Avoid layers

Our next approach is to build a domain service. Here the business domain is what the application is used for – Claims, Student Application, Loan – and the key is that this service manages state and provides a service by combining downstream technical system views into a more business-oriented view.

Use a domain service to “collate” the application object and business logic smeared across the enterprise into a common service

In the domain service we surface the entire lifecycle of the application object by listening to lifecycle events from the downstream systems, maintaining a local model of the application, and updating that model through queries to the downstream systems when events arrive. The solution provides high availability by compromising on read consistency. For writes – i.e. requests to change application data or trigger application state transitions from digital, CRM, partner etc. – we either batch the commands or send them straight to the system of record.
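The event-driven refresh loop might look like the following sketch. It assumes downstream systems publish lifecycle events and expose query APIs; `fetch_from_system`, the event shape, and the store are illustrative stand-ins:

```python
# Domain service refresh loop: on each downstream lifecycle event, query
# the source system and merge the result into a locally held model.
local_store: dict[str, dict] = {}  # application_id -> latest known model

def fetch_from_system(system: str, application_id: str) -> dict:
    # Stand-in for a query to the downstream system of record; a real
    # implementation would call that system's query API.
    return {"application_id": application_id, "last_source": system}

def on_lifecycle_event(event: dict) -> None:
    """Refresh the local model when a downstream system reports a change."""
    app_id = event["application_id"]
    fresh = fetch_from_system(event["source_system"], app_id)
    model = local_store.setdefault(app_id, {})
    model.update(fresh)              # merge fresh attributes into the local view
    model["state"] = event["state"]  # track lifecycle state locally
```

The local store is what lets the domain service answer reads without fanning out to every master on each request.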

Now, when upstream consumers want an integrated answer, it is maintained in the domain service and built to be highly available at the cost of some consistency (the application state might be slightly stale due to delayed consistency).

Also, when upstream consumers want to build new features from new application attributes, change is simpler: if all the attributes already live in the domain service nothing else has to move, and if not, the model can easily be expanded by pulling more data from the downstream systems.
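The read path can then be a simple lookup on the local model. This sketch adds an `as_of` timestamp so callers can reason about the staleness the post describes; the store layout is an assumption:

```python
# Illustrative read path: consumers query the domain service's local
# model rather than the downstream masters, accepting delayed consistency.
from datetime import datetime, timezone

def get_application(app_id: str, store: dict) -> dict:
    """Serve the locally held view; `as_of` may lag the system of record."""
    record = store.get(app_id)
    if record is None:
        raise KeyError(f"Unknown application '{app_id}'")
    return {"data": record["model"], "as_of": record["refreshed_at"]}

# Example store entry, as the event listeners would maintain it
store = {
    "A1": {
        "model": {"state": "submitted", "applicant": "Jane"},
        "refreshed_at": datetime.now(timezone.utc),
    }
}
answer = get_application("A1", store)
```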


Data mastering is quite often simple, with a 1-1 mapping between systems and data objects. However, there are scenarios where one organisational data entity and its lifecycle are mapped across multiple systems and managed partially in each. These scenarios create complex integrations when sharing data across the master systems, and the complexity grows when expanding to other use cases like CRM, digital channels and external partners.

Using a domain service architectural style to build an operational data store and manage state locally can bring a single data view for distributed objects, achieved through downstream integrations and a local datastore holding the object.

We give up real-time read consistency in exchange for delayed consistency and high availability, and for use cases like “Claims application management”, “Student application” and “Loan application” we can separate Commands from Queries to build a highly performant, nimble and changeable solution.
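The command side of that split can be sketched as follows, covering both of the write options mentioned earlier (batch the commands, or send them straight to the system of record). The queue and dispatch mechanics are assumptions for illustration:

```python
# Sketch of the write path in a command/query split: queries hit the
# local model, while commands are forwarded (or batched) to the master.
from collections import deque

command_queue: deque = deque()   # commands awaiting a batched dispatch
dispatched: list = []            # stand-in for calls made to the master system

def dispatch_to_master(command: dict) -> None:
    # Stand-in for forwarding the command to the system of record.
    dispatched.append(command)

def submit_command(app_id: str, action: str, immediate: bool = False) -> None:
    """Accept a write: send it downstream now, or batch it for later."""
    command = {"application_id": app_id, "action": action}
    if immediate:
        dispatch_to_master(command)
    else:
        command_queue.append(command)

def flush_batch() -> None:
    """Send all batched commands downstream in one go."""
    while command_queue:
        dispatch_to_master(command_queue.popleft())
```

Keeping writes pointed at the systems of record is what preserves their mastery; the domain service only ever caches and collates.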
