
Data Model/Schema decoupling in a Data Processing Pipeline using Event-Driven Architecture

gaurav asked 3 weeks ago

I was wondering how microservices in a streaming pipeline based on Event-Driven Architecture can be truly decoupled from the data-model perspective. We have implemented a data processing pipeline using Event-Driven Architecture in which the data model is critical. Although all the microservices are decoupled from the business perspective, they are not truly decoupled, because the data model is shared across all the services.

In the ingestion pipeline, we collect data from multiple sources, each with its own data model. A normalizer microservice is therefore required to map those source models to a common data model that downstream consumers can use. The challenge is that the common data model can change for any reason, and we need to be able to manage that change easily. As it stands, such a change can break the consumer applications and easily cascade into modifications across all the microservices.
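To make the coupling concrete, here is a minimal sketch of the kind of normalizer we have in mind (all names such as SourceAOrder, CommonOrder, and normalize_source_a are hypothetical and only illustrate the pattern, not our actual code):

```python
from dataclasses import dataclass

# --- Source-specific models (one per ingested system; names are hypothetical) ---
@dataclass
class SourceAOrder:
    order_ref: str
    amount_cents: int

@dataclass
class SourceBOrder:
    id: str
    total: float
    currency: str

# --- Common model shared by the normalizer and all downstream consumers ---
@dataclass
class CommonOrder:
    order_id: str
    amount: float
    currency: str

# --- Normalizer logic: map each source model onto the common one ---
def normalize_source_a(event: SourceAOrder) -> CommonOrder:
    return CommonOrder(order_id=event.order_ref,
                       amount=event.amount_cents / 100.0,
                       currency="USD")  # assumption: source A always reports USD

def normalize_source_b(event: SourceBOrder) -> CommonOrder:
    return CommonOrder(order_id=event.id,
                       amount=event.total,
                       currency=event.currency)

if __name__ == "__main__":
    # Any field added to, renamed in, or removed from CommonOrder ripples into
    # every normalizer function and every consumer that deserializes it.
    print(normalize_source_a(SourceAOrder(order_ref="A-1", amount_cents=1999)))
    print(normalize_source_b(SourceBOrder(id="B-7", total=5.25, currency="EUR")))
```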

Is there any solution or technology that can truly decouple microservices in this scenario?

1 Answer
Best Answer
naveen answered 3 weeks ago