Architecture, Design and Code Templates for Big Data Environment

NatCo of a Leading European Telecom

Business Challenge

Our client, a NatCo of a leading European telecom, decided to implement a big data environment. Our task was to provide the know-how, services, and components that the client's product teams needed to build efficient and resilient data pipelines and big data integrations.


Our Approach

We made conceptual and technical knowledge and DevOps practices comprehensive and accessible to the client's product teams. We provided them with best practices for developing data pipelines and big data integrations: data tracking and lineage, federated development, error handling and resubmission, streaming and event-related topics, CI/CD improvements, monitoring, reprocessing patterns, and security. We evaluated emerging technologies and their alignment with the client's group-level technology guidelines, and we assessed and recommended functionalities to migrate from on-premises infrastructure to the cloud (AWS).
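To illustrate one of the error-handling and resubmission patterns mentioned above, the sketch below shows a minimal retry wrapper with exponential backoff for a flaky pipeline step. This is an illustrative example only, not the client's actual implementation; the names `with_retries` and `flaky_extract` are hypothetical.

```python
import time

def with_retries(task, max_attempts=3, base_delay=0.1):
    """Run a pipeline task, retrying transient failures with exponential backoff.

    If all attempts fail, the last exception is re-raised so the failed
    record or job can be surfaced for manual resubmission.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # retries exhausted: escalate for resubmission
            time.sleep(base_delay * 2 ** (attempt - 1))

# Hypothetical flaky extract step that succeeds on the third attempt.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source outage")
    return ["record-1", "record-2"]

result = with_retries(flaky_extract)
```

In a real deployment, the same pattern would typically distinguish transient errors (retry) from permanent ones (route to a dead-letter queue for later reprocessing).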



Results

Product teams were upskilled and data pipeline development was enhanced. We designed reusable self-service components to speed up further development by the client's product teams, and we tested and documented technologies in line with the group-level guidelines. Best-practice architecture standards and principles were implemented, and the big data environment enabled a range of analytics use cases.