Dataphos Components and Features

Data Infrastructure Map

Dataphos Components


Real-Time Business Object Publisher

Based on the Create-Transform-Publish-at-source pattern.
Uses the RDBMS to transform data at source.
Data is formatted, serialized, compressed and encrypted, significantly reducing network traffic.

Strong security: both public internet and private networks can be used.

Schema Registry

Schema Registry Validator

Validates message schemas; enables schema registration, versioning and recovery.

Interprets the incoming data flow and adjusts the schema automatically, recategorizing previously accumulated data under the new schema. All previous schema versions are stored, so restoring an earlier version is possible.

Supports both binary (Avro, Protobuf) and text formats (JSON, XML, CSV).
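To make the validator's role concrete, here is a minimal stand-in that checks a JSON message for required fields and types using only the standard library. The `schema` map and field names are illustrative, not an actual Dataphos schema definition:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// schema maps required field names to their expected JSON type
// ("string", "number", "bool") - a stand-in for a real registry entry.
var schema = map[string]string{
	"id":   "number",
	"name": "string",
}

// validate checks a JSON message against the schema above.
func validate(msg []byte) error {
	var doc map[string]any
	if err := json.Unmarshal(msg, &doc); err != nil {
		return fmt.Errorf("not valid JSON: %w", err)
	}
	for field, want := range schema {
		v, ok := doc[field]
		if !ok {
			return fmt.Errorf("missing required field %q", field)
		}
		var got string
		switch v.(type) {
		case string:
			got = "string"
		case float64: // encoding/json decodes all JSON numbers as float64
			got = "number"
		case bool:
			got = "bool"
		default:
			got = "other"
		}
		if got != want {
			return fmt.Errorf("field %q: expected %s, got %s", field, want, got)
		}
	}
	return nil
}

func main() {
	fmt.Println(validate([]byte(`{"id": 1, "name": "order"}`))) // <nil>
	fmt.Println(validate([]byte(`{"id": "oops", "name": "x"}`))) // type error reported
}
```

A production registry would validate against formal schema languages such as JSON Schema or Avro, but the reject-early principle is the same.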


Data Lake Builder

Permanently stores all data in its original format, making it available for subsequent processing and eliminating the risk of data loss. The ingestion process is fail-safe.

Versioning with a full structural history of ingested data; the data lake build is automated and versioned.

An API for querying the indexing engine enables resubmitting and replaying permanently stored messages. Depending on the message broker, different read options are supported: push, pull and streaming.

Cached API

Serves data to consumers via a REST API endpoint.

Uses an embedded in-memory system to serve the most recent data to consumers. The underlying database is queried only when the requested dataset cannot be served from the cache.
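This cache-first lookup is the classic cache-aside pattern. A minimal in-memory sketch follows; the map-backed cache and the `queryDB` stub are illustrative, not the embedded system Dataphos actually uses:

```go
package main

import (
	"fmt"
	"sync"
)

// CachedAPI serves reads from an in-memory cache and falls back to a
// slower backing store only on a miss (cache-aside pattern).
type CachedAPI struct {
	mu    sync.RWMutex
	cache map[string]string
	// queryDB stands in for the underlying database lookup.
	queryDB func(key string) (string, bool)
}

func (c *CachedAPI) Get(key string) (string, bool) {
	c.mu.RLock()
	v, ok := c.cache[key]
	c.mu.RUnlock()
	if ok {
		return v, true // served from cache, no database round trip
	}
	// Miss: query the database, then populate the cache for next time.
	v, ok = c.queryDB(key)
	if !ok {
		return "", false
	}
	c.mu.Lock()
	c.cache[key] = v
	c.mu.Unlock()
	return v, true
}

func main() {
	db := map[string]string{"order:1": `{"id":1,"status":"shipped"}`}
	api := &CachedAPI{
		cache:   map[string]string{},
		queryDB: func(k string) (string, bool) { v, ok := db[k]; return v, ok },
	}
	v, _ := api.Get("order:1") // first call hits the database
	fmt.Println(v)
	v, _ = api.Get("order:1") // second call is served from the cache
	fmt.Println(v)
}
```

A real deployment also needs an invalidation or refresh strategy so the cache keeps reflecting the most recent data.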

Data Mart Loader

Creates an analytic data model in an automated fashion.

Implements standard data mart builder patterns: slowly changing dimensions (Type 1, 2 and 7) and facts.
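As a quick reminder of what the dimension types mean: Type 1 overwrites an attribute in place (no history), Type 2 keeps a full row history with a current-row flag, and Type 7 combines both views. The two basic behaviors can be sketched in memory; the `DimRow` schema and function names here are hypothetical, not the loader's actual model:

```go
package main

import "fmt"

// DimRow is one row of a slowly changing dimension (illustrative schema).
type DimRow struct {
	ID      int    // natural key of the dimension member
	City    string // tracked attribute
	Current bool   // Type 2 current-row flag
}

// applyType1 overwrites the attribute in place: no history is kept.
func applyType1(rows []DimRow, id int, city string) []DimRow {
	for i := range rows {
		if rows[i].ID == id {
			rows[i].City = city
		}
	}
	return rows
}

// applyType2 closes the current row and appends a new current row,
// preserving the full change history.
func applyType2(rows []DimRow, id int, city string) []DimRow {
	for i := range rows {
		if rows[i].ID == id && rows[i].Current {
			rows[i].Current = false
		}
	}
	return append(rows, DimRow{ID: id, City: city, Current: true})
}

func main() {
	rows := []DimRow{{ID: 1, City: "Zagreb", Current: true}}
	rows = applyType2(rows, 1, "Split")
	fmt.Println(len(rows)) // 2: the old row is kept, flagged non-current
}
```

In a real warehouse each Type 2 version would also get its own surrogate key and effective-date columns; this sketch keeps only the current-row flag for brevity.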

Management Console

Stand-alone management console used for configuring and deploying the platform via a UI.

Monitors performance and provides a product health dashboard with detailed product metrics and logs.

Architectural Paradigms and Technology

A unique low-code, true real-time data platform, written in the Go programming language.

Data mesh and event mesh architectural paradigms, including domain-driven data design, data-as-a-product and federated governance concepts. Fully compatible with decentralized data platform architectures.

Event-driven architecture and streaming data processing, avoiding complex orchestration.

Promotes and implements best data ingestion and data management patterns.

Built as a set of ready-to-use components designed as microservices with connectors and plugins – decoupling functionality whilst providing seamless data flow.

Enables monitoring and observability of the health and performance of deployed data pipelines and data flows.

Data from the source is available on the platform immediately, in real time, as consumable business objects transformed at source.

Legacy source systems are no longer an obstacle: the new target architecture is fully decoupled from the source systems.