Dataphos Components and Features

Dataphos in an Event-Driven Data Pipeline

[Diagram: generic data pipeline]

Dataphos Components

Publisher

Real-Time Business Object Publisher

Based on the Create-Transform-Publish-at-source pattern: uses the source RDBMS to transform data into structured business objects directly at the source.

Data is formatted, serialized, compressed, and encrypted, significantly reducing network traffic.

Strong security: both the public internet and private networks can be used.
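
To make the format-serialize-compress-encrypt flow concrete, here is a minimal Go sketch (the platform itself is written in Go). The BusinessObject shape, the demo key, and the JSON serialization are assumptions for the example; the Publisher's actual wire format and broker client are not shown.

```go
// Minimal sketch of a format-serialize-compress-encrypt stage.
// All type and field names here are hypothetical.
package main

import (
	"bytes"
	"compress/gzip"
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"encoding/json"
	"fmt"
	"io"
	"log"
)

// BusinessObject stands in for a structured object assembled at the
// source, e.g. by a SQL query joining several RDBMS tables.
type BusinessObject struct {
	OrderID  string  `json:"order_id"`
	Customer string  `json:"customer"`
	Total    float64 `json:"total"`
}

// encode serializes, compresses, and encrypts one business object.
func encode(obj BusinessObject, key []byte) ([]byte, error) {
	// 1. Serialize (JSON here; a binary format like Avro also works).
	payload, err := json.Marshal(obj)
	if err != nil {
		return nil, err
	}

	// 2. Compress to cut network traffic.
	var buf bytes.Buffer
	zw := gzip.NewWriter(&buf)
	if _, err := zw.Write(payload); err != nil {
		return nil, err
	}
	if err := zw.Close(); err != nil {
		return nil, err
	}

	// 3. Encrypt with AES-GCM so public networks can be used safely.
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := io.ReadFull(rand.Reader, nonce); err != nil {
		return nil, err
	}
	return gcm.Seal(nonce, nonce, buf.Bytes(), nil), nil
}

func main() {
	key := make([]byte, 32) // demo key; use a managed secret in practice
	obj := BusinessObject{OrderID: "o-1", Customer: "acme", Total: 99.5}
	msg, err := encode(obj, key)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("publishing %d encrypted bytes to the broker\n", len(msg))
}
```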

Schema Registry

Schema Registry Validator

Validates message schemas; enables schema registration, versioning, and recovery.

Interprets the incoming data flow and adjusts the schema automatically, recategorizing previously accumulated data under the new schema. All previous schema versions are stored, so any earlier version can be restored.

Supports both binary (Avro, Protobuf) and text (JSON, XML, CSV) formats.
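
As an illustration of the validate-against-registered-schema flow, here is a toy Go sketch assuming a registry that stores every schema version as a set of required field names. Real schema handling (Avro, Protobuf, JSON Schema) is richer; all names here are hypothetical.

```go
// Illustrative sketch: validate an incoming message against one
// registered schema version; every version stays in the registry.
package main

import (
	"encoding/json"
	"fmt"
)

// registry keeps all schema versions per topic, so older versions
// remain restorable; a version is a set of required fields here.
var registry = map[string]map[int][]string{
	"orders": {
		1: {"order_id", "total"},
		2: {"order_id", "total", "currency"}, // evolved schema
	},
}

// validate checks an incoming JSON message against a schema version.
func validate(topic string, version int, msg []byte) error {
	required, ok := registry[topic][version]
	if !ok {
		return fmt.Errorf("no schema %s v%d registered", topic, version)
	}
	var doc map[string]any
	if err := json.Unmarshal(msg, &doc); err != nil {
		return fmt.Errorf("not valid JSON: %w", err)
	}
	for _, f := range required {
		if _, present := doc[f]; !present {
			return fmt.Errorf("missing required field %q (schema v%d)", f, version)
		}
	}
	return nil
}

func main() {
	msg := []byte(`{"order_id":"o-1","total":99.5}`)
	fmt.Println("v1:", validate("orders", 1, msg)) // passes
	fmt.Println("v2:", validate("orders", 2, msg)) // fails: no currency
}
```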

Persistor

Data Lake Builder

Permanently stores all data in its original format, making it available for subsequent processing and eliminating the risk of data loss; acts as a fail-safe for the ingestion process.

Versioning with a full structural history of ingested data. Automated, versioned data lake build.

Built-in indexing engine for tracing a message's location across the data lake.

Resubmits and replays permanently stored messages.
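
A minimal Go sketch of the persist-index-replay idea follows, with in-memory maps standing in for object storage and the index; the bucket/path layout and type names are hypothetical.

```go
// Sketch: every raw message is kept unchanged and an index records
// where it landed, so it can be located and resubmitted later.
package main

import "fmt"

type location struct {
	bucket string // e.g. a data lake container
	path   string // e.g. topic/msgID partitioned layout
}

type Persistor struct {
	blobs map[location][]byte // stands in for object storage
	index map[string]location // messageID -> where it was stored
}

func NewPersistor() *Persistor {
	return &Persistor{blobs: map[location][]byte{}, index: map[string]location{}}
}

// Persist writes the raw payload unchanged and records it in the index.
func (p *Persistor) Persist(topic, msgID string, payload []byte) {
	loc := location{bucket: "lake", path: topic + "/" + msgID}
	p.blobs[loc] = payload
	p.index[msgID] = loc
}

// Replay looks a message up via the index and hands the original
// payload back for resubmission to the broker.
func (p *Persistor) Replay(msgID string) ([]byte, error) {
	loc, ok := p.index[msgID]
	if !ok {
		return nil, fmt.Errorf("message %s not found in index", msgID)
	}
	return p.blobs[loc], nil
}

func main() {
	p := NewPersistor()
	p.Persist("orders", "m-42", []byte(`{"order_id":"o-1"}`))
	raw, err := p.Replay("m-42")
	fmt.Println(string(raw), err)
}
```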

Cached API

Serves data to consumers via a REST API endpoint.

Uses an embedded in-memory system to get the most recent data to consumers; it queries the underlying database only when the requested dataset cannot be served from the cache.
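
This is the classic cache-aside read path. A minimal Go sketch, with queryDatabase as a hypothetical stand-in for the real database query:

```go
// Cache-aside: serve from the in-memory cache on a hit, fall back to
// the database on a miss, then populate the cache with a TTL.
package main

import (
	"fmt"
	"sync"
	"time"
)

type entry struct {
	value   string
	expires time.Time
}

type CachedAPI struct {
	mu    sync.RWMutex
	cache map[string]entry
	ttl   time.Duration
}

func (c *CachedAPI) Get(key string) string {
	c.mu.RLock()
	e, ok := c.cache[key]
	c.mu.RUnlock()
	if ok && time.Now().Before(e.expires) {
		return e.value // cache hit: no database round trip
	}
	// Cache miss or stale entry: query the underlying database.
	v := queryDatabase(key)
	c.mu.Lock()
	c.cache[key] = entry{value: v, expires: time.Now().Add(c.ttl)}
	c.mu.Unlock()
	return v
}

func queryDatabase(key string) string {
	return "row-for-" + key // placeholder for a real SQL query
}

func main() {
	api := &CachedAPI{cache: map[string]entry{}, ttl: time.Minute}
	fmt.Println(api.Get("customer:7")) // miss -> database
	fmt.Println(api.Get("customer:7")) // hit  -> cache
}
```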

Data Mart Loader

Creates an analytic data model in an automated fashion.

Data mart builder patterns: different types of slowly changing dimensions (Type 1, 2, and 7) and facts.
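
The loader's implementation isn't shown here, so the following is a toy Go sketch of what Type 1 (overwrite in place) and Type 2 (versioned history) dimension handling mean; Type 7 combines a Type 2 history with a current-value lookup. The dimRow shape is hypothetical.

```go
// Toy illustration of slowly changing dimension patterns.
package main

import (
	"fmt"
	"time"
)

type dimRow struct {
	Key  string
	City string
	From time.Time
	To   *time.Time // nil = current version
}

// applyType1 overwrites history: the current row gets the new value.
func applyType1(rows []dimRow, key, city string) []dimRow {
	for i := range rows {
		if rows[i].Key == key && rows[i].To == nil {
			rows[i].City = city
		}
	}
	return rows
}

// applyType2 preserves history: end-date the current row, append a new one.
func applyType2(rows []dimRow, key, city string, now time.Time) []dimRow {
	for i := range rows {
		if rows[i].Key == key && rows[i].To == nil {
			end := now
			rows[i].To = &end
		}
	}
	return append(rows, dimRow{Key: key, City: city, From: now})
}

func main() {
	now := time.Now()
	rows := []dimRow{{Key: "c-1", City: "Zagreb", From: now}}
	rows = applyType2(rows, "c-1", "Split", now) // history kept
	rows = applyType1(rows, "c-1", "Rijeka")     // current row overwritten
	for _, r := range rows {
		fmt.Printf("%+v\n", r)
	}
}
```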

Management Console

Stand-alone management console used for configuring and deploying the platform via a UI.

Monitors performance; provides a product health dashboard with detailed product metrics and logs.

Architectural Paradigms and Technology

A unique low-code, true real-time data platform, written in the Go programming language.

Data mesh and event mesh architectural paradigms, including domain-driven data design, data-as-a-product, and federated governance concepts. Fully compatible with decentralized data platform architectures.

Event-driven architecture and streaming data processing, avoiding complex orchestration.

Enables monitoring and observability of the health and performance of deployed data pipelines and data flows.

Promotes and implements best-practice data ingestion and data management patterns.

Built as a set of ready-to-use components designed as microservices with connectors and plugins, decoupling functionality while providing seamless data flow.

Data from the source is available on the platform immediately, in real time, as consumable business objects transformed at the source.

Legacy source systems are no longer an obstacle: the new target architecture is fully decoupled from the source systems.
