Interview: Nikola Sever, Principal Consultant at Syntio, CroPC.net

We spoke with Nikola Sever, Principal Consultant at Syntio, a Data Engineering company, about the Data Innovation Summit, the largest annual Data & AI conference recently held in Stockholm.

You recently gave a lecture at the Data Innovation Summit, which took place on the 11th and 12th of May. Can you tell us more about that conference?

The Data Innovation Summit is the largest annual conference primarily focused on Data & AI, the fields our company works in. The conference lasted two days and drew over 2,000 participants, including more than 300 speakers from Data & AI backgrounds. Keynote speakers came from Google, Twitter, Snowflake, Databricks, and other companies that are significant players in the Data & AI world. We are proud to have stood alongside them and represented Syntio from two main perspectives: consulting, where we execute digital transformation for large enterprise companies, and Dataphos, our digital transformation platform. Our booth also attracted many interested visitors, and judging by the reactions and comments we received, we garnered a lot of attention.

What did you talk about in your presentation titled “Data Mesh for event-driven applications”?

The focus of my presentation was the paradigm of ‘Event Mesh.’ Since the term ‘Event Mesh’ is still unfamiliar to many people, I didn’t want to include it in the presentation’s title; I envisioned it as a plot twist. First, I briefly explained what an ‘Event Mesh’ is: an architectural paradigm that routes events, messages, and similar entities to their destinations. Every application generates events (e.g., adding items to a shopping cart, purchasing an item, searching for specific terms, shipping goods, etc.), so it is essential to know how to successfully route all this information from the source to wherever it will generate business value. It is important to note that the sources of generated events and their destinations are unaware of each other, and the applications involved, whether in the cloud or in a data center, are often physically separated. An ‘Event Mesh’ bridges precisely this gap between on-premise systems and cloud integrations. In my presentation, I wanted to emphasize that, in our opinion, the ‘Data Mesh’ paradigm can currently be implemented only with the help of an ‘Event Mesh.’
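To make the routing idea concrete, here is a minimal, self-contained Python sketch of mesh-style decoupling. It is not part of Dataphos or any specific broker, and the event types and destinations are illustrative assumptions: producers only emit typed events, and a router decides which destinations receive each type, so source and destination never reference each other.

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class Event:
    event_type: str          # e.g. "cart.item_added", "order.purchased"
    payload: dict[str, Any]  # arbitrary business data


class EventRouter:
    """Toy 'event mesh' node: producers and consumers only know event types;
    the router owns the mapping from event type to destinations."""

    def __init__(self) -> None:
        self._routes: dict[str, list[Callable[[Event], None]]] = defaultdict(list)

    def subscribe(self, event_type: str, destination: Callable[[Event], None]) -> None:
        self._routes[event_type].append(destination)

    def publish(self, event: Event) -> None:
        # Fan the event out to every registered destination for its type.
        for destination in self._routes.get(event.event_type, []):
            destination(event)


# Illustrative destinations: a cloud data lake sink and an on-prem warehouse sink.
router = EventRouter()
router.subscribe("order.purchased", lambda e: print("data lake <-", e.payload))
router.subscribe("order.purchased", lambda e: print("warehouse <-", e.payload))

# A web shop emits an event without knowing who consumes it.
router.publish(Event("order.purchased", {"order_id": 42, "amount_eur": 19.99}))
```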

What is the reason for that?

The reason is that large traditional systems contain SaaS applications over which they have no control. They also likely contain some cloud infrastructure, for example a ‘Data Lake’ as the central data storage location. And we must not forget traditional on-premise systems such as SAS and the data warehouse, which can be implemented on, for example, Oracle Exadata or IBM Netezza. Then we come to the main challenge: how to connect all these systems, which are geographically separated and “do not speak the same language,” meaning they do not generate events in the same formats. This is where the ‘Event Mesh’ comes in. It acts as a bridge, using “glue” such as one or more event brokers, and routes events from the source to the destination. I would like to point out that when you search for the term ‘Event Mesh,’ you will mostly come across results from Solace, a company that offers its own event broker. We at Syntio believe that currently only Kafka satisfies the majority of the functional needs for a successful ‘Event Mesh.’ Why only the majority? Because the ‘Event Mesh’ and ‘Data Mesh’ paradigms are relatively new, and there are currently no solutions on the market that cover all aspects of a successful ‘Event Mesh.’
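As a small illustration of Kafka acting as that “glue” between systems that do not speak the same language, the sketch below (my own simplification, not Syntio’s implementation) consumes raw events from a hypothetical on-premise topic, normalizes them into one canonical shape, and republishes them to a topic a cloud-side consumer can read. Topic and field names are assumptions; it requires the kafka-python package and a reachable broker.

```python
import json

from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python

# Hypothetical topic names: raw on-prem events in, canonical mesh events out.
SOURCE_TOPIC = "onprem.orders.raw"
TARGET_TOPIC = "mesh.orders.canonical"

consumer = KafkaConsumer(
    SOURCE_TOPIC,
    bootstrap_servers="broker:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)
producer = KafkaProducer(
    bootstrap_servers="broker:9092",
    value_serializer=lambda d: json.dumps(d).encode("utf-8"),
)


def to_canonical(raw: dict) -> dict:
    """Translate one source-specific record into the shared event shape."""
    return {
        "event_type": "order.purchased",
        "order_id": raw.get("ORDER_ID"),
        "amount_eur": raw.get("TOTAL"),
    }


# Read the on-prem stream indefinitely and forward normalized events.
for message in consumer:
    producer.send(TARGET_TOPIC, to_canonical(message.value))
    # send() is asynchronous; call producer.flush() on shutdown to drain buffers.
```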

Data Mesh is part of your Dataphos platform – what can you tell us about it?

Dataphos is our response to the problems we have encountered in our daily work with clients. Typical data platforms are highly centralized and complex, making them extremely difficult to change or adapt, which results in significant management, maintenance, and licensing costs. We were firmly convinced there must be a better and simpler solution, so we decided to offer one at Syntio. We aim to simplify things so that organizations can change or customize their data platform to meet their business needs while reducing costs. When we considered what our clients would need in the near future, we realized there was still no real-time data platform available as a ready-made product, so we seized the opportunity and created one. We developed a system consisting of multiple components that form a seamless whole.

How does your system work?

It can retrieve data from existing or completely new sources in real time, while detecting and reporting any changes in data structures to avoid dependencies within systems. All of this data ends up in the data lake, where it is ready for consumption by the data science team or other transactional systems. Our platform covers the majority of today’s organizational data needs, it is straightforward to implement and maintain, and it is free to use. A key component is the module for tracking changes in data structures, which lets you monitor how your data structures evolve. We consider it crucial because it simplifies integration management, reduces vulnerabilities and the number of errors between systems, and proactively notifies you of any change the moment it occurs. Managing dependencies between systems has become an increasingly significant problem, and it is one we definitely wanted to address.
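The schema-tracking idea can be illustrated with a small sketch. This is a simplified stand-in rather than the actual Dataphos component, and the class, source, and field names are hypothetical: it remembers the field set last seen per source and reports added or removed fields the moment a new record arrives.

```python
from typing import Any


class SchemaWatcher:
    """Minimal illustration of schema-change detection:
    remember each source's field set and report drift as records arrive."""

    def __init__(self) -> None:
        self._known: dict[str, set[str]] = {}

    def observe(self, source: str, record: dict[str, Any]) -> list[str]:
        fields = set(record)
        # First record from a source establishes the baseline structure.
        baseline = self._known.setdefault(source, fields)
        changes = [f"{source}: new field '{name}'" for name in sorted(fields - baseline)]
        changes += [f"{source}: field '{name}' disappeared" for name in sorted(baseline - fields)]
        self._known[source] = fields  # adopt the latest structure as the new baseline
        return changes


watcher = SchemaWatcher()
watcher.observe("crm.orders", {"id": 1, "amount": 10})  # sets the baseline, no changes
print(watcher.observe("crm.orders", {"id": 2, "amount": 12, "currency": "EUR"}))
# -> ["crm.orders: new field 'currency'"]
```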

What are your impressions of the Data Innovation Summit now that the conference is over?

The conference itself was organized at a very high level. Running nine parallel tracks, along with several workshops, all in one enormous enclosed space where you don’t have to move from building to building, is truly impressive. At every step you also come across exhibitor booths showcasing platforms you can learn about and gather information from. I was particularly impressed by the Agorify platform, through which the entire event was organized. Their application offers a wealth of useful information and features, such as scheduling meetings with conference attendees, scanning QR codes generated for each visitor (to facilitate future connections), creating your own agenda of conference sessions, and more. Our conferences in Croatia may not yet be at that level, but I believe we will soon adopt some of these good practices by learning from large-scale conferences like this one.
