Modern Data Pipelines
Building a Streaming Data Lakehouse
Event-driven applications don’t just give you isolated systems; they create a continuous stream of valuable business facts flowing through your organization.
The Data Lakehouse is the macro-architecture pattern that harnesses these data streams, enabling a modern data platform where real-time insights and historical data come together to unlock smarter decisions and innovation.
Our Promise
Value Proposition
Practical
Gain real-world experience through exercises and labs. Our trainings include practical exercises and demos tailored to the environment you already work in.
Expert Guidance
Learn from experienced, certified professionals. We’re here to help you and your organization improve: ask questions, get feedback, and learn alongside your peers.
Customized
Sessions are tailored to your team’s specific requirements. Before each training we meet with you to fully understand your goals and adapt the content accordingly.
Proven Expertise
Industry-expert trainers with years of experience
Tailor-made
Tailored content to meet the needs of your company.
Hands-on
Hands-on, practical training with real-world applications
Flexible
On-site, remote, and self-paced options.
Always Learn More
Potential Topics
Build real-time pipelines that clean, enrich, and transform data.
Examples:
- Ingest from Kafka → filter + transform → store in S3/Snowflake
- Join user behavior streams with product data
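To make the ingest → filter → transform flow concrete, here is a minimal plain-Python sketch of such a pipeline. The event shape, the `product_catalog` lookup, and all function names are illustrative assumptions, not a real Kafka or Flink API; a production job would read from a Kafka topic and write to a sink like S3 or Snowflake.

```python
# Hypothetical streaming ETL sketch: ingest -> filter -> transform.
# Event and catalog shapes are invented for illustration.

def is_valid(event):
    """Filter step: drop events without a user id."""
    return event.get("user_id") is not None

def enrich(event, product_catalog):
    """Transform step: join the event with static product data."""
    product = product_catalog.get(event["product_id"], {})
    return {**event, "category": product.get("category", "unknown")}

def run_pipeline(raw_events, product_catalog):
    """Ingest -> filter -> transform; a sink would consume this generator."""
    for event in raw_events:
        if is_valid(event):
            yield enrich(event, product_catalog)

catalog = {"p1": {"category": "books"}}
events = [
    {"user_id": "u1", "product_id": "p1"},
    {"user_id": None, "product_id": "p1"},  # dropped by the filter
]
result = list(run_pipeline(events, catalog))
```

The same filter/map/join structure carries over directly to Flink’s DataStream API or Kafka Streams; only the operators change.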
Detect issues before they impact your users.
Examples:
- Alert on error spikes in server logs
- Spot temperature anomalies in IoT sensors
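One common way to spot such spikes is a rolling-window deviation check. The sketch below is a self-contained illustration of that idea (window size and threshold are arbitrary assumptions), not production monitoring code.

```python
# Rolling z-score style anomaly detection over a stream of readings.
from collections import deque
from statistics import mean, stdev

def detect_spikes(readings, window=5, threshold=3.0):
    """Flag a reading that deviates more than `threshold` standard
    deviations from the rolling window of previous readings."""
    history = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                anomalies.append(i)
        history.append(value)
    return anomalies

temps = [20.1, 20.3, 19.9, 20.2, 20.0, 35.0, 20.1]
anomalies = detect_spikes(temps)
# the 35.0 reading at index 5 is flagged
```

In a streaming engine the deque would become windowed state keyed by sensor id.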
Identify suspicious behavior in real-time.
Examples:
- Flag repeated failed logins or odd transactions
- Common in fintech and banking
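Flagging repeated failed logins boils down to counting events per user in a sliding time window. The following sketch shows that logic with invented event tuples and thresholds; a real fraud system would keep this state in the stream processor, keyed by user.

```python
# Sliding-window failed-login detection (illustrative thresholds).
from collections import defaultdict, deque

def flag_failed_logins(events, max_failures=3, window_seconds=60):
    """Flag a user who accumulates `max_failures` failed logins
    within a sliding window of `window_seconds`.
    Each event is (timestamp, user, success)."""
    recent = defaultdict(deque)  # user -> timestamps of recent failures
    flagged = set()
    for ts, user, ok in events:
        if ok:
            continue
        q = recent[user]
        q.append(ts)
        while q and ts - q[0] > window_seconds:
            q.popleft()  # expire failures outside the window
        if len(q) >= max_failures:
            flagged.add(user)
    return flagged

events = [
    (0, "alice", False), (10, "alice", False), (20, "alice", False),
    (5, "bob", False), (200, "bob", False),  # too far apart to flag
]
flagged = flag_failed_logins(events)
```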
Enable predictive power on the stream.
Examples:
- Real-time churn predictions
- Call out to external ML models via REST/gRPC
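A stream job typically enriches each event with a model score as it flows by. In the sketch below, `score_churn` is a stand-in heuristic so the example is self-contained; in production it would be the REST/gRPC call to an external model service mentioned above.

```python
# Enriching a stream with churn predictions (stand-in model, not a real API).

def score_churn(features):
    """Placeholder for an external model call, e.g. a POST to a
    model-serving endpoint returning a churn probability."""
    return min(1.0, features["days_inactive"] / 30)

def enrich_with_prediction(events):
    for event in events:
        event = dict(event)  # avoid mutating the input
        event["churn_probability"] = score_churn(event)
        yield event

stream = [
    {"user": "u1", "days_inactive": 15},
    {"user": "u2", "days_inactive": 45},
]
scored = list(enrich_with_prediction(stream))
```

Keeping the model call behind a small function like `score_churn` makes it easy to swap the heuristic for a real service call or an embedded model.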
Design complex business logic across events.
Examples:
- Detect cart abandonment → trigger retargeting
- Monitor call sequences in telecom
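Cart abandonment is a textbook complex-event-processing pattern: an `add_to_cart` event that is not followed by a `checkout` within a timeout. Here is a minimal sketch of that state machine over an in-memory event list; event tuples and the timeout are illustrative assumptions.

```python
# CEP sketch: detect add_to_cart not followed by checkout within `timeout`.

def detect_abandoned_carts(events, timeout=30):
    """Each event is (timestamp, user, action). Returns users whose
    cart timed out without a checkout."""
    pending = {}    # user -> time of last add_to_cart
    abandoned = []
    for ts, user, action in sorted(events):
        # expire carts whose timeout passed before this event arrived
        for u, t0 in list(pending.items()):
            if ts - t0 > timeout:
                abandoned.append(u)
                del pending[u]
        if action == "add_to_cart":
            pending[user] = ts
        elif action == "checkout":
            pending.pop(user, None)
    # anything still pending at end of stream counts as abandoned
    abandoned.extend(pending)
    return abandoned

events = [
    (0, "a", "add_to_cart"), (10, "a", "checkout"),   # completed
    (5, "b", "add_to_cart"),                          # times out
    (60, "c", "add_to_cart"),                         # never checks out
]
abandoned = detect_abandoned_carts(events)
```

In a real deployment the timeout would be driven by event-time timers (e.g. Flink’s CEP library or keyed timers), not by the arrival of later events.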
Delight users with real-time personalization.
Examples:
- Netflix uses Flink to update recommendations live
- E-commerce platforms show dynamic product suggestions
Power live dashboards with streaming analytics.
Examples:
- Live KPIs: revenue, active users, orders
- Alibaba uses Flink for Singles’ Day analytics
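The core idea behind live KPIs is incremental aggregation: each order event updates the dashboard state in O(1) instead of recomputing totals from the full history in batch. A minimal sketch, with an invented order shape:

```python
# Incremental KPI aggregation over an order stream (illustrative shape).

class LiveKpis:
    """Dashboard state updated one event at a time."""

    def __init__(self):
        self.revenue = 0.0
        self.orders = 0
        self._users = set()

    def on_order(self, order):
        self.orders += 1
        self.revenue += order["amount"]
        self._users.add(order["user"])

    @property
    def active_users(self):
        return len(self._users)

kpis = LiveKpis()
for order in [
    {"user": "u1", "amount": 40.0},
    {"user": "u2", "amount": 60.0},
    {"user": "u1", "amount": 25.0},  # repeat user, still one active user
]:
    kpis.on_order(order)
```

A stream processor keeps this kind of state per key and per window; the exact-set `active_users` here would often be replaced by an approximate sketch (e.g. HyperLogLog) at scale.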
Always Learn More
What you'll learn
Introduction to Event-Driven Systems
- From Batch to Real-Time Processing
- Event-Driven Architectures (EDA)
- Messaging basics with Kafka
Important Architectures
- Self-Contained Systems (SCS) & Microservices
- CQRS & the Dual Write Problem
- Pipes & Filters Architecture
- Choreography
- Group Assignment: Evaluate pre-defined architectures using balanced scorecards. Identify pros & cons of different styles.
Data Lakehouse Architecture
- The Evolution: Data Warehouse → Data Lake → Data Lakehouse
- Core Components of a Modern Data Lakehouse
- From Parquet Files to Open Table Formats (Paimon, Iceberg, Hudi)
- Why Compute & Storage Must Be Decoupled
Upcoming
Next Training
Dates for 2026 will follow soon!
- Location: Bern (on-site) or Virtual
- Date: tbd
- Time: 9:00 – 17:00
- Price: CHF 2,200
- Includes: Course materials, access to sandbox environment, lunch & coffee
- Minimum 4 participants to run
Why choose us?
Over 300 Engineers Trained
It all starts with a message
Training Request
Frequently Asked Questions
- Contact us to discuss your needs.
- Collaborate with experts to create a customized agenda.
- Receive a non-binding offer and finalize your plan.
- Start your tailored Kafka training.
We recommend the training for:
Software Engineers working on data-intensive applications
Data Engineers building real-time ETL pipelines
To benefit from our Kafka trainings, participants should have a basic understanding of distributed systems and programming concepts. Familiarity with Java or another programming language is recommended but not mandatory. Our trainings are designed to accommodate a range of experience levels, from beginners to advanced users. We recommend grouping participants by experience level, so that no one is bored or overwhelmed.
The costs depend on the duration and the number of participants, and will be discussed during the first meeting.