We’re looking for a Senior Software Engineer (Streaming Pipelines) to join our team reimagining how merchant data flows through modern streaming architectures. This isn’t your typical ETL role: you’ll be crafting solutions that challenge conventional approaches to data processing at global scale.
What makes this exciting?
- We’ve developed a declarative pipeline framework using Apache Beam, Google Cloud Dataflow, and ClickHouse that transforms how data engineers build streaming systems. Imagine YAML-driven pipelines that eliminate boilerplate code, real-time stream processing across multiple regions, and an architecture that makes complex data transformations feel effortless (a hypothetical sketch of such a pipeline definition follows this list).
- You’ll work across multiple languages – Kotlin, Ruby, Python, and Rust – choosing the right tool for each challenge, alongside dbt for elegant data modelling and our custom framework that turns pipeline development into a configuration exercise rather than a coding marathon.
- The puzzle? Replacing entrenched batch systems with a streaming-first architecture while merchants never notice the transition.
- You’ll tackle fascinating problems: How do you handle late-arriving data in distributed streams? (One common answer is sketched after this list.) What’s the most elegant approach to backfilling terabytes while maintaining real-time processing? How do you architect lightning-fast real-time modelling that combines data from multiple tables?
- We embrace AI and LLMs to accelerate repetitive tasks, freeing you to focus on the creative problem-solving that makes this work truly rewarding.
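
To make “declarative” concrete, here is a purely illustrative sketch of what a YAML pipeline definition in a framework like this might look like. Every field name below is invented for the example; it is not the framework’s actual schema.

```yaml
# Hypothetical pipeline definition: every field name here is invented
# for illustration and is not the framework's actual schema.
pipeline: merchant_orders_enriched
source:
  type: pubsub
  topic: projects/example-project/topics/merchant-orders
transforms:
  - type: deduplicate
    key: order_id
  - type: join
    with: merchants
    on: merchant_id
sink:
  type: clickhouse
  table: orders_realtime
```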
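
And on the late-data question: one well-known answer in the Apache Beam model is event-time windowing with allowed lateness and a trigger that re-fires when late elements arrive. A minimal sketch using the Beam Python SDK follows; the element data, window size, and lateness bounds are invented for illustration, not taken from the team’s pipelines.

```python
import apache_beam as beam
from apache_beam.transforms import trigger, window

# Minimal sketch: fixed one-minute event-time windows, a trigger that fires
# at the watermark and again 30s after each batch of late data, and ten
# minutes of allowed lateness before window state is discarded.
with beam.Pipeline() as p:
    (
        p
        | "Create" >> beam.Create([("shop_1", 1), ("shop_2", 1), ("shop_1", 1)])
        # In a real streaming job the source assigns event timestamps
        # (e.g. Pub/Sub publish time); here we stamp elements by hand.
        | "Stamp" >> beam.Map(lambda kv: window.TimestampedValue(kv, 0))
        | "Window" >> beam.WindowInto(
            window.FixedWindows(60),
            trigger=trigger.AfterWatermark(
                late=trigger.AfterProcessingTime(30)),
            allowed_lateness=600,
            accumulation_mode=trigger.AccumulationMode.ACCUMULATING)
        | "Count" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```

Accumulating mode means each late firing emits a corrected count for the whole window rather than a delta, which keeps downstream sinks simple at the cost of reprocessing.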