- Build and scale enterprise-grade event streaming platforms
- Work with cutting-edge cloud, DevOps, and data streaming technologies
- Collaborate with high-performing teams on impactful integration projects
We’re looking for a Confluent Kafka Engineer to join our Enterprise Integration team. You’ll design, implement, and manage Kafka-based solutions at scale, enabling seamless integration across critical business platforms.
As part of an agile environment, you’ll be hands-on with Confluent Cloud, Kafka connectors, Java, Flink, KSQL, and cloud-native DevOps tooling such as Git, Docker, Kubernetes, Helm, and Terraform.
What you’ll do
- Deliver enterprise-scale Confluent Kafka streaming solutions
- Implement connectors, replicators, and security controls
- Design and develop integration solutions, from concept to deployment
- Collaborate across teams to ensure resilience, scalability, and performance
- Champion best practices in coding, documentation, and automation
What you’ll bring
- Expertise in Confluent Kafka design, architecture, and management
- Strong background in Java, Flink, KSQL, and streaming frameworks
- DevOps-first mindset with Kubernetes, Docker, and Terraform
- Experience with Kafka administration and performance troubleshooting
- Bonus: exposure to AWS, GCP, or Azure, plus scripting skills (Python/Bash)
If you’re passionate about real-time data, event streaming, and large-scale integration, we’d love to hear from you.
BH5943