Senior Kafka Platform Engineer - High-Volume Data Streaming Expert
Location: Krakow, Poland
Contract Type: Permanent
- Join a global leader in cutting-edge data streaming solutions
- Collaborate with a world-class team of engineers and innovators
- Competitive salary and comprehensive benefits package
- Opportunities for professional growth and career advancement
- Flexible work arrangements and excellent work-life balance
Our client, a renowned technology company, is seeking a talented Senior Kafka Platform Engineer to join their dynamic team in Krakow, Poland. As a key member of the engineering group, you will play a crucial role in designing, implementing, and maintaining high-performance data streaming solutions using Apache Kafka.
Position Overview
In this pivotal role, you will leverage your expertise in Kafka to develop and optimize data streaming pipelines that process massive volumes of data in real-time. You will collaborate closely with cross-functional teams to understand business requirements, architect scalable solutions, and ensure the reliability and performance of the Kafka platform. Your work will have a significant impact on the company's ability to deliver innovative products and services to its global customer base.
Responsibilities
- Provide expertise in Kafka brokers, Apache ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Confluent Control Center
- Design and implement high-volume data streaming solutions using Kafka Connect and Schema Registry
- Administer and operate the Kafka platform, including provisioning, access control, and Kerberos and SSL/TLS configuration
- Develop and maintain Kafka connectors (e.g., MQ, Elasticsearch, JDBC, FileStream, and JMS source connectors), as well as custom connectors built on Kafka's core concepts and APIs
- Create topics, configure cluster redundancy and replication, deploy monitoring tools, and set up alerts for producers, consumers, and consumer groups (see the sketch after this list)
- Conduct root cause analysis of production incidents, document findings, and implement proactive measures to enhance system reliability
- Automate routine tasks using scripts and automation tools, and perform data-related benchmarking, performance analysis, and tuning
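To give a concrete flavor of this kind of platform work, below is a minimal sketch of creating a replicated topic with Kafka's Java AdminClient. The broker address, topic name, and settings are illustrative assumptions, not details of the client's environment.

```java
import java.util.Map;
import java.util.Properties;
import java.util.Set;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTopicSketch {
    public static void main(String[] args) throws Exception {
        // Illustrative bootstrap address; a real cluster would supply this via managed config.
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // 6 partitions, replication factor 3 for redundancy across brokers,
            // and min.insync.replicas=2 so acknowledged writes survive one broker failure.
            NewTopic topic = new NewTopic("orders.events", 6, (short) 3)
                    .configs(Map.of("min.insync.replicas", "2"));

            admin.createTopics(Set.of(topic)).all().get();

            // Verify the topic is now visible to the cluster.
            System.out.println(admin.listTopics().names().get());
        }
    }
}
```

Replication factor 3 with min.insync.replicas=2 is a common durability baseline: when producers use acks=all, a write is acknowledged only after at least two replicas have it.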
Requirements
- Strong problem-solving and analytical skills
- In-depth understanding of Kafka architecture and its components
- Experience with cluster maintenance processes, implementing changes, and recommending fixes to protect production environments
- Proficiency operating in an environment built on infrastructure-as-code and automation-first principles
- Expertise in messaging technologies such as Apache Kafka and Confluent Platform (a minimal producer sketch follows this list)
- Familiarity with DevOps toolsets, including GitHub, JIRA, Confluence, and Jenkins
- Knowledge of automation tools like Ansible or Puppet
- Experience with observability tools such as Datadog, New Relic, Prometheus, and Grafana
- Excellent communication skills, with the ability to engage engineers and stakeholders at a technical level
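As a small illustration of the messaging expertise listed above, here is a minimal Java producer sketch configured for SSL; the endpoint, truststore path, and password are placeholder assumptions rather than the client's actual configuration.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SslProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder endpoint and truststore; real values come from the platform's config management.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9093");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put("security.protocol", "SSL");
        props.put("ssl.truststore.location", "/etc/kafka/secrets/client.truststore.jks");
        props.put("ssl.truststore.password", "changeit");
        // acks=all pairs with min.insync.replicas on the topic for durable writes.
        props.put(ProducerConfig.ACKS_CONFIG, "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders.events", "order-42", "created"));
            producer.flush();
        }
    }
}
```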
Benefits
- Competitive salary and performance-based bonuses
- Comprehensive health insurance and wellness programs
- Generous paid time off and flexible work arrangements
- Professional development opportunities, including training and certifications
- Collaborative and inclusive work environment that values diversity and innovation
As part of a global leader in data streaming solutions, you'll have the opportunity to work on cutting-edge projects that shape the future of real-time data processing. Our client fosters a culture of innovation, collaboration, and continuous learning, providing you with the support and resources to grow your career and make a lasting impact in the field.