Senior Kafka Engineer
Company: Macpower Digital Assets Edge
Location: Tempe, AZ 85281
Description:
Job Overview: We are seeking an experienced Kafka Engineer with expertise in Confluent Kafka, Java/Scala, and distributed systems. The ideal candidate should be skilled in designing scalable, fault-tolerant Kafka-based data pipelines, troubleshooting messaging issues, and optimizing performance. A strong background in cloud deployments, microservices, and Agile development with an automate-first approach is essential.
Responsibilities:
- Identify and resolve Kafka messaging issues within agreed timeframes.
- Collaborate with business and IT teams to understand business problems and design, implement, and deliver appropriate solutions using Agile methodology within a larger program.
- Work independently to implement solutions across multiple environments (DEV, QA, UAT, PROD).
- Provide technical direction, guidance, and code reviews for other engineers working on the same project.
- Administer distributed Kafka clusters in DEV, QA, UAT, and PROD environments and troubleshoot performance issues (see the AdminClient sketch after this list).
- Implement and debug subsystems, microservices, and components.
- Follow an automate-first/automate-everything philosophy.
- Demonstrate hands-on experience with programming languages relevant to the role.
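Much of the cluster administration and troubleshooting described above can be approached with the standard Kafka AdminClient. The following is only a minimal Java sketch, not a prescribed tool or process for this role; the bootstrap address is a placeholder for whichever environment (DEV, QA, UAT, or PROD) is being checked.

```java
import java.util.Properties;
import java.util.concurrent.ExecutionException;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.DescribeClusterResult;

/** Minimal cluster health check: prints broker count, controller, and topic names. */
public class ClusterCheck {
    public static void main(String[] args) throws ExecutionException, InterruptedException {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder address

        try (AdminClient admin = AdminClient.create(props)) {
            DescribeClusterResult cluster = admin.describeCluster();
            System.out.println("Broker count: " + cluster.nodes().get().size());
            System.out.println("Controller:   " + cluster.controller().get());
            System.out.println("Topics:       " + admin.listTopics().names().get());
        }
    }
}
```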
Key Skills & Expertise:
- Deep understanding of Confluent Kafka - Proficient in Kafka concepts, including producers, consumers, topics, partitions, brokers, and replication mechanisms (a minimal producer/consumer sketch follows this list).
- Programming proficiency - Expertise in Java or Scala, with potential Python usage depending on the project.
- System design and architecture - Ability to design robust, scalable Kafka-based data pipelines considering data throughput, fault tolerance, and latency.
- Data management skills - Knowledge of data serialization formats such as JSON, Avro, and Protobuf, and schema evolution management.
- Kafka Streams API (optional) - Familiarity with Kafka Streams for real-time data processing within the Kafka ecosystem.
- Monitoring & troubleshooting - Experience with Kafka cluster health monitoring, identifying performance bottlenecks, and troubleshooting issues.
- Cloud integration - Experience deploying and managing Kafka on AWS, Azure, or GCP.
- Understanding of distributed systems concepts.
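To illustrate the producer, consumer, topic, and partition concepts listed above, here is a minimal Java sketch. The topic name, consumer group, and bootstrap address are hypothetical, and the plain JSON string stands in for whatever serialization format (Avro, Protobuf, etc.) a real pipeline would use.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

/** Sends one record to a topic, then reads it back; topic and servers are placeholders. */
public class ProduceConsumeExample {
    public static void main(String[] args) {
        String topic = "orders"; // hypothetical topic name

        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.ACKS_CONFIG, "all"); // wait for all in-sync replicas

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            // The key determines the partition, so records with the same key stay ordered.
            producer.send(new ProducerRecord<>(topic, "order-42", "{\"status\":\"created\"}"));
        }

        Properties consumerProps = new Properties();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "orders-reader"); // hypothetical group
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(List.of(topic));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> r : records) {
                System.out.printf("partition=%d offset=%d key=%s value=%s%n",
                        r.partition(), r.offset(), r.key(), r.value());
            }
        }
    }
}
```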
Must-Have Qualifications:
- 8-12 years of experience in software engineering.
- Kafka expertise - Deep knowledge of Kafka producers, consumers, topics, partitions, brokers, and replication.
- Programming proficiency - Strong in Java or Scala, with potential Python usage.
- System design & architecture - Experience in designing high-throughput, scalable Kafka pipelines.
- Cloud & DevOps - Experience deploying Kafka on AWS, Azure, or GCP.
- Monitoring & troubleshooting - Familiarity with Kafka cluster health monitoring and performance tuning (an example of tuning-oriented producer settings follows this list).
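As a rough illustration of the throughput/latency tuning mentioned above, the Java sketch below collects a few commonly adjusted producer settings. The values are illustrative starting points rather than recommendations, and the helper name is hypothetical.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.ProducerConfig;

/** Producer settings that commonly trade latency for throughput and durability. */
public class ProducerTuning {
    public static Properties highThroughputProps(String bootstrapServers) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.ACKS_CONFIG, "all");                            // durability: wait for in-sync replicas
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");             // avoid duplicates on retry
        props.put(ProducerConfig.LINGER_MS_CONFIG, "20");                        // small delay to fill larger batches
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, String.valueOf(64 * 1024));  // bigger batches per partition
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4");                // less network/disk at some CPU cost
        // Key/value serializers still need to be set by the caller.
        return props;
    }
}
```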
