Kafka Data Engineer
Company: Compunnel Software Group
Location: Montreal, QC H1A 0A1
Description:
Job Summary:
We are seeking a Kafka Data Engineer to join the team responsible for developing and maintaining the firm's Trade Capture data stores, which handle large volumes of transactional data for both real-time and archival processing. This role requires both hands-on technical skills and the ability to collaborate across global teams, working closely with business owners, operations, and infrastructure partners to support critical trading applications and data pipelines.
Job Responsibilities:
- Configure and manage Kafka clusters (Kafka plants) to support high-performance trading applications (an illustrative configuration sketch follows this list).
- Build and maintain monitoring and observability tools to proactively predict, detect, and prevent system outages.
- Collaborate with application developers and infrastructure teams to diagnose production issues and implement effective solutions.
- Participate in agile development processes, including sprint planning, daily standups, and retrospectives.
- Provide early morning support coverage for Asia time zones as needed.
- Contribute to real-time and archival data processing pipelines and ensure data is accurately streamed into data lakes and archives.
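As an illustration of the hands-on cluster and topic work described above, here is a minimal sketch, assuming the confluent-kafka Python client; the broker address, topic name, and settings are placeholders, not details of the actual environment:

    # Illustrative sketch only: broker address, topic name, and settings are placeholders.
    from confluent_kafka.admin import AdminClient, NewTopic

    admin = AdminClient({"bootstrap.servers": "broker1:9092"})

    # A hypothetical trade-capture topic, tuned for high throughput and long retention
    # so downstream archival consumers can replay data into the data lake.
    topic = NewTopic(
        "trade-capture.events",
        num_partitions=12,
        replication_factor=3,
        config={
            "retention.ms": str(7 * 24 * 60 * 60 * 1000),  # keep one week on the brokers
            "min.insync.replicas": "2",                     # durability for trading data
            "compression.type": "lz4",
        },
    )

    # create_topics() returns a dict of topic name -> future; wait for each result.
    for name, future in admin.create_topics([topic]).items():
        try:
            future.result()
            print(f"created {name}")
        except Exception as exc:
            print(f"failed to create {name}: {exc}")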
Required Skills & Experience:
- Hands-on experience with Kafka, including setup, configuration, tuning, and troubleshooting.
- Strong scripting abilities using Python and Unix/Korn shell (ksh) for automation and system management (see the lag-check sketch after this list).
- Solid understanding of database concepts and data modeling principles.
- Familiarity with the lifecycle of a trade and data flows in investment banking operations.
- Proven experience working in Agile/Scrum environments with collaborative, iterative development.
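To illustrate the Python scripting and Kafka troubleshooting skills listed above, a minimal consumer-lag check, again assuming the confluent-kafka client; the topic, consumer group, and partition count are placeholders:

    # Illustrative sketch only: broker, topic, group ID, and partition count are placeholders.
    from confluent_kafka import OFFSET_INVALID, Consumer, TopicPartition

    consumer = Consumer({
        "bootstrap.servers": "broker1:9092",
        "group.id": "trade-archiver",  # hypothetical archival consumer group
        "enable.auto.commit": False,
    })

    partitions = [TopicPartition("trade-capture.events", p) for p in range(12)]

    # Compare each partition's committed offset with its high watermark to estimate lag,
    # the kind of signal an observability job would publish to an alerting system.
    for tp in consumer.committed(partitions, timeout=10):
        low, high = consumer.get_watermark_offsets(tp, timeout=10)
        committed = tp.offset if tp.offset != OFFSET_INVALID else low
        print(f"partition {tp.partition}: lag={high - committed}")

    consumer.close()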
Preferred Skills:
- Experience with Kafka Connect, Kafka Streams, and schema registry (e.g., Confluent Platform); an example follows this list.
- Exposure to cloud platforms (AWS, Azure, GCP) and container orchestration (e.g., Kubernetes).
- Familiarity with CI/CD pipelines, DevOps practices, and infrastructure as code.
- Knowledge of big data tools such as Spark, Hadoop, or similar.
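As a hedged example of the preferred Kafka Connect experience, the sketch below registers a sink connector through the Connect REST API from Python; the Connect host, bucket, and connector settings are assumptions for illustration only:

    # Illustrative sketch only: the Connect host, bucket, and connector settings are assumptions.
    import json
    import requests

    connector = {
        "name": "trade-capture-s3-sink",  # hypothetical sink archiving a topic to a data lake
        "config": {
            "connector.class": "io.confluent.connect.s3.S3SinkConnector",
            "topics": "trade-capture.events",
            "s3.bucket.name": "trade-archive",
            "s3.region": "us-east-1",
            "storage.class": "io.confluent.connect.s3.storage.S3Storage",
            "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
            "flush.size": "10000",
        },
    }

    # Kafka Connect exposes a REST API; POST /connectors registers a new connector.
    resp = requests.post(
        "http://connect-host:8083/connectors",
        headers={"Content-Type": "application/json"},
        data=json.dumps(connector),
    )
    resp.raise_for_status()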
Soft Skills:
- Strong problem-solving skills with a proactive and analytical approach.
- Excellent communication and collaboration abilities in a globally distributed team environment.
- Ability to take ownership and drive tasks to completion with minimal supervision.
Certifications (Optional but a Plus):
Confluent Certified Developer for Apache Kafka (CCDAK)
AWS Certified Data Analytics - Specialty or similar cloud/data certifications
Education: Bachelor's Degree
Certification: Confluent Certified Developer for Apache Kafka, AWS Certified Data Analytics