Sr. AI Data Engineer to design large scale streaming and batch data pipelines - 130225-195435
Company: Annex Consulting Group
Location: Toronto, ON M4E 3Y1
Description:
Position: Sr. Data Engineer to design large scale streaming and batch data pipelines
Duration: 6 mo
Location: Hybrid Toronto
JOB ID: 130225-195435
Required Skills
- Experience with contact-center technology and AI
- 6+ years of professional software engineering and programming experience (Java, Python) with a focus on designing and developing complex data-intensive applications
- 3+ years of architecture and design (patterns, reliability, scalability, quality) of complex systems
- Advanced coding skills and practices (concurrency, distributed systems, functional principles, performance optimization)
Preferred Skills
- In-depth knowledge of software and data engineering best practices
- Experience in mentoring and leading junior engineers
- Experience in serving as the technical lead for complex software development projects
- Experience with large-scale distributed data technologies and tools
- Strong experience with multiple database models (relational, document, in-memory, search, etc.)
- Strong experience with data streaming architecture (Kafka, Spark, Airflow, SQL, NoSQL, CDC, etc.)
- Strong knowledge of cloud data platforms and technologies such as GCS, BigQuery, Cloud Composer, Pub/Sub, Dataflow, Dataproc, Looker, and other cloud-native offerings
- Strong knowledge of Infrastructure as Code (IaC) and associated tools (Terraform, Ansible, etc.)
- Experience pulling data from a variety of data source types including Mainframe (EBCDIC), fixed-length and delimited files, and databases (SQL, NoSQL, Time-series)
- Strong coding skills for analytics and data engineering (Java, Python, and Scala)
- Experience performing analysis with large datasets in a cloud-based environment, preferably with an understanding of Google's Cloud Platform (GCP)
- Understands how to translate business requirements to technical architectures and designs
- Comfortable communicating with various stakeholders (technical and non-technical)
What you'll do
- Apply your expertise in data and software engineering to design and implement data products that meet stringent requirements for scalability, reliability, maintainability, flexibility, auditability, and quality
- Be T-Shaped: Your primary area is data engineering, but you are comfortable working in a secondary area of expertise such as data presentation/visualization, backend engineering, or data modelling (SQL, NoSQL, Graph & Time-series)
- Work closely with cross-functional teams of data, backend and frontend engineers, product owners, technical product owners, and technical support personnel
- Gain technical expertise in building a data platform at scale to solve business, product, and technical use cases
- Get hands-on experience with technologies such as Elasticsearch, Apache Airflow, Apache Kafka, Apache Beam, Apache Spark, Hive, HDFS, and Kubernetes (OpenShift)
- Get hands-on experience with Google Cloud Platform and technologies such as BigQuery, Cloud Composer, Pub/Sub, Dataflow, Dataproc, GCS, Looker, and other cloud-native offerings in GCP