Snowflake - Data Engineer
Company: Omni Inclusive
Location: Boca Raton, FL 33433
Description:
Total of 10+ years in a data engineering role, with 4+ years of recent experience with Snowflake.
Extensive experience in design, development and support of complex ETL solutions
Ability to design and implement highly performant data ingestion pipelines from multiple sources using DataStage, Snowpipe, and SnowSQL.
In-depth knowledge of Snowpipe, SnowSQL, and stored procedures
Good knowledge of Agile processes and able to work with Scrum teams.
Experience in DataStage and Snowflake performance optimization
Hands-on development experience with Snowflake data platform features including Snowpipe, SnowSQL, tasks, stored procedures, streams, resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, cloning, Time Travel, data sharing, and their respective use cases.
Advanced proficiency in writing complex SQL statements and manipulating large structured and semi-structured datasets.
Build processes supporting data transformation, data structures, metadata, dependency and workload management.
Demonstrable experience designing and implementing modern data warehouse/data lake solutions with an understanding of best practices.
Ready to cover on-call support on a rotation basis
Proficiency in the Snowflake Cloud Data Platform and familiarity with the AWS/Azure cloud platforms
Strong leadership qualities and the ability to coordinate with offshore teams
Ability to provide technical guidance to the data engineering team on data pipeline design and enhancements
Strong experience in ETL, including data migration and data consolidation
Hands-on experience in ETL data loading around event and messaging patterns, streaming data, Kafka, and APIs
Understanding of DevOps fundamentals: CI/CD, Git and Git workflows, and SaaS-based Git tools such as GitHub, GitLab, and Bitbucket
Experience working in an agile application development environment
Ability to proactively prioritize tasks in consultation with business stakeholders, Product Owners, and Product Managers
Design, build, deploy, and support DataStage ETL jobs that extract data from disparate source systems, then transform and load it into the EDW for data mart consumption, self-service analytics, and data visualization tools.
Ensure data quality, efficient processing, and timely delivery of accurate and trusted data.
The ability to design, implement and optimize large-scale data and analytics solutions on Snowflake Cloud Data Warehouse is essential.
Establish ongoing end-to-end monitoring for the data pipelines.
Strong understanding of full CI/CD lifecycle.
Convert business requirements into technical solutions
Ensure adherence to architectural guidelines and strategic business needs
Perform technical feasibility analysis, make recommendations, and provide effort estimates
Provide operational instructions for dev, QA, and production code deployments while adhering to internal Change Management processes.
Performance optimization
QA support
Automation
Good to Have:
Valid professional certification
Experience with Python and a Big Data cloud platform
Expertise in Unix and shell scripting