Data Engineer
Company: Technogen International Company
Location: Atlanta, GA 30349
Description:
Job Title: Data Engineer
Job Location: Remote
Responsibilities:
The Data Engineer is responsible for designing and developing robust, scalable solutions for collecting and analyzing large data sets, and for creating and maintaining data pipelines, data structures, and reports.
Understand business processes, applications, and how data is gathered, and tie application telemetry to the transactional data model.
Build and manage data marts to satisfy our growing data needs.
Develop and manage data pipelines at enterprise scale.
Build data expertise and own data quality for various data flows.
Use your coding skills across several languages, such as SQL and Python, to support analysts and data scientists (a brief sketch of this kind of work follows this list).
Interface with internal data consumers to understand their data needs.
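As an illustrative sketch of the kind of SQL-plus-Python work this role describes, the snippet below aggregates raw events into a daily summary table of the sort a data mart might hold. It uses an in-memory SQLite database so it runs standalone; all table and column names are hypothetical, not taken from the posting.

    # Illustrative only: aggregate hypothetical raw events into a
    # daily summary table, the kind of step analysts might consume.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE raw_events (user_id INTEGER, event_date TEXT, amount REAL);
        INSERT INTO raw_events VALUES
            (1, '2024-01-01', 10.0),
            (1, '2024-01-01', 5.0),
            (2, '2024-01-02', 7.5);
        -- Build the summary "mart" table from the raw events.
        CREATE TABLE daily_summary AS
            SELECT event_date,
                   COUNT(DISTINCT user_id) AS active_users,
                   SUM(amount)             AS total_amount
            FROM raw_events
            GROUP BY event_date;
    """)
    for row in conn.execute("SELECT * FROM daily_summary ORDER BY event_date"):
        print(row)
    conn.close()

In practice the same pattern would target a warehouse such as Snowflake or Redshift, with the SQL parameterized by run date.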
Requirements:
2+ years of SQL experience (Hive, Snowflake, AWS Redshift, etc.) is required.
Experience in custom or structured (e.g., Airflow / Informatica / Talend / Pentaho) ETL design, implementation, and maintenance (a minimal DAG sketch follows this list).
Ability to analyze data to identify deliverables, gaps, and inconsistencies.
Experience working with a MapReduce or MPP system at any size/scale.
Working within ±3 hours of Pacific Time is preferred.
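As an illustrative sketch of the structured ETL tooling named above, below is a minimal Airflow 2.x DAG that schedules one daily load task. The DAG id, task id, and placeholder load function are hypothetical; a real pipeline would replace the print with extract/load logic against the warehouse.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_and_load(**context):
        # Placeholder: a real task would pull from a source system and
        # load into the warehouse for the run date (context["ds"]).
        print(f"Loading raw_events for {context['ds']}")

    # Tasks created inside the "with DAG" block attach to it automatically.
    with DAG(
        dag_id="daily_events_pipeline",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ):
        PythonOperator(
            task_id="extract_and_load",
            python_callable=extract_and_load,
        )

Keeping each task idempotent for its run date (context["ds"]) is what makes backfills and reruns safe at enterprise scale.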