Marketing Data Engineer
Company: Codeforce360
Location: Seattle, WA 98115
Description:
Required Skills:
- Amazon Web Services (AWS), Git, Snowflake, ETL, Python, SQL.
Position Summary:
- The Marketing Data Engineer is a contractor who delivers technical solutions for the company's Marketing Data Engineering team.
- This team is responsible for building and operating our cloud-based Marketing data platform that enables self-service analytics.
- The Marketing Data Engineer will help solve challenging data integration problems by participating in the development and operations work of building the Marketing Data Platform.
- The role will collaborate with other Data Engineers, Data Scientists, and Analysts, as well as our internal marketing customers, to scope and define implementation plans that meet business requirements.
Project Scope:
- The Marketing team is currently migrating its legacy data architecture, along with its pipelining and ETL processes, to AWS, into a system that uses Airflow, custom Python modules, and dbt for data modeling.
- The Marketing Data Engineer will be responsible for building custom data pipelines from multiple data sources (API sources, flat files, etc.) within this new data architecture; an illustrative sketch of this kind of pipeline follows below.
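As a rough illustration of the pipeline work described in this scope (not the company's actual implementation), the sketch below shows a minimal Airflow DAG that pulls records from a hypothetical marketing API and loads them into a hypothetical Snowflake staging table. The endpoint (api.example.com), table name (stg_campaigns), connection details, and DAG id are placeholders; the sketch assumes Airflow 2.4+ and the snowflake-connector-python package.

from datetime import datetime

import requests
import snowflake.connector
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull one page of records from a placeholder marketing API endpoint.
    resp = requests.get("https://api.example.com/v1/campaigns", timeout=30)
    resp.raise_for_status()
    context["ti"].xcom_push(key="records", value=resp.json())


def load(**context):
    # Insert the extracted records into a placeholder Snowflake staging table.
    records = context["ti"].xcom_pull(key="records", task_ids="extract")
    conn = snowflake.connector.connect(
        account="my_account",  # placeholder connection details
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="MARKETING",
        schema="STAGING",
    )
    cur = conn.cursor()
    try:
        for rec in records:
            cur.execute(
                "INSERT INTO stg_campaigns (id, name) VALUES (%s, %s)",
                (rec["id"], rec["name"]),
            )
    finally:
        cur.close()
        conn.close()


with DAG(
    dag_id="marketing_api_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task

Note that XCom is only suitable for small payloads; larger extracts would typically be staged in object storage (e.g., S3) before loading into Snowflake.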
Primary Responsibilities (Essential Functions):
- Build robust, scalable data processing and data integration pipelines using Python and SQL.
- Develop data quality automation and unit tests to ensure the accuracy of the data delivered to the Analysts and Business Customers (a sample test sketch follows after this list).
- Build solutions that scale as our data volumes grow exponentially.
- Define and implement monitoring and alerting policies for data solutions.
- Develop data models that support analytical models used by the company.
- Test and document data pipelines that are developed.
- Regular, predictable job attendance.
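As a small example of the data quality automation mentioned above, the pytest-style check below validates a hypothetical deduplication transform. The function name dedupe_campaigns and the column names are illustrative only, not part of the company's codebase; the sketch assumes pandas and pytest are available.

import pandas as pd


def dedupe_campaigns(df: pd.DataFrame) -> pd.DataFrame:
    # Example transform: drop rows with null campaign IDs, then drop duplicate IDs.
    return df.dropna(subset=["campaign_id"]).drop_duplicates(subset=["campaign_id"])


def test_dedupe_removes_nulls_and_duplicates():
    raw = pd.DataFrame(
        {"campaign_id": [1, 1, None, 2], "spend": [10.0, 10.0, 5.0, 7.5]}
    )
    clean = dedupe_campaigns(raw)
    # No null keys and exactly one row per campaign should survive.
    assert clean["campaign_id"].notna().all()
    assert clean["campaign_id"].is_unique
    assert len(clean) == 2

Equivalent checks (unique, not_null) can also be expressed as dbt tests once the data lands in the warehouse.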
Knowledge and Skill Requirements:
- Demonstrated experience writing Python code.
- Good understanding of RESTful APIs, data streaming, and other data sourcing technologies.
- Demonstrated experience writing optimized SQL queries across large data sets.
- Experience with Git.
- Experience with Snowflake.
- Experience with Airflow.
- Experience with containers and container orchestration tools such as Docker and Kubernetes.
- Experience with CI/CD pipelines and the SDLC.
- An AWS Developer or Architect certification is preferred but not required.
Experience:
- BA/BS in Computer Science or equivalent.
- 4+ years' professional experience in a Data Engineering role.
- A technical background in ETL development and data warehousing concepts is preferred.
- Proven technical leadership and mentoring of other engineers.