AWS Snowflake Data Solutions Engineer

Company: V2 Innovations

Location: Portland, OR 97229

Description:

Design and Develop: Create reusable components, frameworks, and libraries on a large scale to support analytics products.

Collaborate: Work closely with business and technology teams to design and implement product features.

Data Management: Identify and address data quality issues, including data cleaning, preparation, and optimization for ingestion and consumption.

Architectural Improvements: Collaborate on new data management projects and enhancements to the current data architecture.

Automation: Implement automated workflows and routines using scheduling tools.
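
As a minimal sketch of the kind of scheduled workflow this role describes, the Apache Airflow DAG below chains an ingest step and a data-quality check. The DAG id, schedule, and task commands are illustrative assumptions, not details from this posting.

```python
# Minimal Airflow DAG sketch: a daily ingest-then-validate routine.
# The dag_id, schedule, and commands are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_ingest_example",      # hypothetical name
    schedule="@daily",                  # run once per day (Airflow 2.4+ argument)
    start_date=datetime(2024, 1, 1),
    catchup=False,                      # skip backfilling past runs
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw_data",
        bash_command="echo 'pull raw files into the landing zone'",
    )
    validate = BashOperator(
        task_id="validate_data_quality",
        bash_command="echo 'run data-quality checks'",
    )

    ingest >> validate  # validate runs only after ingest succeeds
```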

Frameworks: Build continuous integration, test-driven development, and production deployment frameworks.
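
To make the test-driven development expectation concrete, here is a small sketch: a hypothetical cleaning function paired with the kind of pytest-style unit test a CI pipeline would run on every commit. The function and data are illustrative assumptions.

```python
# Hypothetical transform plus a pytest-style unit test, the kind of
# check a continuous-integration pipeline would run on every commit.
def normalize_emails(rows):
    """Lowercase and strip email addresses, dropping empty values."""
    return [r.strip().lower() for r in rows if r and r.strip()]

def test_normalize_emails():
    raw = ["  Alice@Example.COM ", "", None, "bob@example.com"]
    assert normalize_emails(raw) == ["alice@example.com", "bob@example.com"]
```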

Data Analysis: Analyze and profile data to design scalable solutions.

Problem-Solving: Troubleshoot data issues and proactively resolve product and operational problems through root cause analysis.

Experience:
  • Strong grasp of data structures and algorithms.
  • Familiarity with solution and technical design.
  • Strong problem-solving and analytical skills.
  • Effective verbal and written communication with team members and business stakeholders.
  • Quick learning ability for new programming languages, technologies, and frameworks.
  • Experience creating scalable, real-time, high-performance data lake solutions in the cloud.
  • Working understanding of how complex data solutions are developed.
  • Experience in end-to-end solution design.
  • Willingness to acquire new skills and technologies.
  • Passion for data solutions.

Required and Preferred Skills:
  • Hands-on experience in AWS, including EMR (Hive, PySpark), S3, Athena, or equivalent cloud platforms.
  • Familiarity with Spark Structured Streaming.
  • Experience working with the Hadoop stack to handle large data volumes in a scalable manner.
  • Hands-on experience with SQL, ETL, data transformation, and analytics functions.
  • Proficiency in Python, including batch scripting, data manipulation, and building distributable packages.
  • Experience with batch orchestration tools, preferably Apache Airflow.
  • Proficiency with code versioning tools such as GitHub or Bitbucket, with an expert-level understanding of repository design and best practices.
  • Familiarity with deployment automation tools like Jenkins.
  • Hands-on experience designing and building ETL pipelines, with expertise in data ingestion, change data capture, data quality, and API development (see the sketch after this list).
  • Experience in designing and developing relational database objects, knowledge of logical and physical data modeling concepts, and some exposure to Snowflake.
  • Familiarity with Tableau or Cognos reporting use cases.
  • Understanding of Agile methodologies; working experience preferred.
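
As a hedged sketch of the ETL and data-quality work referenced in the list above, the PySpark job below reads raw CSV from S3, applies basic cleaning and deduplication, and writes partitioned Parquet back to S3 for consumption by tools like Athena. The bucket names, paths, and column names are illustrative assumptions, not details from this posting.

```python
# PySpark ETL sketch: ingest raw CSV from S3, clean and deduplicate,
# then write partitioned Parquet. Buckets and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl_example").getOrCreate()

# Ingest: schema inference kept simple for the sketch; a production job
# would declare an explicit schema.
raw = (
    spark.read
    .option("header", True)
    .csv("s3://example-raw-bucket/orders/")  # hypothetical path
)

# Data quality: drop rows missing key fields, normalize strings, and
# deduplicate on the business key.
clean = (
    raw.dropna(subset=["order_id", "order_date"])
    .withColumn("customer_email", F.lower(F.trim("customer_email")))
    .dropDuplicates(["order_id"])
)

# Consumption: partition by date so downstream engines such as Athena
# can prune partitions efficiently.
(
    clean.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/orders/")  # hypothetical path
)
```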

This role involves designing, developing, and optimizing data solutions in collaboration with various teams while leveraging a wide range of technical skills and tools.
