Data Engineer - Locals Only
Company: V2 Innovations
Location: Mountain View, CA 94040
Description:
Major Responsibilities:
- Collaborate on data engineering tasks with internal and external teams, including engineering, product/business, service providers, and third-party vendors across multiple locations.
- Analyze, design, and develop ETL/data-processing pipelines, data management tools, and related processes for the organization.
- Identify processes that can be automated or optimized. Design and develop automated processes such as infrastructure and code deployment. Improve the performance of existing processes by fine-tuning code, redesigning, or adopting alternative methodologies and architectures.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources on the AWS Cloud.
- Use Python and SQL to develop ETL processes (a minimal sketch follows this list).
- Build analytical tools that utilize the data pipeline, providing actionable insight into key business performance metrics, including operational efficiency.
- Research, prototype, and develop solutions for cutting-edge issues, scalability problems, etc.
- Conduct integration and performance tests of the end-to-end (E2E) system, including external dependencies, and improve the system to meet performance and reliability requirements.
- Keep the live production system highly available at all times while expanding its functionality.
- Debug production issues raised by PMs and data analysts to isolate root causes and work toward fixes.
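
For illustration only, the sketch below shows the general shape of the Python-and-SQL ETL work described above: extract a CSV object from S3, apply a transformation, and load the result into a Postgres-compatible warehouse such as Redshift. This is a minimal sketch under stated assumptions; the bucket, key, table, and connection string are hypothetical placeholders, not a description of this role's actual stack.

import csv
import io

import boto3      # AWS SDK for Python
import psycopg2   # driver for Postgres-compatible warehouses (e.g., Redshift)

def extract(bucket: str, key: str) -> list:
    # Pull a CSV object from S3 and parse it into dict rows.
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(body)))

def transform(rows: list) -> list:
    # Keep only completed orders and normalize amounts to integer cents.
    return [
        (r["order_id"], int(round(float(r["amount"]) * 100)))
        for r in rows
        if r.get("status") == "completed"
    ]

def load(records: list, dsn: str) -> None:
    # Load the transformed records into the warehouse with plain SQL.
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO orders_clean (order_id, amount_cents) VALUES (%s, %s)",
            records,
        )

if __name__ == "__main__":
    # Bucket, key, table, and DSN below are hypothetical placeholders.
    rows = extract("example-raw-bucket", "orders/2024-01-01.csv")
    load(transform(rows), "postgresql://user:password@warehouse:5439/analytics")

In practice a job like this would be scheduled and parameterized (for example via AWS Glue or a workflow orchestrator), but the extract/transform/load structure stays the same.
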
Background, Experience & Qualifications:
- Bachelor's or Master's degree in Computer Science or Computer Engineering
- 8+ years of software development experience in a large-scale production environment
- Full stack: strong AWS, Kubernetes, JavaScript/TypeScript/Node.js, E2E, and Python
- Optional: Front-end experience with React.js
- Preferred: Big Data experience with backend development and deployment
- 5+ years of demonstrable experience with data quality, data management tools, and integration platforms
- 2+ years of experience building integrations with upstream and downstream systems via REST APIs (see the sketch after this list)
- Experience with ETL/ELT and data integration development using multiple tools on the AWS cloud platform
- Experience with the DevOps approach: continuous integration, continuous deployment, and monitoring and maintenance of deployments in the AWS cloud, using technologies including Docker, Kubernetes, GitLab CI/CD, Jenkins, Terraform, Helm/Helmfile, Ansible, etc.
- Experience across AWS technologies including EKS, Amazon MSK, RDS, Amazon OpenSearch, ElastiCache, EC2, VPC, ASG, ELB, ECR, S3, Athena, Glue, Cognito, SageMaker, CloudFront, Lambda, CloudWatch, IAM, Redshift, etc.
- Familiarity with Business Intelligence development and tools: Tableau, Domo, Amazon QuickSight, etc.
- Optional: Experience with Google Cloud Platform, the Azure SQL family, PostgreSQL, Snowflake, etc.
- Preferred: Fluency in written and spoken Korean
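
As a rough illustration of the REST integration experience listed above, the sketch below builds a requests session with retry and backoff and fetches one page from a cursor-paginated upstream API. The endpoint, token, and pagination scheme are hypothetical assumptions, not part of this posting.

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def make_session(token: str) -> requests.Session:
    # Retry transient failures (429/5xx) with exponential backoff.
    retry = Retry(
        total=5,
        backoff_factor=0.5,
        status_forcelist=(429, 500, 502, 503, 504),
        allowed_methods=("GET",),
    )
    session = requests.Session()
    session.mount("https://", HTTPAdapter(max_retries=retry))
    session.headers["Authorization"] = f"Bearer {token}"  # hypothetical auth scheme
    return session

def fetch_page(session: requests.Session, cursor: str = "") -> dict:
    # Fetch one page from a (hypothetical) cursor-paginated upstream API.
    resp = session.get(
        "https://api.example.com/v1/records",  # placeholder endpoint
        params={"cursor": cursor} if cursor else None,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

Retry and timeout handling like this is what keeps upstream/downstream integrations reliable when the remote system has transient outages.
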
Necessary Skills / Attributes:
- Strong ability to build and optimize data sets, data pipelines, and architectures
- Ability to build processes that support data transformation, data structures, and metadata (a brief sketch follows this list)
- Strong understanding of Data management and data stewardship roles in an organization
- Strong experience with domain-driven design and working with REST and event-driven microservice architectures in Kubernetes
- Strong ability to debug unfamiliar distributed architectures and isolate problems
- Ability to solve problems in a fast-paced and dynamic environment with a focus on maintaining high quality and standards
- Ability to communicate complex technical matters, in writing (through email and reports) or orally, to an audience of equal or higher technical competency
- Ability to deliver and accept feedback on code and design
- Strong time management and organization skills; ability to work on multiple projects concurrently
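
To make the transformation-and-metadata skill above concrete, here is an illustrative pandas sketch (an assumption; the posting does not name pandas) of a step that cleans a raw extract and returns lightweight metadata that a data-stewardship or data-quality process could log. Column names are hypothetical.

import pandas as pd

def transform_with_metadata(df: pd.DataFrame):
    # Deduplicate and type-cast a raw extract; column names are hypothetical.
    out = (
        df.drop_duplicates(subset=["event_id"])
          .astype({"user_id": "int64"})
          .assign(event_ts=lambda d: pd.to_datetime(d["event_ts"], utc=True))
    )
    # Lightweight metadata a stewardship/data-quality process could log.
    metadata = {
        "rows_in": len(df),
        "rows_out": len(out),
        "duplicates_dropped": len(df) - len(out),
        "schema": {col: str(dtype) for col, dtype in out.dtypes.items()},
    }
    return out, metadata

Emitting row counts and schema alongside each transformation is one simple way a pipeline can support the data management and stewardship roles described above.
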