Data Engineer - Senior
Company: CIBER
Location: Dearborn, MI 48126
Description:
HTC Global Services wants you. Come build new things with us and advance your career. At HTC Global you'll collaborate with experts and join successful teams contributing to our clients' success. You'll work side by side with our clients and have long-term opportunities to advance your career with the latest emerging technologies.
At HTC Global Services our consultants have access to a comprehensive benefits package. Benefits can include Paid-Time-Off, Paid Holidays, 401K matching, Life and Accidental Death Insurance, Short & Long Term Disability Insurance, and a variety of other perks.
Position Responsibilities:
- Design, develop, and implement data engineering solutions using standards, templates, patterns, and best practices on Google Cloud Platform.
- Deliver full-stack data engineering solutions utilizing both structured and unstructured data: development, ingestion, curation, implementation, deployment, automation, and monitoring.
- Collaborate with the Data Factory engineering organization, Data Architecture, Information Technology, and data consumers to drive data engineering capabilities, product design, proofs of concept, and MVPs that expand understanding, define technical optimizations, explore configurations, and overcome challenges.
- Create high-quality, elegant data engineering solutions that focus on cloud-first design, encapsulation, repeatability, automation, and auditability.
- Work as an individual contributor and part of a team to build, test, maintain and troubleshoot data solutions.
- Continuously integrate and deploy data solutions via CI/CD.
- Use test-driven development and pair-programming practices.
Required Skills:
- Bachelor's degree in a technical field such as Computer Science, Data Science, Computational Finance, Statistics, Economics, or Mathematics; Master's preferred.
- 5+ years of experience in data solution, pipeline, mart, and/or warehouse development and delivery using agile development methodologies.
- Critical thinking skills to propose solutions, test them, and make them a reality.
- 2+ years of experience in a data engineering competency on a public cloud (GCP, Microsoft Azure, or AWS).
- 2+ years of integration and configuration scripting with Tekton and Terraform.
- 3+ years of experience writing complex SQL.
- 3+ years of experience writing data solutions in Python, Java, Scala or Go.
- Deep understanding of data service ecosystems including data warehouses, lakes, metadata, meshes, fabrics and analytical use cases.
- User experience advocacy through empathetic stakeholder relationships.
- Excellent verbal and written communication skills, with both internal and external team members.
Desired Skills:
- 2+ years of extensive knowledge and understanding of GCP offerings and bundled services, especially those associated with data operations: Cloud Console, BigQuery, Dataflow, Data Fusion, Pub/Sub / Kafka, Looker Studio, and Vertex AI.
- Experience with Teradata, Hadoop, Hive, Spark and other on-premise/legacy data solutions.
- Experience optimizing data solutions and data science/analytical workflows: re-coding, re-developing, and re-factoring.
- Data governance concepts including GDPR (General Data Protection Regulation), CCPA (California Consumer Privacy Act), and PoLP (principle of least privilege), and how these can impact technical architecture.