GCP Data Engineer

Company: HTC Global Services

Location: Dearborn, MI 48126

Description:

HTC Global Services wants you. Come build new things with us and advance your career. At HTC Global you'll collaborate with experts. You'll join successful teams contributing to our clients' success. You'll work side by side with our clients and have long-term opportunities to grow with the latest emerging technologies.

At HTC Global Services our consultants have access to a comprehensive benefits package. Benefits can include Paid Time Off, Paid Holidays, 401K matching, Life and Accidental Death Insurance, Short and Long Term Disability Insurance, and a variety of other perks.

Job Description:

Position Description: We're seeking an experienced GCP Data Engineer who can build a cloud analytics platform to meet ever-expanding business requirements with speed and quality using lean Agile practices. You will analyze and manipulate large datasets supporting the enterprise by activating data assets to support Enabling Platforms and Analytics in the Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications to GCP. Experience with large-scale solution delivery and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

You will:
  • Work in a collaborative environment, including pairing and mobbing with other cross-functional engineers
  • Work on a small agile team to deliver working, tested software
  • Work effectively with fellow data engineers, product owners, data champions and other technical experts
  • Demonstrate technical knowledge/leadership skills and advocate for technical excellence
  • Develop exceptional analytics data products using streaming and batch ingestion patterns in the Google Cloud Platform, grounded in solid data warehouse principles
  • Be the Subject Matter Expert in Data Engineering and GCP tools and technologies

Skills Required:
  • Experience in working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
  • Experience in implementing methods to automate all parts of the pipeline to minimize labor in development and production
  • Experience in analyzing complex data, organizing raw data and integrating massive datasets from multiple data sources to build subject areas and reusable data products
  • Experience in working with architects to evaluate and productionize appropriate GCP tools for data ingestion, integration, presentation, and reporting
  • Experience in working with all stakeholders to formulate business problems as technical data requirements, and to identify and implement technical solutions while ensuring key business drivers are captured in collaboration with product management
Skills Preferred:
  • Strong drive for results and ability to multi-task and work independently
  • Self-starter with proven innovation skills
  • Ability to communicate and work with cross-functional teams and all levels of management
  • Demonstrated commitment to quality and project timing
  • Demonstrated ability to document complex systems
  • Experience in creating and executing detailed test plans
Experience Required:
  • In-depth understanding of Google's product technology (or other cloud platform) and underlying architectures
  • 5+ years of analytics application development experience
  • 5+ years of SQL development experience
  • 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale
  • Experience working in GCP-based Big Data deployments (batch/real-time) leveraging Terraform, BigQuery, Google Cloud Storage, Pub/Sub, Dataflow, Dataproc, Airflow, etc.
  • 2+ years of professional development experience in Java or Python, and Apache Beam
  • Extracting, loading, transforming, cleaning, and validating data
  • Designing pipelines and architectures for data processing
  • 1+ year of designing and building CI/CD pipelines
Experience Preferred:
  • Experience building Machine Learning solutions using TensorFlow, BigQuery ML, AutoML, and Vertex AI
  • Experience in building solution architectures, provisioning infrastructure, and delivering secure and reliable data-centric services and applications in GCP
  • Experience with Dataplex or Informatica EDC
  • Experience with development ecosystems such as Git, Jenkins, and CI/CD
  • Exceptional problem solving and communication skills
  • Experience in working with DBT/Dataform
  • Experience in working with Agile and Lean methodologies
  • Team player with attention to detail
  • Performance tuning experience
Education Required:
  • Bachelor's degree in computer science or related scientific field
Education Preferred:
  • GCP Professional Data Engineer Certified
  • Master's degree in computer science or related field
  • 2+ years mentoring engineers
  • In-depth software engineering knowledge

Our success as a company is built on practicing inclusion and embracing diversity. HTC Global Services is committed to providing a work environment free from discrimination and harassment, where all employees are treated with respect and dignity. Together we work to create and maintain an environment where everyone feels valued, included, and respected. At HTC Global Services, our differences are embraced and celebrated. HTC is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills, and experiences within our workforce. HTC is proud to be recognized as a National Minority Supplier.

#LI-GL1 #LI-Hybrid
