GCP Lead
Company: Omni Inclusive
Location: Chicago, IL 60629
Description:
Primary - GCP, Secondary - DevOps
Roles & Responsibilities
The role requires strong hands-on experience with data ingestion, integration, wrangling, computation, and analytics pipelines using GCP ecosystem components
Leverage expertise in data integration and big data design principles to implement and enhance data integration solutions
Provide design inputs to improve enterprise data pipelines
Experience in the design, development, deployment, and support of data pipelines, including:
Data Ingestion, Integration, Data Quality & Governance
Data Storage and Computation Frameworks
Performance Optimization
Must Haves:
7+ years of overall IT experience with cloud platforms, including exposure to administration
Hands-on experience with GCP services such as Cloud Dataflow, Cloud Pub/Sub, BigQuery, Cloud Storage, and Cloud Composer (see the pipeline sketch after this list)
3+ years of experience in at least one cloud platform (AWS / Azure / GCP) and related data services
Working knowledge of Dataproc architecture and how it differs from other big data frameworks
Experience in performance tuning and query optimization
Strong SQL expertise
Proficiency in shell/Python scripting and Extract, Transform, Load (ETL) methodology
Exposure to DevOps and orchestration tools to automate pipeline execution (see the DAG sketch after this list)
Familiarity with standards and guidelines for the design, development, and deployment of data architectures and data management solutions
Understanding of Git fundamentals and Git workflows; must have used at least one SaaS-based Git tool such as GitHub, GitLab, or Bitbucket
Working knowledge of GCP IAM and data security
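For illustration, here is a minimal sketch of the kind of streaming ingestion pipeline the role involves, written with the Apache Beam Python SDK for Cloud Dataflow. The project, topic, bucket, and table names are hypothetical placeholders, and the target BigQuery table is assumed to already exist.

```python
# Illustrative only: a minimal streaming pipeline in the Apache Beam Python SDK.
# All names (project, topic, bucket, dataset/table) are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        streaming=True,
        runner="DataflowRunner",                  # runs on Cloud Dataflow
        project="example-project",                # hypothetical project ID
        region="us-central1",
        temp_location="gs://example-bucket/tmp",  # hypothetical staging bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            # Ingest: read raw messages from a Pub/Sub topic
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events")
            # Wrangle: decode bytes and parse each message as JSON
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Load: append rows to an existing BigQuery table
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
        )


if __name__ == "__main__":
    run()
```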
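And a companion sketch of orchestrating a pipeline step with Cloud Composer (managed Airflow), assuming the Airflow Google provider package is installed; the DAG ID and query are hypothetical.

```python
# Illustrative only: a minimal Cloud Composer (managed Airflow) DAG that
# schedules a BigQuery job daily. DAG ID and query are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_events_rollup",      # hypothetical DAG ID
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                # Hypothetical rollup query; real SQL would go here
                "query": "SELECT DATE(ts) AS d, COUNT(*) AS n "
                         "FROM analytics.events GROUP BY d",
                "useLegacySql": False,
            }
        },
    )
```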
Good to Haves:
Experience in Java
Experience automating service deployments and configuration updates
Experience optimizing existing cloud-based services running in production
Awareness of data governance processes (security, lineage, catalog) and tools like Collibra
Exposure to CI/CD: cloud infrastructure provisioning, automated build and deployment pipelines, and code quality checks
Valid professional certification in a cloud platform (AWS / Azure / GCP)
Exposure to admin activities
Proficiency on the Linux command line with tools such as Bash, SSH, and vim
Experience in Python for scripting repetitive tasks (see the automation sketch after this list)
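As an illustration of the scripting item above, a minimal sketch that automates one repetitive admin task (pruning stale staging objects) with the google-cloud-storage client; the bucket name, prefix, and retention window are hypothetical.

```python
# Illustrative only: automating a repetitive admin task with the
# google-cloud-storage client. Bucket name, prefix, and retention
# window are hypothetical.
from datetime import datetime, timedelta, timezone

from google.cloud import storage

RETENTION_DAYS = 30  # hypothetical retention policy


def delete_stale_objects(bucket_name: str) -> int:
    """Delete objects under tmp/ older than RETENTION_DAYS; return count."""
    client = storage.Client()
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    deleted = 0
    for blob in client.list_blobs(bucket_name, prefix="tmp/"):
        if blob.time_created < cutoff:
            blob.delete()
            deleted += 1
    return deleted


if __name__ == "__main__":
    print(delete_stale_objects("example-etl-scratch"))  # hypothetical bucket
```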
Personal Attributes:
Excellent verbal and written communication and interpersonal skills
Confidence and agility in challenging times
Ability to work collaboratively with cross-functional teams in a fast-paced environment
Self-starter who requires minimal oversight
Ability to prioritize and manage multiple tasks
Process orientation and the ability to define and set up processes