GCP Developer
Company: ASB Resources
Location: Iselin, NJ 08830
Description:
GCP Microservices + Big Query Developer
At least 2 years of experience with Google Cloud Platform (especially BigQuery and Dataflow); experience with Java, Python, and Google Cloud SDK and API scripting. Experience in the financial domain is an added advantage.
Document the process using multiple architectural models
Experience with microservices architectures and the ability to leverage Google Cloud services.
Experience interpreting customer business needs and translating them into requirements
Create and manage data storage solutions using GCP services such as BigQuery, Cloud Storage, Cloud SQL, Dataflow, Dataproc, and Pub/Sub
Develop and maintain data ingestion and transformation processes using tools like Apache Beam and Apache Spark
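As a hedged sketch of the kind of transformation step this duty describes, here is a plain-Python parse function of the sort that would be wrapped in `beam.Map` inside an Apache Beam pipeline. The field names (`user_id`, `amount`, `currency`) are hypothetical, not from the posting:

```python
import json
from typing import Optional

def parse_event(raw: str) -> Optional[dict]:
    """Parse one raw JSON event into the flat record a pipeline
    might write to BigQuery. Returns None for malformed input so
    the pipeline can route it to a dead-letter output instead of
    failing the whole job."""
    try:
        event = json.loads(raw)
        return {
            "user_id": str(event["user_id"]),
            "amount": float(event["amount"]),
            "currency": event.get("currency", "USD"),
        }
    except (json.JSONDecodeError, KeyError, TypeError, ValueError):
        return None
```

In a Beam pipeline this would typically appear as something like `lines | beam.Map(parse_event) | beam.Filter(lambda r: r is not None)`, keeping the parsing logic itself unit-testable outside the runner.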
Monitor and troubleshoot data pipelines and storage solutions using GCP's Cloud Monitoring (formerly Stackdriver)
Enhance the quality of data insights by implementing automated data validation processes and improving access to data sources
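A minimal sketch of what "automated data validation" could look like in practice, assuming a simple required-column and numeric-column rule set (the rules and column names are illustrative assumptions):

```python
def validate_rows(rows, required, numeric):
    """Split rows into (valid, rejected).

    valid:    list of rows passing all checks
    rejected: list of (row, errors) pairs, where errors describes
              every failed check so bad records can be triaged.
    """
    valid, rejected = [], []
    for row in rows:
        # Required columns must be present and non-empty.
        errors = [f"missing {col}" for col in required
                  if row.get(col) in (None, "")]
        # Numeric columns must parse as floats.
        for col in numeric:
            try:
                float(row.get(col, ""))
            except (TypeError, ValueError):
                errors.append(f"non-numeric {col}")
        if errors:
            rejected.append((row, errors))
        else:
            valid.append(row)
    return valid, rejected
```

A check like this would typically run in the ingestion path, with rejected rows written to a quarantine table for review rather than silently dropped.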
Automate data processing tasks using scripting languages such as Python and Bash
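One common small automation of this kind is reshaping CSV extracts into newline-delimited JSON, the format BigQuery load jobs accept for JSON imports. A self-contained sketch (the conversion itself, without any GCP calls):

```python
import csv
import io
import json

def csv_to_ndjson(csv_text: str) -> str:
    """Convert CSV text (header row first) into newline-delimited
    JSON, one JSON object per input row. All values stay strings,
    as csv.DictReader produces them."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return "\n".join(json.dumps(row) for row in reader)
```

In a real job this function would sit between reading the source file and handing the result to a load job or `gsutil`/`bq` step in a Bash wrapper.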
Participate in code reviews and contribute to the development of best practices for data engineering on GCP
Coordinate and assist with user acceptance testing (UAT) with business stakeholders to ensure the product meets their business needs.
Identify gaps between the current deployment of applications and future requirements.
- Write user stories as part of an Agile project framework that:
  - directly support the primary objectives of the project
  - are appropriately sized for iterative development
  - include clear and specific acceptance criteria
  - account for dependencies on other stories, projects, and initiatives
- Participate in daily Agile/Scrum team meetings