Database Developer - III
Company: Compunnel Software Group
Location: San Francisco, CA 94112
Description:
Job Summary:
We are looking for a skilled Data Mesh Data Modeler with expertise in Databricks to join our team. The ideal candidate will design and implement scalable, efficient data models within a data mesh architecture, ensuring reliable data processing while collaborating with cross-functional teams. You will leverage Databricks for data engineering tasks ranging from data processing to orchestration, meeting the needs of our data-driven organization.
Job Responsibilities:
- Design and implement scalable, efficient data models within a data mesh architecture, following principles such as domain-driven design and federated data governance.
- Work closely with data architects, engineers, and business stakeholders to translate business requirements into technical solutions.
- Communicate data model designs effectively to technical and non-technical teams.
- Leverage Databricks for data engineering tasks, including data processing, data validation, and data orchestration.
- Optimize data pipelines to ensure high performance, scalability, and efficient data processing.
- Implement data validation rules and quality checks to ensure data integrity and consistency.
- Design, implement, and manage the lifecycle of data products within the Data Mesh architecture.
- Collaborate with other teams to build, manage, and monitor efficient data pipelines and data products.
Required Skills:
- Experience in Data Mesh Architecture: Expertise in modeling data products and understanding the principles of data mesh, domain-driven design, and federated governance.
- Databricks Expertise: Strong hands-on experience with Databricks, including using Spark for large-scale data processing, validation, and orchestration.
- Programming Skills: Proficiency in SQL and Python for data processing, transformation, and validation.
- Data Pipeline Optimization: Experience in designing and optimizing data pipelines to ensure scalability, high performance, and efficient data flow.
- Data Integrity: Implementing robust data validation and quality checks to ensure data consistency and accuracy across systems.
- Collaboration: Strong ability to work closely with cross-functional teams, including business users, data architects, and engineers.
- Problem-Solving: Strong troubleshooting and problem-solving skills to ensure the efficiency and reliability of data systems.
Preferred Skills:
- Experience in designing and managing the lifecycle of Data Products within a Data Mesh architecture.
- Familiarity with data governance practices and tools for managing data quality and compliance.
- Knowledge of cloud platforms like AWS, Azure, or GCP for deploying and managing data solutions.
Certifications:
Databricks Certification or other relevant data engineering certifications are a plus.
Education: Bachelor's Degree