Senior Data Engineer
Company: Compunnel Software Group
Location: Bentonville, AR 72712
Description:
Job Summary
We are seeking a highly skilled Senior Data Engineer with extensive experience in GCP, PySpark, and Scala to design and develop big data applications. This role requires expertise in building scalable data pipelines, managing distributed data platforms, and leading Agile teams. The ideal candidate will be responsible for technical leadership, automation, workflow orchestration, and data architecture optimization.
Key Responsibilities
" Design and develop big data applications using open-source technologies.
" Lead data engineering efforts in an offshore model with a managed outcome approach.
" Develop logical and physical data models for big data platforms.
" Automate workflows using Apache Airflow.
" Create data pipelines using Apache Hive, Apache Spark, and Apache Kafka.
" Provide ongoing maintenance and enhancements to existing systems.
" Participate in rotational on-call support.
" Mentor junior engineers and provide technical guidance.
" Lead daily standups and design reviews.
" Groom and prioritize backlog using JIRA.
" Act as the point of contact for assigned business domains.
" Ensure adherence to coding standards, performance optimization, and documentation best practices.
" Conduct root cause analysis (RCA) and mitigate defects.
" Engage with customers, present design options, and conduct product demos.
" Manage complex user stories and estimate resource efforts.
" Monitor and optimize project efficiency, cost, and quality.
Required Qualifications
" 12+ years of hands-on experience in data engineering and data warehouse solutions.
" 10+ years of experience as a Scrum Master.
" 6+ years of experience with Hadoop, Hive, Spark, Airflow, or similar workflow orchestration tools.
" 5+ years of experience with GCP (Dataproc, GCS, BigQuery).
" 5+ years of experience in PySpark and Scala development.
" 5+ years of experience in schema modeling and design for data lakes or RDBMS.
" Experience with programming languages: Python, Java, Scala.
" Experience with scripting languages: Perl, Shell.
" Strong background in Agile/Scrum methodologies.
" Experience with test-driven development and automated testing frameworks.
" Hands-on experience working with large datasets (multi-TB/PB scale).
" Experience with Gitflow, BitBucket, JIRA, and Confluence.
" Experience with CI/CD tools like Bamboo, Jenkins, or TFS.
" Excellent communication skills and ability to manage multiple priorities.
" Bachelor's degree in Computer Science or equivalent experience.
Location: 805 SE Moberly Lane, Bentonville, AR
Education: Bachelor's Degree