Java/Big Data Engineer
Company: Compunnel Software Group
Location: Costa Mesa, CA 92627
Description:
Job Summary
We are seeking a highly skilled Java/Big Data Engineer to design, develop, and support scalable data solutions on the AWS cloud platform.
The ideal candidate will have hands-on experience developing API services, managing big data workloads, and driving performance optimization on distributed systems, along with strong expertise in Java, Spark, and DevOps practices.
Key Responsibilities
- Design and develop scalable API services on AWS using Java, Scala, or Kotlin.
- Build and optimize data pipelines and applications using PySpark and EMR on the AWS ecosystem (see the pipeline sketch after this list).
- Manage distributed data storage technologies such as Cassandra, ScyllaDB, and DynamoDB.
- Implement orchestration using tools like Apache Airflow and AWS Step Functions.
- Conduct performance tuning on Spark, Hadoop, EMR, and Python-based applications.
- Develop and maintain CI/CD pipelines using Jenkins, Artifactory, AWS CodeCommit, and CloudFormation.
- Write and maintain shell scripts for automation and operational tasks on Linux environments.
- Apply DevOps practices for infrastructure, deployment, and monitoring.
- Take ownership of components and drive delivery with a bias for action.
- Collaborate with cross-functional teams to develop and deploy high-scale data applications.
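As a rough illustration of the pipeline work described in the list above, here is a minimal PySpark sketch of the kind of job an EMR cluster might run. The application name, S3 bucket paths, and column names are hypothetical placeholders, not details of this role's actual systems.

```python
# Minimal PySpark job sketch (hypothetical paths and columns): read raw events
# from S3, roll them up per user per day, and write the result back to S3.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def main():
    spark = SparkSession.builder.appName("daily-events-rollup").getOrCreate()

    # Read raw event data from S3 (placeholder path).
    events = spark.read.parquet("s3://example-bucket/raw/events/")

    # Aggregate event counts per user per day.
    daily = (
        events
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("user_id", "event_date")
        .agg(F.count("*").alias("event_count"))
    )

    # Write the rollup back to S3, partitioned by date (placeholder path).
    daily.write.mode("overwrite").partitionBy("event_date").parquet(
        "s3://example-bucket/curated/daily_event_counts/"
    )

    spark.stop()


if __name__ == "__main__":
    main()
```

In practice a job like this would typically be packaged and submitted as an EMR step via spark-submit, or triggered by an orchestrator such as Airflow or Step Functions.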
Required Qualifications
- Strong experience in Java/Scala/Kotlin for building APIs and backend services.
- Proficiency in big data technologies including Spark, EMR, and Hadoop.
- Deep understanding of AWS services including EC2, ECS, EMR, Step Functions, and DynamoDB.
- Strong knowledge of data messaging systems such as Kafka (see the consumer sketch after this list).
- Experience with Linux environments and shell scripting.
- Solid DevOps skills, including infrastructure automation and continuous integration/deployment.
- Experience designing and developing large-scale distributed applications.
- Strong problem-solving skills and a proactive approach to technical challenges.
- Demonstrated ownership and accountability in previous projects.
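To give a flavor of the Kafka experience listed above, here is a small consumer sketch using the kafka-python client. The topic name, broker address, consumer group, and record fields are assumptions for illustration only.

```python
# Minimal Kafka consumer sketch (hypothetical topic, broker, and fields)
# using the kafka-python client.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                # hypothetical topic
    bootstrap_servers=["localhost:9092"],    # placeholder broker address
    group_id="orders-processor",             # hypothetical consumer group
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    order = message.value
    # Process each record; real handling would write to a data store
    # rather than print.
    print(order.get("order_id"), order.get("amount"))
```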
Preferred Qualifications
- Experience with Airflow or similar workflow orchestration tools (see the DAG sketch after this list).
- Familiarity with CI/CD tools such as Jenkins, AWS CodePipeline, and Artifactory.
- Experience with containerization and orchestration using Docker and ECS/EKS.
- Understanding of modern development practices and cloud-native application patterns.
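To illustrate the orchestration experience referenced above, below is a minimal Airflow DAG sketch. The DAG id, schedule, and spark-submit command are hypothetical; a production setup might use an EMR or Spark operator instead of a plain bash call.

```python
# Minimal Airflow DAG sketch (hypothetical DAG id, schedule, and command)
# that runs one daily task submitting a Spark job.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_events_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit the (hypothetical) PySpark job shown earlier in this posting.
    run_rollup = BashOperator(
        task_id="run_rollup",
        bash_command="spark-submit daily_events_rollup.py",
    )
```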
Education: Bachelor's Degree