FULL TIME --- Big Data Engineer with Java and Hive
Company: Centraprise
Location: Dallas, TX 75217
Description:
We are looking for a Senior Big Data Engineer with Java and Hive. Please find the details below:
Job Title: Senior Big Data Engineer with Java and Hive
Relevant Experience Required (in Yrs): 8+ years
Key words to search in resume: Senior Big Data Engineer with
- Java, Hive - Primary skills (must have)
- Azure, SQL, DevOps, and Unix scripting - Secondary skills (should have)
- Kafka, Postgres - Good to have
Technical/Functional Skills (Must-Have Skills)
- Bachelor's Degree in Computer Science, Information Systems or related discipline required, or equivalent work or educational experience.
- Hands-on development experience in Java, Hive, Big Data, and ETL technologies
- Hands-on experience with cloud technologies, especially Azure
- Strong scripting skills in ksh/bash (Unix shell scripting), PowerShell, SQL, etc.
- Experience in agile development (must have worked in SAFe, Scrum, or Kanban frameworks) and DevOps
- Hands-on expertise in and knowledge of SQL-based technologies
- Must be proficient with version control repositories such as GitHub
- Hands-on experience analyzing, designing (selecting the tech stack), and building data pipelines
- Must be able to optimize the performance of existing data pipelines by identifying and resolving bottlenecks
- 8+ years of hands-on experience demonstrating depth and breadth across key engineering competencies, e.g. software development, testing, DevOps, lifecycle management, etc.
- Knowledge of orchestration tools such as Apache Airflow for building data pipelines is a big plus
- Hands-on experience applying design thinking principles or DataOps ways of working to achieve self-service maturity in developing data pipelines
- Experience with Technical Documentation
- Experience mentoring engineers and providing direction to offshore teams to achieve desired results
- Good knowledge of testing strategies and implementation; must be able to exhaustively test what has been developed/configured and deployed
- Proactive behavioral skills and sharp attention to detail
- Proficient verbal and written communication skills in English
- Should be able to work with multiple stakeholders to coordinate, resolve ambiguity and drive results, rather than waiting for instructions
- Must be a subject matter expert in data pipeline analysis, design, and engineering, and should be able to work unsupervised
- Good to have: participation in hackathons, or self-created assets/improvements/innovations for the organization
Roles & Responsibilities
- Analyze, design, and build data pipelines to orchestrate the movement, transformation, validation, and loading of data from source to destination
- Hands-on expertise in Unix shell scripting, Unix commands, and Linux-based infrastructure
- Must be able to rapidly troubleshoot and identify problem areas when technical issues arise in data pipelines
- Provides leadership, technical direction and expertise to global product teams. May provide technical guidance to several product teams at once.
- Accountable for working as a Big Data Engineer in a multi-cloud architecture (on-prem, Azure, GCP)
- Must be able to optimize the performance of existing data pipelines by identifying and resolving bottlenecks
- Hands-on experience with database systems
- Hands-on experience with ETL tools
- Experience working with data APIs
- Knowledge of algorithms and data structures
- Knowledge of working in an Equalum environment is a plus
- Knowledge of big data frameworks and big data querying tools
- Proactively define engineering standards and checklists for adherence within the team
- Experience with data integration mechanisms
- Accountable for the continuous delivery of strategic, product roadmap-aligned technology solutions, applying leading engineering practices, adopting agile and DevOps principles, and ensuring technical delivery is fully compliant with customer security, quality, and regulatory standards
- Partners with the Enterprise Architecture team to understand and influence the Target Architecture strategy; ensures product solutions are aligned to the Target Architecture
- Accountable for elevating competency across product engineers; coaches and mentors engineers at varying levels of maturity to embed innovation best practices as a core engineering competency. Depending on team structure, the incumbent may matrix- or line-manage a team
- Provides thought leadership in driving technical improvements and optimization of development (tools, process, automation); proactively engages in experimentation and innovation to drive relentless improvement
- Accountable for ensuring all technical documentation is up to date in support of the lifecycle plan for audits/reviews
Work Location (City, State)
Dallas, TX