Senior Data Engineer
Company: CodeForce 360
Location: Durham, NC 27713
Description:
Career Opportunity:
Job Title: Senior Data Engineer
About CodeForce 360
Choosing a career is among the most important decisions a person makes, and it should be weighed against factors such as a company's track record since its inception. But when you come across a company whose reputation rests on an unbroken run of success since the day it began, the decision becomes easy. That is precisely what our employees and prospective candidates find when they come across CodeForce 360.
Position Overview
Senior Data Engineer
Requirements:
- The main focus will be working with architects and other developers to design, develop, and deploy the new platform supporting reporting and analytics needs.
- The work will also include an element of data architecture and governance, for example tracing data lineage and finding suitable methods to catalog each attribute defined in the system and document its meaning.
- Lastly, the role will involve an element of initial support and operational handover, especially during the first release.
Role Requirements:
- Hands on design and development.
- Ability to work independently or in small teams to design and implement Hadoop-based data extraction, transformation, and processing for load and presentation into visualization and reporting tools.
- The solution will ingest data from traditional databases and data warehouses (predominantly Oracle), flat files, web services, and MongoDB sources; it will process, transform, and augment the data, structuring it into views suitable for integration with visualization and reporting tools (e.g., Tableau, OBIEE).
- Be highly proficient in the languages, tools, and frameworks relevant to Hadoop implementations, e.g.: Hadoop, HDFS, Sqoop, Spark, Presto, Oozie, Hive.
- Notebook platforms.
- Scala, Python, or Java (and the specifics of their integration with the Hadoop libraries and components).
- Amazon Web Services / EMR architecture and operation.
- Additionally, experience and knowledge of key data integration points, as well as technologies present in the broader client environment, is helpful:
- Oracle, SQL, MongoDB.
- RHEL Linux, basic shell scripting, operating AWS instances and environments.
- ETL and data warehouse design.
- Public/hybrid cloud platforms and practices (ideally AWS).
- An understanding of client reporting tools (Tableau, Oracle OBIEE).
- DevOps and deployment models.
- Good communication and collaboration skills.
- A self-starter who can lead and take ownership.
- Experience in agile environments and accustomed to iterative delivery.
- Flexible and able to work across a mix of modern and legacy systems.
The Value You Deliver:
- Understanding business needs & processes, crafting, detailing, developing, and implementing Big Data applications in a dynamic financial services environment.
- Assessing industry standards for data models and incorporating new advances into internal architecture standards.
- Taking ownership of the technology work and providing the leadership needed to complete projects of the highest quality, on time and within budget.
- Sharing written and spoken information in a way that is understood by those who need it.
- Performing complex data warehouse development.
- Monitoring progress, work quality and risk of the projects; calling out situations needing management attention.
- Investigating and resolving issues in support of third-line production support.
- Contributing to and writing concise, clear technical systems documents based on the analysis of complex business requirements.
- Adhering to technical standards, policies, and procedures.
Required Skills:
Hadoop, Scala, Python.
How to Apply
Job ID: JPC - 56079
For more information, please contact:
Eshwar Duppalli
470-280-7243
eshward@codeforce.com
Qualified individuals will be contacted for an interview.