Data Engineer
Company: CIBER
Location: Troy, MI 48085
Description:
HTC Global Services wants you. Come build new things with us and advance your career. At HTC Global you'll collaborate with experts. You'll join successful teams contributing to our clients' success. You'll work side by side with our clients and have long-term opportunities to advance your career with the latest emerging technologies.
At HTC Global Services our consultants have access to a comprehensive benefits package. Benefits can include Paid-Time-Off, Paid Holidays, 401K matching, Life and Accidental Death Insurance, Short & Long Term Disability Insurance, and a variety of other perks.
Job Description:
HTC Global Services is seeking a senior Data Engineer experienced in building data movement and data curation processes using modern toolsets, including Snowflake, Python, YAML, and Azure Data Factory. This is NOT a traditional data modeling and batch ETL processing role. This role will primarily involve:
- Designing, coding, and testing Python scripts to deliver prescribed business outcomes.
- Conducting detailed data analysis in response to both trouble tickets and new development.
- Providing real-time technical support during U.S. business hours.
The varied, non-traditional mix of technologies and processes at play will require the successful candidate to be a fast learner and an experienced problem-solver.
The core project involves acquiring and transforming data from various source systems and curating that data for consumption by a customer-facing portal. While this is primarily an individual contributor role, the role will also serve as an onshore coordinator for a larger offshore development team.
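Purely as an illustration of the kind of acquire-and-curate work described above (all record fields, rules, and function names here are hypothetical, not taken from the actual project), a curation step might look like:

```python
# Hypothetical sketch of a data-curation step: take raw records from a
# source system, drop incomplete rows, and reshape fields into the form a
# customer-facing portal expects. Illustrative only.

def curate_shipments(raw_records):
    """Filter out incomplete rows and normalize fields for the portal."""
    curated = []
    for rec in raw_records:
        # Skip records missing the keys the portal requires.
        if not rec.get("shipment_id") or rec.get("status") is None:
            continue
        curated.append({
            "id": rec["shipment_id"],
            "status": rec["status"].strip().upper(),   # normalize casing
            "origin": rec.get("origin", "UNKNOWN"),    # default when absent
        })
    return curated

raw = [
    {"shipment_id": "S100", "status": " in transit ", "origin": "PIT"},
    {"shipment_id": None, "status": "delivered"},    # dropped: no id
    {"shipment_id": "S101", "status": "delivered"},  # origin defaulted
]
rows = curate_shipments(raw)
```

In practice a step like this would sit behind an Azure Data Factory pipeline or a Snowflake load, but the filter-normalize-reshape pattern is the same.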
Our client is a Pennsylvania-based transportation and logistics company. The role is primarily remote, so candidates may reside anywhere in the United States; be advised, however, that occasional travel (10-15%) may be required as the needs of the project dictate.
Required People Skills:
- Professional, confident, and cordial demeanor.
- Excellent communication skills - both oral and written.
- Proven ability to operate in dynamic environments and still deliver quality outcomes on a timely basis.
- Self-motivated, proactive, efficient, and highly organized.
- Strong desire and proven ability to learn new things quickly.
- Ability to work independently, with a high level of performance, with minimal direct supervision.
- Ability to smoothly collaborate with client and team resources to ensure results meet the defined business needs and exceed client expectations.
- Experience coordinating the efforts of geographically distributed team members.
- Ability to travel occasionally (10-15%) as required.
Required Technical Skills:
- 5+ years of experience implementing, enhancing, or supporting data solutions in support of downstream applications.
- Proven ability to write, refine, and troubleshoot highly efficient data manipulation code using Python, PySpark, and YAML.
- Proven experience working in a technical support role - particularly for web applications and their underlying databases.
- Direct experience working with Snowflake databases.
- Ability to create and manage pipelines in Azure Data Factory.
- Strong problem-solving and troubleshooting skills - especially the ability to identify and resolve processing issues in near real time.
- Strong communication skills - particularly in terms of interacting with business users and teammates to quickly analyze and resolve both production issues and novel development challenges.
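The Python-plus-YAML combination listed above typically means a configuration file of field mappings driving generic transform code. A minimal stdlib-only sketch (a real project would parse the config with a YAML library such as PyYAML; the mapping and field names are hypothetical):

```python
# Sketch of config-driven data manipulation: a flat "source: target"
# mapping (YAML-like, parsed naively here; real code would use a YAML
# library) renames incoming fields. Everything below is illustrative.

CONFIG = """\
cust_nm: customer_name
ord_dt: order_date
"""

def parse_flat_mapping(text):
    """Parse 'key: value' lines into a dict (simplified YAML subset)."""
    mapping = {}
    for line in text.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            mapping[key.strip()] = value.strip()
    return mapping

def rename_fields(record, mapping):
    """Rename keys found in the mapping; pass others through unchanged."""
    return {mapping.get(k, k): v for k, v in record.items()}

mapping = parse_flat_mapping(CONFIG)
row = rename_fields(
    {"cust_nm": "Acme", "ord_dt": "2024-01-05", "qty": 3},
    mapping,
)
```

Keeping the mapping in config rather than code is what lets one generic script serve many source systems, which is the appeal of the pattern in a role like this.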
Value-Add Skills (helpful but not required):
- Direct experience working in the Transportation and Logistics industry.
- Direct hands-on experience working with the Snowpark toolset for data movement and transformation.
- Direct hands-on experience working with Databricks tools and environments.
- Advanced programming skills in C#, PowerShell, Python, or other equivalent modern programming languages.
- Experience working in the Jira Agile project management toolset.
- Relevant certifications in the toolsets of interest.