Data Engineer - Remote / Telecommute

Company: Cynet Systems

Location: Minneapolis, MN 55407

Job Description:

Required Skills:
  • Drives successful solution adoption and implementation for portfolios of medium to high complexity.
  • Works on several large and enterprise-wide projects.
  • Participates in strategy design and leads initiatives.
  • Designs solutions for large-scale initiatives.
  • Has intermediate to advanced skills in Python, PyTorch, TensorFlow, and other deep learning frameworks.
  • Is an expert in working with large databases, BI applications, data quality, and performance tuning.
  • Has expert knowledge of developing end-to-end business intelligence solutions: data modeling, ETL, and reporting.
  • Has a deep understanding of data gathering, inspection, cleansing, and transformation, as well as data modeling and diagramming techniques.
  • Has a deep understanding of and experience with microservices architecture.
  • May act as an escalation point for others.
  • Has outstanding written and verbal communication skills.
  • Identifies and drives process improvement.
  • Is responsible for improving availability, security, compliance, interoperability, and performance, and for reengineering activities.
  • Grows into the role of a recognized subject matter expert in one or more functions.
  • Has excellent communication skills and the ability to work both individually and as part of a broader, geographically dispersed team.
Qualifications:
  • Bachelor's degree required; Master's preferred.
  • 5-8 years of relevant experience required, including technology solutions such as Java, Big Data technologies, and data management tools.
  • 8+ years preferred.
  • 5+ years of DataOps and DevOps experience building solutions using Hadoop technologies (Pig, Spark, Kafka) and Python, and building version-controlled CI/CD pipelines using tools such as Jenkins and GitHub.
  • 3-5 years' experience in the design, development, and hands-on implementation of Google Cloud and AWS solutions.
  • 5+ years with relational database concepts and solid knowledge of star schemas, Oracle, SQL, PL/SQL, SQL tuning, OLAP, Big Data technologies, Snowflake, and Apache NiFi.
  • 5+ years' experience building data pipelines in a data lake setup.
  • 3 years of architecting, designing, and implementing enterprise-scope projects/products and data management.
  • 3-4 years of experience with Cerner, Epic, and Lawson systems.
  • 5 years of secure data engineering and scripting (Shell, Python).
  • 5 years' experience with healthcare data, building clinical and non-clinical solutions that drive patient outcomes.
