Big Data Engineer
Company: Futran Tech Solutions Pvt. Ltd.
Location: Minneapolis, MN 55407
Description:
Big Data Engineer
Minneapolis, MN
Long Term
Job description:
Responsible for designing, developing, testing, operating, and maintaining ETL pipelines and processes using Informatica. Expertise in SQL-based products (Oracle, SQL Server) and big data tools (HDFS, Hive, Spark, Scala). Expertise in Unix, shell scripting, and scheduling tools. Takes end-to-end ownership by consistently writing production-ready, testable code. Develops high-quality code, defines engineering best practices, and performs peer code reviews to ensure successful deliverables with engineering excellence.
Accountable for ensuring all aspects of product development follow compliance and security best practices.
Skills required:
3+ years of experience with ETL tools, preferably Informatica Big Data Edition.
3+ years of experience with relational databases, HDFS, and Hive.
Experience managing large data sets.
Excellent SQL and data warehousing skills.
Experience with Unix and shell scripting.
Knowledge of working with mainframe sources is preferable.
Knowledge of Spark, Scala, and other programming languages.
Knowledge of Airflow, Google Cloud, and data quality.
Familiarity with one of the core cloud provider services, preferably Google Cloud.
Excellent verbal and written communication skills.
Ability to work independently with minimal guidance.