Sr. Hadoop Developer
Company: Bridge Tech
Location: Beaverton, OR 97007
Job Description
Typically requires a Bachelor's degree and a minimum of 5 years of directly relevant work experience.
The client is building out a big data platform in Consumer Digital on a Hadoop Distributed File System (HDFS) cluster. As a Sr. Hadoop Developer you will work with a variety of talented client teammates and be a driving force for building solutions. You will work on development projects related to commerce and web analytics.
Responsibilities:
Design and implement MapReduce jobs to support distributed processing using Java, Cascading, Python, Hive, and Pig; design and implement end-to-end solutions
Build libraries, user defined functions, and frameworks around Hadoop
Research, evaluate, and utilize new technologies/tools/frameworks in the Hadoop ecosystem
Develop user-defined functions to provide custom Hive and Pig capabilities
Define and build data acquisition and consumption strategies
Define & develop best practices
Work with support teams in resolving operational & performance issues
Work with architecture/engineering leads and other teams on capacity planning
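For candidates unfamiliar with the MapReduce pattern central to the responsibilities above, the classic word-count example gives the flavor. This is an illustrative sketch only, not tied to any client codebase: in a real Hadoop job the map, shuffle, and reduce phases run distributed across the cluster (via Java Mapper/Reducer classes or Hadoop Streaming), whereas here they are simulated locally in plain Python.

```python
from collections import defaultdict

def mapper(line):
    # Map phase: emit an intermediate (word, 1) pair for every word.
    for word in line.split():
        yield word.lower(), 1

def reducer(word, counts):
    # Reduce phase: aggregate all counts emitted for a single key.
    return word, sum(counts)

def run_job(lines):
    # Shuffle phase: group intermediate values by key before reducing.
    grouped = defaultdict(list)
    for line in lines:
        for word, count in mapper(line):
            grouped[word].append(count)
    return dict(reducer(w, c) for w, c in grouped.items())

if __name__ == "__main__":
    print(run_job(["hello hadoop", "hello hive"]))
    # → {'hello': 2, 'hadoop': 1, 'hive': 1}
```

Hive and Pig express the same computation declaratively (a `GROUP BY` with `COUNT`), which is why the role pairs raw MapReduce skills with those higher-level tools.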
Qualifications:
MS/BS degree in a computer science field or related discipline
6+ years' experience in large-scale software development
1+ year experience in Hadoop
Strong Java programming, shell scripting, Python, and SQL skills
Strong development skills around Hadoop, MapReduce, Hive, Pig, Impala
Strong understanding of Hadoop internals
Good understanding of Avro, JSON, and other serialization and compression formats
Experience with build tools such as Maven
Experience with databases such as Oracle
Experience with performance/scalability tuning, algorithms and computational complexity
Experience (at least familiarity) with data warehousing, dimensional modeling and ETL development
Ability to understand ERDs and relational database schemas
Proven ability to work with cross-functional teams to deliver appropriate resolutions
Nice to have:
Experience with open-source NoSQL technologies such as HBase and Cassandra
Experience with messaging & complex event processing systems such as Kafka and Storm
Experience with machine learning frameworks
Statistical analysis with Python, R, or similar
Additional Information
All your information will be kept confidential according to EEO guidelines.