Hadoop Developer
Company: Incedo
Location: Wilson, NC 27893
Description:
Company Description
Incedo (http://www.incedoinc.com/) (formerly a part of $4Bn Group) is a technology solutions and services organization headquartered in the Bay Area, USA with workforce across North America, South Africa and India (Gurgaon, Bangalore). We specialize in Data & Analytics and Product Engineering Services, with deep expertise in Financial Services, Life Science and Communication Engineering. Our key focus is on Emerging Technologies and Innovation. Our end-to-end capabilities span across Application Services, Infrastructure and Operations.
Job Description
Position : Hadoop Consultant
Location : Raleigh, NC
Duration: Contract/Full Time
Job Description:
Bachelors (IT/Computer Science Preferred); Master's Degree preferred (IT/Computer Science Preferred) or equivalent experience
8-10 years of industry experience analyzing source system data and data flows, working with structured and unstructured data, and delivering data and solution architecture designs.
Experience with clustered/distributed computing systems such as Hadoop/MapReduce, Spark/SparkR, Lucene/Elasticsearch, Storm, Cassandra, and graph databases, as well as analytics notebooks such as Jupyter and Zeppelin.
Experience building data pipelines for structured/unstructured, real-time/batch, and event-driven/synchronous/asynchronous workloads using MQ, Kafka, and stream processing.
5+ years of experience as a data engineer/solution architect designing and delivering large-scale distributed software systems, preferably in a large global business, using open-source tools and big data technologies such as Cassandra, Hadoop, Hive, Presto, Impala, HBase, Spark, Storm, Redis, and Drill.
Strong hands-on programming experience with Java, Python, Scala, etc.
Experience with SQL, NoSQL, relational database design, and methods for efficiently retrieving data for time-series analytics.
Knowledge of data warehousing best practices, modeling techniques and processes, and complex data integration pipelines.
Experience gathering and processing raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.)
Excellent technical skills; able to effectively lead a technical team of business intelligence and big data developers as well as data analysts.
Optional Skills:
Experience in Machine Learning, Deep Learning, Data Science
Experience with cloud architecture and services such as AWS and Azure
Experience with Graph, Semantic Web, and RDF technologies
Experience with Text Analytics using SOLR or ElasticSearch
Qualifications
Additional Information
All your information will be kept confidential according to EEO guidelines.