Data Engineer - AWS Solution Architect
Company: Tephra Inc.
Location: Malvern, PA 19355
Description:
Experience and Qualifications:
2+ years of hands-on experience designing and deploying AWS-based applications (native or refactored)
5+ years of experience with a high-level programming language (Java, Node.js, or Python)
Expertise in core AWS services, their uses, automation, and architecture best practices
Proficiency in designing, developing, and deploying cloud-based solutions using AWS
Experience developing and maintaining applications built on Amazon Simple Storage Service (S3), Amazon Simple Queue Service (SQS), Amazon Simple Notification Service (SNS), Amazon Simple Workflow Service (SWF), Amazon API Gateway, AWS Elastic Beanstalk, and AWS CloudFormation
Proficiency with Amazon compute and storage instances (e.g., EC2 and EBS)
Experience with S3 server-side encryption, IAM roles and policies, CloudTrail, and CloudWatch
Experience with EMR and Lambda serverless architecture
Experience setting up Kinesis streams and integrating them with CDC tools (Attunity preferred); a minimal setup sketch follows this list
Experience with relational databases such as MySQL, and with Amazon RDS concepts
Proficiency in highly available, fault-tolerant, and disaster recovery (DR) architectures
Good working knowledge of MPP and NoSQL databases such as DynamoDB
Hands-on experience creating VPCs, security groups, network ACLs (NACLs), and Route 53 records
Experience with DevOps CI/CD using Jenkins, Bamboo, or AWS CodeDeploy
AWS Developer or Solutions Architect certification is a plus, but not required
Experience with the Atlassian stack highly preferred
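
For illustration, a minimal sketch of the Kinesis setup referenced above, using boto3. The region, stream name, shard count, and sample record are assumptions for this sketch, not values from the posting.

    # Minimal sketch: create a Kinesis stream to receive CDC events
    # (e.g., from a tool such as Attunity). Names are hypothetical.
    import boto3

    kinesis = boto3.client("kinesis", region_name="us-east-1")

    # Create the stream and wait until it is ACTIVE before writing.
    kinesis.create_stream(StreamName="cdc-events", ShardCount=2)
    kinesis.get_waiter("stream_exists").wait(StreamName="cdc-events")

    # A CDC producer would then put change records onto the stream.
    kinesis.put_record(
        StreamName="cdc-events",
        Data=b'{"table": "orders", "op": "INSERT", "id": 42}',
        PartitionKey="orders",  # keeps one table's changes ordered per shard
    )
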
Job responsibilities:
Design, develop, and deliver scalable, automated data pipelines
Configure Kinesis streams according to established design patterns
Code and enable the data store using DynamoDB and Kinesis
Design schemas and create objects in DynamoDB based on consumption patterns
Leverage IAM roles and policies for service authentication (see the IAM sketch after this list)
Deploy DynamoDB schemas and objects to the cloud
Build transformation logic in Lambda (Python) for target consumption; a sketch follows this list
Promote Lambda code across AWS environments
Perform data quality evaluations based on the source data
Prepare and migrate data for the first-time load into DynamoDB (a bulk-load sketch follows this list)
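
As a rough illustration of the Lambda transformation work described above, a sketch of a Python handler that consumes Kinesis records and writes transformed items to DynamoDB. The table name, key design, and field names are assumptions, not details from the posting.

    # Sketch: Lambda handler that decodes Kinesis records, applies
    # transformation logic, and writes results to DynamoDB.
    import base64
    import json

    import boto3

    table = boto3.resource("dynamodb").Table("target-table")  # hypothetical

    def handler(event, context):
        # Kinesis delivers record data base64-encoded in the event payload.
        for record in event["Records"]:
            payload = json.loads(base64.b64decode(record["kinesis"]["data"]))

            # Reshape the source record to match a hypothetical DynamoDB
            # consumption pattern (partition key + sort key).
            table.put_item(Item={
                "pk": f"ORDER#{payload['id']}",
                "sk": payload.get("op", "UNKNOWN"),
                "payload": payload,
            })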
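
In the same hedged spirit, a sketch of a least-privilege IAM policy such a function might use for service authentication; the account ID, ARNs, and policy name are placeholders.

    # Sketch: least-privilege IAM policy for the Lambda above, via boto3.
    import json

    import boto3

    iam = boto3.client("iam")

    policy_document = {
        "Version": "2012-10-17",
        "Statement": [
            {   # read from the source Kinesis stream
                "Effect": "Allow",
                "Action": ["kinesis:GetRecords", "kinesis:GetShardIterator",
                           "kinesis:DescribeStream", "kinesis:ListShards"],
                "Resource": "arn:aws:kinesis:us-east-1:111122223333:stream/cdc-events",
            },
            {   # write to the target DynamoDB table
                "Effect": "Allow",
                "Action": ["dynamodb:PutItem", "dynamodb:BatchWriteItem"],
                "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/target-table",
            },
        ],
    }

    iam.create_policy(
        PolicyName="kinesis-to-dynamo-transform",  # hypothetical name
        PolicyDocument=json.dumps(policy_document),
    )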
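
Finally, a sketch of the first-time bulk load: boto3's batch writer buffers writes and retries unprocessed items automatically. The table name and source file are hypothetical.

    # Sketch: one-time bulk load into DynamoDB using the batch writer.
    import json

    import boto3

    table = boto3.resource("dynamodb").Table("target-table")  # hypothetical

    with open("initial_load.jsonl") as f, table.batch_writer() as batch:
        for line in f:  # assumes one JSON item per line
            batch.put_item(Item=json.loads(line))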