Data Integration Developer - Python
Company: ShiftCode Analytics
Location: Washington, DC 20011
Description:
Interview: Video
Visa: USC, GC
This role is hybrid from day 1 (local candidates only).
We're hiring a Data Integration Developer with deep expertise in Python and AWS to support strategic cost-allocation and financial data engineering initiatives. If you're passionate about building robust, scalable data pipelines and working with a broad set of AWS services, this is the role to showcase your skill set on a high-visibility, multi-year initiative.
What You'll Do:
- Design and implement high-performance data pipelines using AWS-native tools such as Glue, Redshift, Lambda, and Step Functions.
- Leverage Python and APIs to build scalable backend services that integrate large-scale financial datasets.
- Optimize storage and retrieval of structured/unstructured data across S3, RDS, and EC2 instances.
- Collaborate cross-functionally to design intuitive data models, automate data flows, and support analytics/visualization efforts.
- Troubleshoot, test, and refine data interfaces, pipelines, and system integrations.
- Support DevOps pipelines using AWS CLI, infrastructure-as-code tools, and CI/CD best practices.
- Engage with product owners, stakeholders, and agile team members to refine functionality and deliver value iteratively.
What You Bring:
- 5+ years of hands-on experience with Python programming and backend development.
- Strong command of AWS services including Redshift, Glue, Lambda, S3, Step Functions, EC2, ECS, RDS, and CloudWatch.
- Proven track record working with Pandas, PySpark, and advanced SQL for data manipulation and transformation.
- Experience working with APIs (Flask, FastAPI, or Django) to build and deploy scalable services.
- Familiarity with scripting and version control tools such as Git, Bitbucket, or GitLab.
- Understanding of data modeling, schema design, and building dashboards or visualizations (experience with financial data a plus).
- Excellent communication skills and ability to collaborate in a team-oriented environment.
- Comfortable working in a structured Agile development process using tools like Jira and Confluence.
Bonus Points For:
- Exposure to JavaScript frameworks like React or Angular for UI/UX components.
- Experience with SAS or financial analytics platforms.
- Familiarity with Terraform, Jenkins, and automation of cloud infrastructure.
- Financial services or Capital Markets domain knowledge.