Data Operations Engineer

Company: CERES Group

Location: Boston, MA 02115

Description:

Reporting to the Head of Data Management, the Senior Data Operations Engineer provides leadership and hands-on maintenance of day-to-day operations for the firm's Azure-based Enterprise Data Architecture. Responsibilities include the processing and movement of data from source systems through the Data Lake, Data Warehouse, and Marts to consuming systems. This role will build on existing tools and workflows and evolve the control framework, daily monitoring, and triage processes into a best-in-class Data Architecture Operations function.

Responsibilities, Duties, and Accountabilities:

The primary role will be to support the firm's day-to-day data operations while working on projects to enhance functionality and reliability of the overall data infrastructure.
This role works cross-functionally, so the candidate must be process-minded and innately collaborative. Key stakeholders include other data team members, IT Infrastructure, Product Management, Project Management, and Data Stewards. As a member of the Data Analytics and Reporting Team (DART), the role will work alongside Database Developers, Data Analysts, and Data Architects.

This position is a mix of project-based and production support work, with an emphasis on building a robust, sustainable enterprise data environment. In particular, a key priority is ensuring the stability and scalability of the architecture and environment.

Responsibilities:
  • Collaborate with data engineers to optimize application logic and functionality in the Azure Gen2 Data Lake, Data Factory, Databricks, and SQL Database environment.
  • Maintain working knowledge of the firm's data warehouse and mart schemas.
  • Represent the Data Team on the Change Control Board, assessing and communicating the impact of proposed change controls, and serve as liaison between the Data Team and sponsors of change controls that could affect the Data Architecture or Data Team.
  • Work on cross-functional teams to represent the Data Team on corporate projects.
  • Work with Shared Services peers to troubleshoot issues and propose solutions.
  • Support compliance with data stewardship standards and data security procedures.
  • Apply proven communication and problem-solving skills to resolve support issues as they arise.
Requirements:

The ideal candidate will have previously served in a data engineering, cloud-based data architecture, or DevOps role. A Bachelor's degree in Computer Science, Software Development, Database Management, or Information Systems, or equivalent experience, is required. Other qualifications include:
  • Demonstrated experience monitoring and optimizing data architectures
  • In-depth understanding of data management (e.g., permissions, security, and monitoring).
  • Experience with SQL and scripting languages such as Python; Scala is a plus.
  • Knowledge of software development best practices.
  • Excellent analytical and organization skills.
  • Ability to work effectively both in a team and independently.
  • Strong written and verbal communication skills.
Preferred applicants will also have:
  • Expertise in database development projects and ETL processes.
  • Experience in an agile SDLC environment.
  • Experience planning and implementing QA and testing, and data warehousing.
  • Experience with Microsoft Azure Data Lake, Data Factory, and SQL Database products, or equivalent products from other cloud service providers (e.g., AWS Elastic MapReduce, AWS Data Pipeline, Amazon RDS).
  • Design and develop architectures for efficient and scalable data transformations inbound to the EDW and outbound to consuming data stores and systems within Azure and the BP Enterprise Data Architecture.
  • Provide subject matter expertise on projects and issues that encompass a wide range of internal and external systems (core banking, Wealth, Trust, data warehousing), components, and processes.
  • Work closely with DART members such as Data Analysts, Report Developers, QA, Release Management, and other software engineers to develop and release new capabilities, research technical requirements, and resolve issues.
  • Troubleshoot and resolve technical issues with data migration and transformation within the Microsoft Azure environment, including Databricks, Data Factory, bespoke code, and other applications, across the Development, Test, and Production environments. Perform production deployments.
  • Ensure the quality, completeness, security, privacy, and integrity of data throughout the data lifecycle, especially from the Data Lake through downstream data delivery and reporting.
  • Implement code changes and follow DART's control process, including code check-in, code reviews, and Jira issue updates.
  • Leverage industry best practices for data transformation and processing techniques, tools and coding languages, data models, query optimization, and analytics; share these with the team.
  • Extend and maintain the EDW/Mart zones so that they enable self-service capabilities for business power users to upload, analyze, and report on governed data assets.
  • Serve as lead developer on the DART Data Governance technical working group to ensure all standards and policies are implemented, maintained, and enforced.
  • Provide weekly data integrity reporting on the EDW and Data Marts to the Data Governance technical working group to ensure integrity standards are actively monitored.
Qualifications:
  • 10+ years' application development/engineering experience across a diverse technology base. Demonstrated understanding and successful application of proven database design principles such as requirements analysis, data normalization, data modeling, risk management, and quality assurance, for both Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) database systems.
  • 5+ years' experience with Microsoft ETL and business intelligence technologies, including but not limited to SQL Server, SSIS, SSRS, and SSAS, and their application within the Microsoft Azure environment and tools such as Databricks.
  • Strong hands-on data modeling experience using relational, dimensional, and other approaches for SQL and NoSQL databases.
  • Demonstrated expertise with relational databases (SQL Server, Oracle) and star schemas is required.
  • Financial services experience in Banking and/or Lending, Investment Management, and/or Advisory is required.
  • An understanding of AI/ML applications is a plus.
  • Experience with BI tools such as Tableau is a plus.