Sr. Software Engineer TECHM-JOB-26188
Company: Keylent, Inc.
Location: Hull, TX 77564
Description:
Location: Plano TX
Skill: Azure Data Lake Storage (ADLS)
Job Description
As a Sr Data Engineer, you will be responsible for designing, developing, and deploying the solutions required to extract, transform, clean, and move data from business systems into the enterprise data lake, data warehouse, data marts, and operational data stores on the Azure cloud. Related responsibilities include ongoing data infrastructure support and maintenance, along with ongoing and ad-hoc data integration, cleansing, and modeling as required by the business.
ROLE DESCRIPTION
Develops and maintains scalable Azure Data Factory (ADF) data pipelines and/or builds out new API integrations to support continuing increases in data volume and complexity across our source data systems.
Collaborates with data modelers, data scientists, business analysts, and business teams to improve the data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
Implements processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
Writes unit/integration tests and documents work.
Performs the data analysis required to troubleshoot data-related issues and assists in their resolution.
Designs data integrations and the data quality framework.
REQUIREMENTS
Experience designing, building, and deploying scalable Azure cloud-based solution architectures.
A minimum of 8 years of relevant hands-on experience in data ingestion, integration, and transformation
Strong working knowledge of ETL, BI, data lake, and data warehousing tools and technologies
Strong experience with data systems and infrastructure on Azure, especially Azure Data Factory (ADF) and Databricks
Knowledge of Azure Synapse Analytics, Azure Data Lake, Azure Cosmos DB, Azure SQL, Azure Databricks, or equivalent tools and technologies
Expertise in Python and SQL
Knowledge of Informatica PowerCenter and/or IICS, and of Kafka or other streaming mechanisms
Experience with error and exception handling mechanisms, and with monitoring and alerting techniques
Knowledge of DevOps and continuous integration/delivery (CI/CD)
Knowledge of containerization and container orchestration technologies.
Familiarity with one or more languages such as Java, Go, JavaScript, C++, or similar
Familiarity with standard IT security practices such as identity and access management, SSO, data protection, encryption, and certificate and key management
Ability to perform data analysis supporting strategic initiatives and to clearly communicate and present insights to leadership
Strong analytical problem-solving skills
Excellent written and verbal communication skills.
Self-starter who takes initiative and works well under pressure
Works well within a matrixed team environment