Big Data Lead
Company: Hexaware Technologies
Location: New York, NY 10025
Job Description
Database Development & Management
- Strong experience in SQL development and query optimization.
- Hands-on experience with Snowflake (preferred), Oracle, or any relational database.
- Understanding of database indexing, partitioning, and performance tuning.

ETL/ELT Development
- Expertise in ETL/ELT development, preferably using Talend (or another ETL tool).
- Strong proficiency in advanced SQL scripting for data transformation and processing.
- Ability to design, develop, and optimize data pipelines for structured and unstructured data.
- Experience with error handling, logging, and recovery mechanisms for ETL processes.

Data Warehousing & Modeling
- Understanding of data warehousing concepts, including star schema and dimensional modeling.
- Hands-on experience with Snowflake (preferred) or other cloud/on-prem data warehouses.
- Ability to design and maintain fact and dimension tables for analytics and reporting.
- Knowledge of data partitioning and performance tuning techniques in DWH environments.

CI/CD & Version Control (Good to Have, Not Core Focus)
- Experience using Git for version control of ETL scripts and database objects.
- Exposure to CI/CD tools such as TeamCity (preferred), Jenkins, or Azure DevOps for ETL deployment automation.
- Understanding of branching strategies, merging, and automated deployments for ETL processes.
- Familiarity with scheduled job execution and monitoring via CI/CD tools.

Cloud Exposure (Good to Have, Not Core Focus)
- Basic familiarity with Azure or AWS cloud environments.
- Understanding of Snowflake on Azure/AWS or Redshift on AWS (data storage, querying, and schema management).
- Exposure to cloud storage solutions (Azure Blob Storage, AWS S3) for data ingestion and staging.
- Awareness of cloud-based ETL services (Azure Data Factory, AWS Glue) - preferred but not required.