Database Engineer
Company: Veros Software
Location: Santa Ana, CA 92704
Description:
About the Company:
Headquartered in Santa Ana, California, Veros (www.veros.com) is a growing technology company that develops, operates, and maintains custom software and business analytics solutions for the financial services industry. We are looking for a self-motivated, independent person to play a critical role supporting the company's end-user technology needs.
Veros offers a unique opportunity that encourages creativity and professional growth, along with a competitive salary and benefits package including medical, dental, vision, life, 401(k), paid vacation, holidays, and more.
Job Description:
We are seeking a Database Engineer with expertise in geospatial data processing and experience working with modern cloud-based data platforms. The ideal candidate will have strong proficiency in Snowflake, Python, SQL, Oracle, and Apache Airflow, with an understanding of geospatial data formats. This role involves designing, optimizing, and managing geospatial databases while ensuring efficient data processing and integration across multiple systems in a hybrid cloud environment.
Principal Responsibilities:
Quickly gain an understanding of the Company's real estate and mortgage property valuation products, analytics, and associated databases.
Design, develop, and optimize geospatial databases, ensuring scalability, performance, and reliability.
Provide data support for enterprise data integration tasks, including acquisition, ingestion, standardization, enrichment, mastering, and assembly of data products for downstream applications.
Develop and maintain ETL pipelines using Apache Airflow and other vendor software (EtLeap) to automate data ingestion, transformation, and processing.
Implement and optimize SQL queries in Snowflake, Oracle, and other relational databases for spatial and non-spatial data.
Develop and maintain SQL stored procedures, triggers, functions, and scheduled jobs.
Leverage Python to build automation scripts, data validation tools, and data processing workflows.
Integrate and process geospatial datasets from various formats, including shapefiles (SHP), GeoJSON, KML, raster data (GeoTIFF), and PostGIS.
Develop intelligence from raw data sets and ensure operational processes are in place to leverage and automate that derived data.
Perform proactive analysis, including but not limited to data profiling and exploration, to provide transparency into data quality issues and work with the team to resolve them.
Develop and maintain data quality reports to track progress, identify areas for improvement, and monitor areas that have already been improved.
Collaborate with the Data Science team to test model results and confirm whether issues exist ahead of new model releases.
Collaborate with cross-functional teams, including data engineers, analysts, architects, and data scientists, as well as software and QA engineers, to support business needs.
Partner with customer support and product teams to research and resolve customer inquiries about potential data issues.
Collaborate with vendors to solve data quality issues stemming from the data source or internal processes.
Monitor and optimize database performance, identifying bottlenecks and cost-saving opportunities and implementing improvements.
Qualifications and Requirements:
B.S. degree in Computer Science, Mathematics, Statistics, or an equivalent field, or related experience
3+ years of experience with Database Engineering or Data Analysis / Modeling
Strong SQL skills with experience in Snowflake, Oracle, or PostGIS for spatial queries and optimizations.
Experience with common data analysis tools (Python, R, SQL), including experience with PostgreSQL and spatial data
Hands-on experience with Apache Airflow
Cloud experience (AWS, Azure, or GCP), including working with cloud-based data warehouses (e.g. Snowflake, Redshift) and storage solutions.
Understanding of geospatial file formats
Strong problem-solving skills, the ability to work independently, and excellent communication skills
In-depth knowledge of relational databases (e.g., Oracle)
Strong analytical and problem-solving skills
Excellent communication and interpersonal skills, with the ability to present findings to a varied team of stakeholders
Excellent organizational skills; attention to detail is critical to success in this role
Ability to work independently and collaboratively in a team environment
Proven experience in operational analysis and design
Ability to pick up new tools and processes quickly
Strong attention to detail and ability to prioritize tasks and multi-task
Preferred but not required:
Familiarity with Azure and specifically Microsoft Azure Data Factory
Familiarity with data warehousing and business intelligence concepts
Familiarity with Sigma or related business intelligence tools (e.g., Power BI, Tableau)
Experience in the Real Estate or Mortgage industry
Knowledge of big data processing in cloud environments.
Exposure to Terraform or Infrastructure as Code (IaC) for managing cloud resources.
Understanding of machine learning applications with geospatial data.
Understanding of AI techniques and applications
Veros is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.