Data Engineer - Credit Technology

Company: Balyasny Asset Management L.P.

Location: New York, NY 10025

Description:

Description Summary

The team is responsible for building, owning, and supporting a world-class data platform for our portfolio managers and their teams. We are building the suite of core components that will underpin the offering of this team for years to come. We are looking for an experienced and eager engineer to join our team in supporting this mandate.

Role Overview:
Design, build, and grow a modern data platform and data-intensive applications, from ingestion through ETL, data quality, storage, and consumption/APIs
Work closely with quantitative engineers and researchers
Collaborate in a global team environment to understand, engineer, and deliver on business requirements
Strike a balance along the dimensions of feasibility, stability, scalability, and time-to-market when delivering solutions

Qualifications & Requirements:
5+ years of work experience in a data engineering or similar data-intensive capacity
Demonstrable expertise in SQL and relational databases
Strong skills in Python and at least one data-manipulation library/framework (e.g., Pandas, Polars, Dask, Vaex, PySpark)
Strong debugging skills at all levels of the application stack and proven problem-solving ability
Strong knowledge of the data components used in distributed applications (e.g., Kafka, Redis, or other messaging/caching tools)
Experience architecting and building data platforms and ETL pipelines, ideally both batch and streaming, using data lake/warehouse/lakehouse patterns
Experience with column-oriented data storage and serialization formats such as Parquet/Arrow
Experience with code optimization and performance tuning
Excellent communication skills

Additional experience in the following areas is a plus:
Experience building application-level code, e.g., REST APIs that expose business logic
Prior usage of tooling such as Prometheus, Grafana, Sentry, etc. for monitoring, metrics, and distributed tracing
Experience with distributed stateful stream processing (e.g., Kafka Streams, Flink, Arroyo)
Experience working with financial instruments/software in areas such as research, risk management, portfolio management, reconciliation, order management, etc.
Prior experience with ClickHouse, Snowflake, or KDB