
    Senior Data Engineer

    Keyrock
    Brussels · Remote · Prague · Porto · Milan · London · Krakow · Zurich · Zug · Zagreb
    Remote
    Senior
    Full Time
    3 days ago
    data_engineer · senior_role · remote_friendly · financial_markets · python · sql · streaming_data · lakehouse · data_governance

    Requirements

    • 8+ years of building production data systems that other people rely on.
    • Strong proficiency in Python and SQL: not just being able to write a query, but being able to reason about what the engine is doing with it.
    • A habit of writing code that's easy for someone else to read, test, and delete later.
    • Strong understanding of data modelling for both streaming and analytical workloads.
    • A default of taking efficiency, quality, idempotency, and observability seriously.
    • Experience designing and operating streaming systems on Kafka, Redpanda, MSK, or Kinesis, with knowledge of partitioning, consumer groups, offsets, and schema registries.
    • Experience using a time-series store in production (ClickHouse ideally; TimescaleDB, QuestDB, or similar) and understanding table design as a function of query patterns.
    • Experience with lakehouse architecture and reasoning about table layout, partitioning, and compaction as design choices.
    • Ability to build for self-healing and idempotency, ensuring safe reprocessing and system recovery without human intervention.
    • Experience with Docker, Terraform, and CI/CD as integral parts of everyday work.
    • Consideration of cost and performance early in design.
    • Instrumentation of systems with logs, metrics, and traces from day one.
    • Design for data quality and governance including contracts, validation, lineage, and ownership.
    • Ability to reason from first principles, stay pragmatic, and update views based on new information.
    • Ability to treat internal teams as customers and communicate effectively with them.
    • Focus on outcomes over output: shipping smaller, simpler working solutions over bigger incomplete ones.
    • Ownership end-to-end: design, ship, operate, improve.
    • Willingness to express opinions and change mind when presented with better arguments.
    • Mentorship and collaboration skills to improve peers and juniors.
    • Curiosity about how markets work.
    • Honesty about knowledge gaps and quickness to close them.
    • Understanding of financial market data: order books, trades, reference data, portfolios, exposures; experience in Crypto, TradFi, or both is a strong plus.
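    Several of the requirements above (idempotency, safe reprocessing, self-healing) come down to one pattern: writes keyed by a deterministic event identity, so replaying a feed after a failure cannot create duplicates. A minimal sketch of that idea, in Python; the field names (`exchange`, `symbol`, `ts`, `seq`) are illustrative assumptions, not a schema from the posting:

    ```python
    import hashlib

    def event_key(event: dict) -> str:
        """Deterministic key: the same logical event always maps to the
        same id, so reprocessing a batch cannot create duplicates.
        Field names here are hypothetical."""
        raw = f"{event['exchange']}|{event['symbol']}|{event['ts']}|{event['seq']}"
        return hashlib.sha256(raw.encode()).hexdigest()

    def idempotent_load(store: dict, events: list[dict]) -> int:
        """Upsert events keyed by event_key; return the number of *new*
        rows. Replaying the same batch is a no-op, which is what makes
        blind reprocessing after a failure safe."""
        new = 0
        for ev in events:
            key = event_key(ev)
            if key not in store:
                new += 1
            store[key] = ev
        return new
    ```

    Loading the same batch twice inserts it once: the second call returns 0 and the store is unchanged, so a crashed consumer can simply rewind its offset and replay.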

    What You'll Do

    • Build streaming and batch pipelines that ingest, normalise, and distribute market, trading, and portfolio data, resilient to feed and exchange failures.
    • Build the self-serve tooling (SDKs, patterns, templates, AI agents) so other teams publish, consume, and build on data products without waiting on us.
    • Own data contracts and schema evolution. Keep schema changes from turning into multi-team coordination events.
    • Design the lakehouse and time-series layer around consumer query patterns.
    • Build and evolve the Data Governance and Data Quality Framework: stale-feed detection, schema validation, range checks, idempotent writes, lineage, ownership, self-healing.
    • Build the derived analytics the business runs on: cross-exchange spreads, VWAP at depth, order book microstructure for the desks; portfolio views, exposure, performance for wealth and asset management.
    • Make observability, cost, and performance first-class from day one.
    • Treat infrastructure as code (Docker, Terraform, CI/CD) alongside our Central Infrastructure Team.
    • Work in the open: write things down, partner closely with Architecture, Infrastructure, Platform, and the rest of the teams.
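    One of the derived analytics named above, VWAP at depth, is the volume-weighted average price to fill a given size by walking one side of the order book. A minimal sketch, assuming levels arrive as `(price, size)` pairs sorted best-first (that input shape is an assumption, not something the posting specifies):

    ```python
    def vwap_at_depth(levels: list[tuple[float, float]], target_qty: float) -> float:
        """Walk (price, size) levels, best price first, until target_qty
        is filled; return the volume-weighted average fill price.
        Raises if the book is too thin to fill the target quantity."""
        filled = 0.0
        cost = 0.0
        for price, size in levels:
            take = min(size, target_qty - filled)
            filled += take
            cost += take * price
            if filled >= target_qty:
                return cost / target_qty
        raise ValueError(f"book too thin: only {filled} available for {target_qty}")
    ```

    For example, with asks `[(100.0, 1.0), (101.0, 2.0)]`, filling 2.0 units takes one unit at each level, giving a VWAP of 100.5; the same walk across exchanges is the basis for cross-exchange spread comparisons.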

    Nice to Have

    • Lakehouse experience with Apache Iceberg or Delta Lake.
    • Familiarity with DataHub or similar metadata/lineage platforms.
    • Interest in the Rust programming language; fluency not required.

    Benefits

    • A from-scratch mandate to shape the platform, standards, and team culture.
    • Strong partnerships and close working relationships with Architecture, Infrastructure, Platform, and trading desks.
    • Autonomy on how to work with flexible hours, remote-first, and shared business-hours on-call.
    • A competitive salary package with various benefits.
    • A team that likes each other with regular online get-togethers and a yearly onsite meeting.

    About Keyrock

    Keyrock develops scalable, transparent proprietary algorithmic technologies to increase the liquidity of financial assets.

    Belgium · 100–250 · Finance