
Senior Data Engineer

Pogust Goodhead
On-site
London, England, United Kingdom
Description

The Opportunity

You will architect and build the data foundations that power litigation supporting millions of clients across the world, create AI services that augment our lawyers and analysts, and deploy intelligent tooling directly into client-facing journeys. The impact is real, the scope is huge, and the team is exceptional.

Main Duties and Responsibilities:

Data Platform Engineering

  • Own the design and build of our Databricks Lakehouse architecture, including Delta Lake, Unity Catalog, and more.
  • Build and maintain production-grade pipelines ingesting data from Litify/Salesforce, client submissions, and third-party sources.
  • Champion medallion architecture (bronze/silver/gold) and enforce data quality at every layer.
  • Optimise for performance and cost.

AI & Machine Learning Engineering

  • Design and deploy internal AI services: RAG systems, LLM-powered document analysis, and intelligent data extraction pipelines.
  • Build and integrate AI features into client-facing products.
  • Work with data scientists and legal SMEs to translate complex quantification models into scalable, production-ready services.
  • Stay ahead of the curve on the AI/ML tooling landscape (LangChain, MLflow, Databricks Model Serving, Vector Search).

Data Governance & Quality

  • Implement and enforce data governance through Unity Catalog: lineage, access control, classification, and more.
  • Define and maintain data quality frameworks.
  • Ensure compliance with GDPR and relevant data privacy regulations across all pipelines and services.

Collaboration & Leadership

  • Partner tightly with data colleagues, analysts, legal operations, and product teams to turn requirements into reality.
  • Contribute to architectural decisions in Data and in Tech.
  • Mentor junior engineers and help raise the bar across the team.
  • Document what you build.


Requirements

Must-Haves

  • A drive to understand our business inside out: processes, history, systems… not just the data!
  • 5+ years in data engineering, with at least 2 years hands-on with Databricks (Delta Lake, Spark, notebooks, workflows).
  • Strong Python.
  • Solid SQL skills and deep understanding of data modelling for analytics workloads.
  • Azure fluency: Data Factory, Azure Data Lake Storage, Key Vault, and ideally some Synapse/Fabric exposure.
  • Real understanding of data governance, security, and privacy by design.
  • A bias for action.

Strong Advantages

  • Experience with Databricks Unity Catalog, MLflow, or Model Serving.
  • Experience building and shipping AI/ML-adjacent services: LLM integration, vector stores, embedding pipelines, or similar.
  • Exposure to legal tech, case management systems (Salesforce/Litify a bonus), or heavily regulated industries.
  • Knowledge of RAG architectures, LangChain/LlamaIndex, or OpenAI/Azure OpenAI APIs.
  • Experience designing client-facing data products or APIs.
  • Background in financial quantification, insurance, or litigation analytics.


Benefits
  • 25 days’ annual leave (plus 8 Bank Holidays)
  • Private medical insurance
  • Private pension scheme
  • Life assurance
  • Enhanced maternity and paternity leave
  • Employee assistance programme
  • Employee referral bonus
  • E-bikes and gym discounts (available through a salary sacrifice scheme)
  • Season ticket loans

Pre-engagement Screening

If we extend an offer of employment to you, we will conduct pre-engagement screening checks where local legislation permits and where relevant to your role and to working within a regulated environment. These checks may include, but are not limited to, verification of your professional and academic qualifications, confirmation of your eligibility to work in the relevant jurisdiction, criminal record checks, financial stability checks, and work-related references.

#LI-NG1