Seven Investment Management
On-site
London, United Kingdom
Data Engineer

Purpose

We are seeking a motivated and detail-oriented Junior Data Engineer, reporting to the Head of Data & Analytics, to support our Analytics Engineering capability. This role is ideal for someone with foundational experience in data engineering who is eager to deepen their skills in modern data platforms, particularly Azure Databricks and Microsoft Fabric. You will play a key role in maintaining and evolving our medallion architecture, building robust data pipelines, and curating high-quality dimensional models to support business intelligence and advanced analytics. You will work closely with the Data Analyst team, wider data engineering teams, and governance teams to ensure data is accurate, well modelled, and traceable.

Responsibilities

  • Data Pipeline Development: Build and maintain scalable data pipelines using PySpark notebooks in Azure Databricks and Microsoft Fabric. Automate ingestion of raw data from various sources into our lakehouse, ensuring reliability and performance.
  • Medallion Architecture: Support the development and maturity of our bronze-silver-gold layer architecture, ensuring clean separation of raw, refined, and curated data.
  • Dimensional Modelling: Apply star schema and other dimensional modelling techniques to transform data into analytics-ready structures.
  • Collaboration: Work closely with analytics, insight, and business teams to understand data requirements and deliver solutions that meet their needs.
  • Governance & Lineage: Implement and maintain data governance practices, including data lineage, documentation, and metadata management.
  • Quality Assurance: Conduct data validation, testing, and monitoring alongside wider testing teams to ensure pipeline integrity and model accuracy.

About You

Knowledge

  • 3–5 years’ experience applying relational data modelling and data warehousing techniques, including dimensional modelling (e.g. star schema and slowly changing dimensions), is essential to the role.
  • Technical experience building and deploying models using PySpark and Python in a data warehouse or data lakehouse environment.
  • Exposure to Delta Lake, lakehouse tables, or similar technologies; familiarity with Azure Databricks or Microsoft Fabric in particular is advantageous.
  • Experience with SQL for data transformation and querying.
  • Experience with Git and version control in a collaborative environment.
  • Familiarity with CI/CD pipelines for data workflows.
  • Awareness of data governance principles and tools (e.g., Purview, Unity Catalog, Fabric Data Governance).
  • An understanding of the structure and purpose of the Financial Advice and Wealth Management markets within the UK Financial Services sector is highly advantageous.
  • Knowledge of the Agile methodology would be beneficial.

Qualifications

  • No specific qualifications are required for this role; however, the successful candidate will be expected to complete the Microsoft Certified: Fabric Data Engineer Associate certification within their probation period (6 months).

Skills/Other relevant information

  • Excellent numerical skills are essential.
  • Strong problem-solving and analytical thinking.
  • Willingness to learn and grow in a fast-paced environment.
  • Good communication skills and ability to work collaboratively.
  • Attention to detail and commitment to data quality.
