
Data Engineer (FinTech)

G MASS
Contract
On-site
London, England, United Kingdom
Description

G MASS is partnering with a global FinTech business to support the build-out of a next-generation data engineering capability within a large-scale enterprise data environment. This role sits within a highly technical engineering team focused on delivering AI-enabled, cloud-native data products used across the business.

You’ll join a collaborative engineering function responsible for designing and delivering robust, scalable data solutions. The work spans data ingestion, transformation, analytics and enablement, supporting both internal stakeholders and client-facing products.

Key responsibilities

  • Design, develop and maintain scalable data pipelines for ingestion, transformation and storage
  • Write efficient SQL queries to support data extraction, manipulation and analysis
  • Use Apache Spark and Python to process large datasets and automate workflows
  • Partner with data scientists, engineers and business stakeholders to translate requirements into data solutions
  • Implement data quality checks and validation to ensure accuracy and reliability
  • Analyse complex datasets to identify trends, patterns and anomalies
  • Produce and maintain clear documentation covering data processes and architecture
  • Support stakeholders with data visualisation and interpretation of outputs


Requirements

  • Strong experience in SQL for data manipulation and analysis
  • Hands-on experience with Apache Spark (via Java, Spark SQL or PySpark) and Python
  • Experience working with large, complex datasets in enterprise environments
  • Exposure to cloud data platforms (AWS, Azure or GCP) and big data technologies
  • Confident communicator, comfortable working across technical and non-technical teams
  • Background in, or exposure to, financial services or capital markets

Nice to have

  • Experience using Java in data processing or integration contexts
  • Knowledge of ETL processes and data warehousing concepts
  • Experience with data visualisation tools (Power BI, Tableau, Qlik)
  • Familiarity with data science notebooks (e.g. Jupyter, Zeppelin)
  • Version control experience (Git, GitHub, Bitbucket, Azure DevOps)


Benefits

Initial 6-month contract, with a strong possibility of extension and/or a permanent position.

Salary to be discussed.