
Technical Lead - Data Engineering

GiGa-Ops Global Solutions
Contract
On-site
Hampshire, United Kingdom

About the Opportunity

As a global technology solutions provider in the data engineering and analytics sector, we empower enterprises to turn raw data into strategic assets at scale. Our team designs, builds, and optimizes data platforms and pipelines that support real-time insights and advanced analytics, driving measurable business impact across multiple industries. We are seeking a Lead Data Engineer to champion best-in-class engineering standards and mentor a high-performing team on-site in the UK.

Role & Responsibilities

  • Lead and mentor a team of data engineers to design, develop, and deploy scalable ETL/ELT pipelines and data platforms.
  • Architect end-to-end data solutions, including ingestion, processing, storage, and orchestration on cloud and on-premise environments.
  • Collaborate with cross-functional stakeholders (Data Science, BI, DevOps) to translate business requirements into robust data models and workflows.
  • Define and enforce best practices for data engineering, including data modeling, quality, security, and performance tuning.
  • Implement CI/CD, monitoring, and documentation standards to ensure reliability and maintainability of data infrastructure.
  • Stay abreast of emerging technologies and drive continuous improvement initiatives across the data engineering landscape.

Primary (Must-Have) Skills

4+ years of experience in Azure Databricks with PySpark.

2+ years of experience in Databricks Workflows and Unity Catalog.

3+ years of experience in ADF (Azure Data Factory).

3+ years of experience in ADLS Gen2.

3+ years of experience in Azure SQL.

5+ years of experience in Azure Cloud platform.

2+ years of experience in Python programming & package builds.

Job Description of Role

Strong experience in implementing secure, hierarchical namespace-based data lake storage for structured and semi-structured data, aligned to bronze/silver/gold (medallion) layers on ADLS Gen2. Hands-on experience with lifecycle policies, access control (RBAC/ACLs), and folder-level security. Understanding of best practices in file partitioning, retention management, and storage performance optimization.
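As an illustration of the layered, partitioned lake layout described above, a minimal sketch in plain Python (container, dataset, and folder-naming conventions here are assumptions for the example, not prescribed by this role):

```python
from datetime import date

# Medallion layers typically used on an ADLS Gen2 hierarchical namespace.
LAYERS = {"bronze", "silver", "gold"}


def lake_path(layer: str, dataset: str, partition_date: date) -> str:
    """Build a date-partitioned folder path, e.g.
    'bronze/sales/year=2024/month=05/day=01'.

    Hive-style year=/month=/day= partitioning keeps reads prunable
    by date range; the exact scheme is an illustrative assumption.
    """
    if layer not in LAYERS:
        raise ValueError(f"unknown layer: {layer!r}")
    return (
        f"{layer}/{dataset}/"
        f"year={partition_date.year:04d}/"
        f"month={partition_date.month:02d}/"
        f"day={partition_date.day:02d}"
    )


print(lake_path("bronze", "sales", date(2024, 5, 1)))
```

In practice the same path convention would be applied consistently across ingestion (bronze), cleansed (silver), and curated (gold) writes, so that RBAC/ACLs and lifecycle policies can be scoped at the layer or dataset folder level.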

Capable of developing T-SQL queries, stored procedures, and managing metadata layers on Azure SQL.

Comprehensive experience working across the Azure ecosystem, including networking, security, monitoring, and cost management relevant to data engineering workloads. Understanding of VNets, Private Endpoints, Key Vaults, Managed Identities, and Azure Monitor. Exposure to DevOps tools for deployment automation (e.g., Azure DevOps, ARM/Bicep/Terraform).

Experience in writing modular, testable Python code used in data transformations, utility functions, and packaging reusable components. Familiarity with Python environments, dependency management (pip/Poetry/Conda), and packaging libraries. Ability to write unit tests using PyTest/unittest and integrate them with CI/CD pipelines.
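A minimal sketch of the modular, testable pattern described above (the function and test names are illustrative, not part of the role's actual codebase):

```python
# Illustrative transformation utility: the kind of small, pure function
# that is easy to package, reuse, and unit-test.
def normalize_columns(names):
    """Trim, lower-case, and snake_case a list of column names."""
    return [n.strip().lower().replace(" ", "_") for n in names]


# PyTest-style tests: plain functions with bare asserts, discoverable
# by `pytest` and runnable in a CI/CD pipeline.
def test_normalize_columns():
    assert normalize_columns(["Order ID", " Ship Date "]) == ["order_id", "ship_date"]


def test_normalize_columns_empty():
    assert normalize_columns([]) == []


if __name__ == "__main__":
    test_normalize_columns()
    test_normalize_columns_empty()
    print("ok")
```

In a packaged library the utility would live in the distribution (built with pip/Poetry), with the tests kept alongside it and executed on every commit by the CI pipeline.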

Preferred

  • Experience with container orchestration (Kubernetes, Docker) and infrastructure-as-code (Terraform).
  • Knowledge of data orchestration tools (e.g., Apache Airflow, Prefect) and streaming frameworks (e.g., Kafka).
  • Understanding of data governance, security standards, and compliance (GDPR, SOX).
  • Relevant certifications in cloud platforms or Big Data technologies.

Benefits & Culture

  • Competitive salary with comprehensive benefits package.
  • Collaborative on-site work environment utilizing cutting-edge technologies.
  • Professional development opportunities, including training, certifications, and conference attendance.