Job Description:
We are seeking a highly experienced Azure Data Engineer to act as a technical specialist responsible for the design, development, and delivery of data infrastructure, ensuring high standards and best practices are maintained throughout the project lifecycle.
Role Overview
- Focus: Implementation and optimisation of Microsoft Fabric environments.
- Accountability: Full responsibility for technical deliverables and for advising on the best methods and tools for successful project outcomes.
Technical Requirements
The Must-Haves:
- Microsoft Fabric: Deep expertise in Lakehouse, Warehouse, and OneLake architectures.
- Development: Advanced PySpark, Spark SQL, and T-SQL (Stored Procedures) development and maintenance.
- Orchestration: Hands-on experience with Fabric Data Pipelines and Dataflow Gen2.
- Engineering Rigour: Proven experience implementing CI/CD within Fabric and Git-based version control.
- Analytics: Strong SQL skills tailored for high-performance analytical workloads.
The Nice-to-Haves:
- Experience migrating Synapse, ADF, or Power BI datasets into Fabric.
- Deep understanding of Delta Lake concepts.
- Familiarity with Power BI semantic models.
- Experience with large datasets and performance tuning.
- Azure Fundamentals (Storage, Entra ID, Networking).
What's on offer:
- Competitive compensation
- Opportunity to gain rapid experience in a major consulting firm
- Hybrid working with 1-2 days on site per week
Required Skills:
Performance Tuning, Data, Development, Stored Procedures, Pipelines, Spark, Version Control, CI/CD, Accountability, Consulting, Compensation, Azure, Deliverables, Storage, Infrastructure, Analytics, Networking, Power BI, Maintenance, Design, Engineering, SQL