Data Engineer – Digital
Business: ASUK
Location: Based in Middlesbrough or Goldthorpe
Reports to: Data Platform & Enablement Lead
Altrad are currently looking for a Data Engineer to join the team. The successful candidate will design, build, and maintain scalable, reliable, and secure data pipelines and platforms that enable ASUK to make informed, data-driven decisions.
This role is responsible for ensuring that data from across the organisation is extracted, transformed, governed, and delivered efficiently to support reporting, analytics, automation, and future AI initiatives. The Data Engineer will play a key role in improving data availability, quality, consistency, and platform performance across the enterprise.
Key Deliverables:
Data Engineering & Pipeline Development
- Design, develop, and maintain metadata-driven ETL and ELT pipelines.
- Build and support orchestration frameworks for end-to-end data processing.
- Integrate data from multiple source systems using APIs, webhooks, SQL connections, and file-based interfaces.
- Develop, optimise, and maintain data warehouse transformations and dimensional data models.
- Support tabular model development, deployment, and data refresh processes.
- Ensure data pipelines are scalable, reusable, well-documented, and aligned to agreed standards.
Platform Reliability & Support
- Implement monitoring, alerting, logging, and automated recovery mechanisms.
- Investigate, diagnose, and resolve data pipeline issues, defects, and performance bottlenecks.
- Support platform stability, scalability, security, and performance improvements.
- Maintain data quality checks and support issue resolution with business and technical stakeholders.
DevOps & Continuous Improvement
- Follow DevOps, CI/CD, version control, and deployment best practices.
- Contribute to standardised development patterns, reusable components, and automation frameworks.
- Work collaboratively with data analysts, BI developers, platform engineers, and business stakeholders.
- Actively upskill in modern data engineering technologies, cloud services, automation, and analytics tooling.
Tools & Technologies
The role will involve working with, or developing capability in, the following tools and technologies:
- Azure Data Factory
- Azure SQL Database / Synapse
- Azure cloud services
- Logic Apps and Power Automate
- Visual Studio
- Azure DevOps
- Tabular Editor
- Power BI
- SQL
- APIs and webhooks
- File-based data integrations
- CI/CD and source control practices
Key Requirements
Essential
- Experience designing, building, or supporting data pipelines.
- Strong SQL skills and experience working with relational databases.
- Understanding of ETL/ELT processes, data transformation, and data integration patterns.
- Experience working with cloud-based data platforms, preferably Microsoft Azure.
- Ability to troubleshoot data issues and support production data processes.
- Understanding of data warehousing concepts, dimensional modelling, and data quality principles.
- Good communication skills, with the ability to work with technical and non-technical stakeholders.
- A proactive approach to learning, problem-solving, and continuous improvement.
Desirable
- Experience with Azure Data Factory, Azure Synapse, or Azure SQL.
- Experience with Azure DevOps, CI/CD pipelines, and source control.
- Knowledge of Power BI datasets, semantic models, or Tabular Editor.
- Experience integrating data through APIs, webhooks, or automated workflows.
- Familiarity with monitoring, alerting, and automated data pipeline recovery.
- Exposure to data governance, metadata-driven frameworks, or enterprise data platforms.