Applied Data Engineering
Empowering data engineers to create powerful, scalable solutions with end users in mind.
Overview
The Applied Data Engineering Programme is designed to transform data capabilities and drive impactful, scalable business solutions. Apprentices build a broad set of modern technical skills, from data modelling and pipeline design through to automation and testing, while developing a product-focused mindset that enables them to critically evaluate workflows and stakeholder needs. Throughout the programme, they become proficient across a spectrum of engineering operations, implementing high-quality workflows and best practices in critical areas such as data modelling, storage, and change management.
Duration
17 months
Price
£19,000 - eligible for funding by UK Apprenticeship Levy
Level
5
Most qualifications have a difficulty level. There are 9 qualification levels. The higher the level, the more difficult the qualification is.
Qualification
Apprenticeship
Who it's for
Data professionals skilled in Python and SQL who are looking to progress into a data engineering role.
To apply you'll need
- To have the right to work in the UK
- To have lived in the UK or EEA continuously for the past 3 years
- To have at least a grade 4/C in GCSE (or equivalent) Maths and English
- To have not previously studied the course content
- To not undertake any other qualifications during the apprenticeship
- To be able to apply your learning to your role
Qualifications Received
- Level 5 Data Engineer apprenticeship standard
Modules
Module one
Data systems and architecture
Apprentices gain an understanding of different data infrastructures and analyse their organisation’s architecture, systems, and processes to explore diverse use cases throughout the data management lifecycle.
Module two
Data modelling
This module explores data model designs according to business needs. With a grasp of the technical structure of different data systems, apprentices delve into data warehousing, data meshes, and data lakes.
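As an illustration of the kind of design work this module covers, the following is a minimal sketch of a star-schema warehouse layout, expressed as SQL issued from Python via sqlite3. The table and column names are hypothetical examples chosen for the sketch, not part of the programme materials.

```python
# Minimal star-schema sketch: one fact table joined to two dimension tables.
# Table and column names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT,
    region      TEXT
);

CREATE TABLE dim_date (
    date_id INTEGER PRIMARY KEY,
    day     TEXT,     -- ISO date, e.g. '2024-03-01'
    month   TEXT,
    year    INTEGER
);

-- Fact table: one row per order, with foreign keys into the dimensions.
CREATE TABLE fact_orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES dim_customer(customer_id),
    date_id     INTEGER REFERENCES dim_date(date_id),
    amount      REAL
);
""")

# Typical analytical query against this model: revenue by region and month.
query = """
SELECT c.region, d.month, SUM(f.amount) AS revenue
FROM fact_orders f
JOIN dim_customer c ON f.customer_id = c.customer_id
JOIN dim_date d     ON f.date_id = d.date_id
GROUP BY c.region, d.month
"""
print(conn.execute(query).fetchall())
```

Keeping facts and dimensions separate in this way is one of the warehouse design patterns the module contrasts with data lake and data mesh approaches.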
Module three
Data pipelines
Apprentices gain the knowledge and tools to build and manage ETL pipelines, exploring how to extract, transform, integrate, and ingest data. In the process, they assess and incorporate data quality frameworks, define quality metrics, document needs, and establish oversight procedures.
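A minimal sketch of what such a pipeline might look like is shown below, assuming an inline CSV extract and an in-memory SQLite target; the data, quality rule, and table name are hypothetical examples rather than programme code.

```python
# Minimal ETL sketch with a simple data quality check between transform and load.
import csv
import io
import sqlite3

def extract():
    # Extract: read raw rows (here from an inline CSV string, as an illustration).
    raw = "order_id,amount\n1,19.99\n2,not-a-number\n3,5.00\n"
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    # Transform: normalise types and drop malformed records.
    cleaned = []
    for row in rows:
        try:
            cleaned.append({"order_id": int(row["order_id"]),
                            "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # skip rows that fail type conversion
    return cleaned

def check_quality(rows, min_rows=1):
    # Quality metric: fail the run if too few valid rows survive transformation.
    if len(rows) < min_rows:
        raise ValueError(f"quality check failed: only {len(rows)} valid rows")
    return rows

def load(rows, conn):
    # Load: write the cleaned rows into the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (:order_id, :amount)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(check_quality(transform(extract())), conn)
    print(conn.execute("SELECT COUNT(*) FROM orders").fetchone())
```

Placing an explicit quality gate between transform and load, as sketched here, is one way to turn quality metrics into an enforceable part of the pipeline rather than an afterthought.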
Module four
Automating data pipelines
With tools like Kafka and Airflow, apprentices develop the skills needed to automate ETL data pipelines. This module also covers how to incorporate security, scalability, and governance into pipeline design.
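As one example of the kind of automation introduced here, the sketch below defines an Airflow DAG that schedules extract, transform, and load steps to run daily. It assumes Airflow 2.x is installed; the DAG id and task callables are hypothetical placeholders.

```python
# Minimal Airflow 2.x DAG sketch: a daily ETL run expressed as three dependent tasks.
# DAG id and callables are illustrative; requires apache-airflow to be installed.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("extract raw data")

def transform():
    print("transform and validate data")

def load():
    print("load data into the warehouse")

with DAG(
    dag_id="daily_orders_etl",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",      # run once per day
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Declare dependencies: extract -> transform -> load.
    t_extract >> t_transform >> t_load
```

Expressing the pipeline as a DAG like this is what lets a scheduler handle retries, backfills, and alerting, which is where the module's security, scalability, and governance concerns come in.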
Module five
Testing data pipelines
The final module of phase one focuses on testing, monitoring, and evaluating data pipelines. Topics covered include pipeline design with a focus on observability, pipeline performance monitoring, and optimising data processes for reliability.
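By way of illustration, the sketch below unit-tests a transformation step with pytest; the function under test and its rules are hypothetical examples, not programme code.

```python
# Sketch of a unit test for a pipeline transformation step, runnable with pytest.
# The transform function and its rules are illustrative examples.

def transform(rows):
    # Keep only rows with a positive numeric amount; normalise the type.
    cleaned = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue
        if amount > 0:
            cleaned.append({"order_id": row["order_id"], "amount": amount})
    return cleaned

def test_transform_drops_invalid_rows():
    rows = [
        {"order_id": 1, "amount": "19.99"},
        {"order_id": 2, "amount": "not-a-number"},
        {"order_id": 3, "amount": "-5.00"},
    ]
    assert [r["order_id"] for r in transform(rows)] == [1]

def test_transform_preserves_amount_as_float():
    assert transform([{"order_id": 4, "amount": "7.50"}]) == [
        {"order_id": 4, "amount": 7.5}
    ]
```

Running `pytest` against tests like these gives fast feedback that a pipeline change has not altered transformation behaviour, complementing the runtime monitoring and observability topics covered in this module.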