Launch Your Career in Tech in As Little as 14 Weeks. For Free.
Unlock your potential with our remote 14-week training programme, which will give you the skills and experience you need to become a specialist Data Analyst, Data Engineer, Cloud Engineer or DevOps Engineer.
Your Pathway Into Tech, Fully Funded by The UK Department for Education
Funded by the Department for Education, the programme gives learners the opportunity to build up sector-specific skills and fast-track to an interview with an employer.
AiCore has been selected as an approved delivery partner due to our impressive track record in successfully delivering high-quality training and career outcomes for our learners.
Four specialist career paths to choose from
Build a solid foundation in software engineering
Software Engineering
Learn the core of writing production-ready code, following industry best practices.
You will build two projects that teach you the basic and advanced concepts of Python, alongside other industry-relevant tools such as the command line and version control with Git and GitHub. In the first project you will create a command line assistant that helps you process multiple entries from IMDb. In the second you will build an implementation of the Hangman game using object-oriented programming in Python.
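To give a flavour of the object-oriented design at the heart of the Hangman project, here is a minimal sketch of the kind of class it involves (the class and method names here are illustrative, not the official project specification):

```python
class Hangman:
    """A minimal Hangman game modelled as a class."""

    def __init__(self, word, max_lives=5):
        self.word = word.lower()
        self.lives = max_lives
        self.guessed = set()

    def guess(self, letter):
        """Process one guess; return True if the letter is in the word."""
        letter = letter.lower()
        self.guessed.add(letter)
        if letter in self.word:
            return True
        self.lives -= 1
        return False

    @property
    def revealed(self):
        """Show the word with unguessed letters masked."""
        return "".join(c if c in self.guessed else "_" for c in self.word)

    @property
    def won(self):
        return all(c in self.guessed for c in self.word)


game = Hangman("python")
game.guess("p")   # correct guess: no life lost
game.guess("z")   # wrong guess: one life lost
```

The state (word, lives, guessed letters) lives on the object, while the behaviour (guessing, revealing) lives in methods, which is exactly the separation object-oriented programming teaches.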
Module 1: Python Programming
- Arithmetic Variable Assignment and Strings
- Lists and Sets
- Dictionaries, Tuples and Operators
- Control Flow
- Loops
- Functions
- Error Handling
- The Python environment
- Advanced Python
- Debugging
- Object Oriented Programming
Module 2: The Command Line
- File Navigation
- File Manipulations
- File Permissions
- Advanced Command Line Features
Module 3: Git and GitHub
- Version Control
- Commits and Branches
- Fetching and Merging
- Merge Conflicts
- Pull Requests
- README Files
- GitHub Security
Then choose one of the four specialist career paths
Data Engineering
Learn how to store, share and process various types of data at scale.
Build a complete data solution for a multinational organisation, from data acquisition to analysis. Write Python code to extract large datasets from multiple data sources. Utilise the power of Pandas to clean and analyse the data. Build a star-schema database for optimised data storage and access. Perform complex SQL queries to extract valuable insights and inform the organisation's decisions.
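In miniature, the clean-then-model workflow described above looks like this (the column names and data are invented for illustration):

```python
import pandas as pd

# Raw orders as they might arrive from an extraction step
orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "product": ["widget", "widget", "gadget"],
    "price": ["9.99", "9.99", "24.50"],   # strings: needs cleaning
    "country": ["UK", "UK", "FR"],
})

# Clean: cast price to a numeric type
orders["price"] = pd.to_numeric(orders["price"])

# Star-schema style split: a product dimension table plus a fact table
dim_product = (orders[["product"]].drop_duplicates()
               .reset_index(drop=True)
               .rename_axis("product_key").reset_index())
fact_orders = orders.merge(dim_product, on="product")[
    ["order_id", "product_key", "price", "country"]
]
```

The fact table keeps one row per event with a key into the dimension table, which is the basic shape a star schema optimises for.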
Build Pinterest's experiment analytics data pipeline which runs thousands of experiments per day and crunches billions of datapoints to provide valuable insights to improve the product.
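The scale is what makes that pipeline interesting, but its core computation is a grouped aggregation. A toy version of the idea in Pandas (the event fields and experiment names are invented; the real pipeline streams events through Kafka and Spark rather than loading them into one DataFrame):

```python
import pandas as pd

# Toy stream of experiment events: one row per user exposure
events = pd.DataFrame({
    "experiment": ["exp_a", "exp_a", "exp_a", "exp_a"],
    "variant":    ["control", "control", "treatment", "treatment"],
    "clicked":    [0, 1, 1, 1],
})

# Per-variant click-through rate: the kind of metric an
# experiment analytics pipeline computes over billions of events
ctr = events.groupby(["experiment", "variant"])["clicked"].mean()
```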
Module 1: Data Formats and Processing Libraries
- JSON, CSV, XLSX and YAML
- Tabular Data
- Pandas Dataframes
- Advanced Dataframe Operations
- Data Cleaning in Pandas
- NumPy
- Missing Data
Module 2: Web APIs
- Basics of APIs and Communication Protocols
- Working with API Requests
- FastAPI
- Routing with FastAPI
- Sending Data to FastAPI
Module 3: SQL
- What is SQL?
- SQL Setup
- SQL Tools Setup
- SQL Commands
- SQL Best Practices
- SELECT and Sorting
- The WHERE Clause
- CRUD Creating Tables
- CRUD Altering Tables
- SQL JOINs
- SQL JOIN Types
- SQL Common Aggregations
- SQL GROUP BY
- Creating Subqueries
- Types of Subqueries
- CRUD Subquery Operations
- Common Table Expressions (CTEs)
- psycopg2 and SQLAlchemy
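To give a flavour of the later SQL topics, here is a common table expression feeding a grouped aggregation, run against an in-memory SQLite database (the table and data are invented for the example; the course itself works with PostgreSQL tooling such as psycopg2):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('north', 100), ('north', 50), ('south', 30);
""")

# A CTE computes per-region totals; the outer query filters on them
rows = conn.execute("""
    WITH region_totals AS (
        SELECT region, SUM(amount) AS total
        FROM sales
        GROUP BY region
    )
    SELECT region, total FROM region_totals WHERE total > 40
""").fetchall()
```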
Module 4: Essential Cloud Technology
- What is the Cloud
- Essential Cloud Concepts
- AWS Identity and Access Management
- AWS CLI
- Introduction to Amazon S3
- S3 Objects and boto3
- Amazon EC2
- Virtual Private Cloud
- IAM Roles
- Amazon RDS
- Billing in AWS
Module 5: Big Data Engineering Foundations
- The Data Engineering Landscape
- Data Pipelines
- Data Ingestion and Data Storage
- Enterprise Data Warehouses
- Batch vs Real-Time Processing
- Structured, Unstructured and Complex Data
Module 6: Data Ingestion
- Principles of Data Ingestion
- Batch Processing
- Real-Time Data Processing
- Kafka Essentials
- Kafka-Python
- Streaming in Kafka
Module 7: Data Wrangling and Transformation
- Data Transformations: ELT & ETL
- Apache Spark and PySpark
- Distributed Processing with Spark
- Integrating Spark & Kafka
- Integrating Spark & AWS S3
- Spark Streaming
Module 8: Data Orchestration
- Apache Airflow
- Integrating Airflow & Spark
Module 9: Advanced Cloud Technologies and Databricks
- MSK and MSK Connect
- AWS API Gateway
- Integrating API Gateway with Kafka
- Databricks Essentials
- Integrating Databricks with Amazon S3
- AWS MWAA
- Orchestrating Databricks Workloads on MWAA
- AWS Kinesis
- Integrating Databricks with AWS Kinesis
- Integrating API Gateway with Kinesis
Data Analytics
Learn how to discover and analyse raw data to derive useful patterns, trends, relationships and insights, and communicate these in a visual manner to enhance decision making.
Use statistical methods and visualisation techniques to dive into data for your choice of client from the retail, manufacturing or finance industries. Load the data from a remote relational database. Clean and transform the data using Pandas. Extract actionable insights from the data using statistical and visualisation techniques to help the client make key business decisions.
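In miniature, the statistical side of that workflow looks like this (the column name and values are invented for illustration):

```python
import pandas as pd

# Invented sample of client revenue data, with one missing value
df = pd.DataFrame({"revenue": [120.0, 95.0, None, 210.0, 130.0]})

# Descriptive statistics for the sample (count, mean, quartiles, ...)
summary = df["revenue"].describe()

# Handle the missing value by imputing the median
median = df["revenue"].median()
df["revenue"] = df["revenue"].fillna(median)
```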
Develop a comprehensive Power BI report for a multinational retailer. As a Data Analyst, you'll cater to the needs of the C-suite executives, delivering key insights. Load data from diverse sources. Construct an efficient data model and formulate advanced measures. Design a user-friendly, multi-page report featuring slicers, pagination and an interactive map for an enriched experience. This project emphasises the value of visual analytics in communicating critical business data to decision-makers.
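Measures in Power BI are written in the DAX expression language. A sketch of the kind of measures the project involves (the table and column names are invented for illustration):

```dax
Total Sales = SUM(Sales[Amount])

Profit Margin = DIVIDE(SUM(Sales[Profit]), SUM(Sales[Amount]))
```

DIVIDE is used instead of the `/` operator because it returns blank rather than an error when the denominator is zero.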
Module 1: Data Formats and Processing Libraries
- JSON, CSV, XLSX and YAML
- Tabular Data
- Pandas Dataframes
- Advanced Dataframe Operations
- Data Cleaning in Pandas
- NumPy
- Missing Data
Module 2: SQL
- What is SQL?
- SQL Setup
- SQL Tools Setup
- SQL Commands
- psycopg2 and SQLAlchemy
Module 3: Essential Cloud Technology
- What is the Cloud
- Essential Cloud Concepts
- AWS Identity and Access Management
- AWS CLI
- Introduction to Amazon S3
- S3 Objects and boto3
- Amazon EC2
- Virtual Private Cloud
- IAM Roles
- Amazon RDS
- Billing in AWS
Module 4: Data Visualisation and EDA
- Populations, Samples and Descriptive Statistics
- Handling Null Values and Skewed Distributions
- Common Visualisation Types
- Exploratory Data Analysis
Module 5: Microsoft Power BI
- Loading and Transforming Data
- Building a Data Model
- Introduction to DAX Expression Language
- Creating Reports and Visuals
- Pagination, Bookmarks and Interactions
Cloud Engineering
Learn how to design, build and manage scalable and reliable cloud-based infrastructure and services to support various applications and workloads.
Build a complete data solution for a multinational organisation, from data acquisition to analysis. Write Python code to extract large datasets from multiple data sources. Utilise the power of Pandas to clean and analyse the data. Build a star-schema database for optimised data storage and access. Perform complex SQL queries to extract valuable insights and inform the organisation's decisions.
Demonstrate your cloud engineering skills by designing and deploying a Microsoft Azure-based database system, including migration, disaster recovery simulations, and geo-replication, to enhance data management and availability.
Module 1: Data Formats and Processing Libraries
- JSON, CSV, XLSX and YAML
- Tabular Data
- Pandas Dataframes
- Advanced Dataframe Operations
- Data Cleaning in Pandas
- NumPy
- Missing Data
Module 2: Web APIs
- Basics of APIs and Communication Protocols
- Working with API Requests
- FastAPI
- Routing with FastAPI
- Sending Data to FastAPI
Module 3: SQL
- What is SQL?
- SQL Setup
- SQL Tools Setup
- SQL Commands
- SQL Best Practices
- SELECT and Sorting
- The WHERE Clause
- CRUD Creating Tables
- CRUD Altering Tables
- SQL JOINs
- SQL JOIN Types
- SQL Common Aggregations
- SQL GROUP BY
- Creating Subqueries
- Types of Subqueries
- CRUD Subquery Operations
- Common Table Expressions (CTEs)
- psycopg2 and SQLAlchemy
Module 4: Essential Cloud Technology
- What is the Cloud
- Essential Cloud Concepts
- AWS Identity and Access Management
- AWS CLI
- Introduction to Amazon S3
- S3 Objects and boto3
- Amazon EC2
- Virtual Private Cloud
- IAM Roles
- Amazon RDS
- Billing in AWS
Module 5: Azure Cloud Essentials
- What is the Cloud?
- Essential Cloud Concepts
- What is Azure?
- Resources and Resource Groups
- Azure Cloud Shell and CLI
Module 6: Azure Compute and Security Services
- Azure Virtual Machines
- Azure Active Directory
- Azure SQL
- Azure SQL Database
- Azure Storage
Module 7: Azure Database Management
- SQL Server Databases on Azure VMs
- Azure Data Studio and Database Migration
- SQL Server Database Backups
- Disaster Recovery in Azure
- Azure AD for Azure SQL Database
DevOps Engineering
Learn how to streamline software delivery, emphasising automation, collaboration, continuous integration and deployment, infrastructure as code, and a culture of continuous improvement.
Demonstrate your DevOps Engineering skills by building a DevOps pipeline to containerise, deploy and manage a web application on Azure Kubernetes Service (AKS), utilising tools such as Git, Docker, Kubernetes and Terraform, while fostering skills in version control, infrastructure as code, and cloud-native deployment practices.
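The containerisation step, for instance, starts from a Dockerfile. A minimal sketch for a Python web application (the file names and port are illustrative, not the official project specification):

```dockerfile
# Small base image with Python preinstalled
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and declare how to run it
COPY . .
EXPOSE 5000
CMD ["python", "app.py"]
```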
Module 1: Containerisation with Docker
- Introduction to Containerisation and Docker
- Creating Dockerfiles
- Building, Running and Pushing Docker Containers
- Docker Volumes
- Docker Compose
Module 2: Azure Cloud Essentials
- What is the Cloud?
- Essential Cloud Concepts
- What is Azure?
- Resources and Resource Groups
- Azure Cloud Shell and CLI
Module 3: Kubernetes
- Kubernetes Basics
- Kubernetes Workloads
- Kubernetes Networking
- Kubernetes Storage & StatefulSets
- Overview of Azure Kubernetes Service (AKS)
- Security in Kubernetes and AKS
Module 4: Infrastructure as Code with Terraform
- Terraform Basics
- Terraform Variables
- Terraform Modules
- Defining Azure Networking Components with Terraform
- AKS Resources with Terraform
- Terraform Deployments and CI/CD
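As a taste of the infrastructure-as-code approach, a stripped-down Terraform definition of an AKS cluster might look like this (the names, region and node size are illustrative):

```hcl
resource "azurerm_resource_group" "aks" {
  name     = "rg-aks-demo"
  location = "uksouth"
}

resource "azurerm_kubernetes_cluster" "main" {
  name                = "aks-demo"
  location            = azurerm_resource_group.aks.location
  resource_group_name = azurerm_resource_group.aks.name
  dns_prefix          = "aksdemo"

  default_node_pool {
    name       = "default"
    node_count = 2
    vm_size    = "Standard_DS2_v2"
  }

  identity {
    type = "SystemAssigned"
  }
}
```

Because the cluster is declared rather than clicked together, the same definition can be versioned in Git and applied repeatably from a CI/CD pipeline.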
Module 5: CI/CD with Azure DevOps
- Introduction to Version Control in Azure DevOps
- Azure DevOps Build Pipelines
- Artefact Management in Azure DevOps
- Azure DevOps Release Pipelines
- Integrating Terraform with Azure DevOps
- CI/CD Testing and Validation
Module 6: Kubernetes Deployments
- Kubernetes Manifests
- Kubernetes Deployment Strategies
- CI/CD with Kubernetes
Module 7: Monitoring and Logging with Azure Monitor
- Introduction to Azure Monitor
- Azure Monitor for AKS
- Configuring Logging with Azure Monitor
Module 8: Security with AKS
- AKS Security
- Network Policies in AKS
- Managing Secrets using Azure Key Vault
Career support
Work with our outcomes team to launch your new career.
Programme Schedule
The programme is fully remote and you can work flexibly through the morning, afternoon or evening, but you are expected to commit at least 20 hours per week to the training.
One to one live support available Monday - Friday 9AM - 9PM.
Launch your career with AiCore support
Career playbook
Have your CV, LinkedIn and GitHub portfolio optimised. Learn how to source your ideal roles.
Get referred by alumni
Our alumni network hires directly from AiCore. Over 15% of AiCore graduates get hired this way.
Interview coaching
Feel 100% confident going into any hiring process. Our team will prepare you with general and technical mock interviews.
Industry certification
Get industry-recognised certifications in the most in-demand skills and tools.
Success stories
Eligibility criteria
To qualify for funding by the government, you must meet the following criteria:
- Aged 19 or older on 31st August 2022
- Have the right to work in the UK. This can be checked on www.gov.uk/view-right-to-work
- Meet residency requirements - more info is available here
- Live in England
- Must be looking to secure a tech job after the bootcamp
Frequently Asked Questions
Who are we?
AiCore is a specialist AI & Data career accelerator. We deliver an immersive programme that will help launch your career in AI & Data. To date, over 2,500 students have successfully graduated from our programme.
Where will I take classes?
Our programmes are 100% online and available on demand.
Do I need prior knowledge or an academic degree to join?
No, we don’t require any specific degrees or certifications to join the programme.
Do I receive a certification when I complete the programme?
Yes, we offer the following certification for each specialism:
Data Engineering: Databricks Certified Data Engineer Associate
Cloud Engineering: Microsoft Azure Fundamentals (AZ-900)
Data Analytics: Microsoft Power BI Data Analyst Associate (PL-300)
DevOps Engineering: HashiCorp Certified Terraform Associate (003)