OpenHack – Modern Data Warehousing (OHMDW)


Course Overview

Please note: attendees work together in teams of at least 5, and the advertised pricing is per team of 5.

Who should attend

  • App Developers
  • Customers who need to handle and store data from multiple sources
  • Customers who need a DevOps solution that accounts for data management

Prerequisites

Knowledge Prerequisites

To be successful and get the most out of this OpenHack, participants should have existing knowledge of relational database structures and concepts (e.g. tables, joins, SQL) and experience with either SSIS or programming languages such as Scala or Python. Previous experience with creating ETL pipelines, source control management, automated testing, and build and release automation will help you advance more quickly. Familiarity with Azure fundamentals is also recommended.

Tooling Prerequisites

To avoid any delays with downloading or installing tooling, you are encouraged to have the following ready to go!

  • Install your choice of Integrated Development Environment (IDE) software, e.g. Visual Studio / Visual Studio Code / Eclipse / IntelliJ
  • Download Azure CLI
  • SQL Server Database Tooling (Azure Data Studio/SSMS)
  • SQL Server Data Tools (including BI tools) – If using Visual Studio for IDE

Post Learning Recommendations

  • Implement a Data Warehouse with Azure SQL Data Warehouse
  • Large-Scale Data Processing with Azure Data Lake Storage Gen2
  • Core Cloud Services - Azure data storage options
  • Azure for the Data Engineer
  • Perform data engineering with Azure Databricks
  • Architect a data platform in Azure

Course Objectives

By the end of the OpenHack, attendees will have built a fully operating Modern Data Warehouse with a corresponding CI/CD pipeline that accounts for data management and meets demanding data consumption requirements such as reliability, scalability, and maintainability.

  • A modern cloud solution that delivers higher reliability, scalability, and maintainability for large amounts of data
  • An introduction to new data storage services that meet the needs of unique and multiple data streams

Course Content

This OpenHack enables attendees to develop, implement, and operationalize ETL pipelines for a multi-source data warehouse solution on Microsoft Azure. It simulates a real-world scenario in which an online DVD company’s data arrives from many disparate sources and must be stored in a single location, made sense of, and then used to feed a wide variety of downstream systems. During the “hacking”, attendees focus on (1) systematically ingesting and securing data from multiple sources, and (2) transforming that data to fit the business’s required schema while monitoring dataflow with appropriate levels of DevOps testing.

Technical Scenarios

  • Disparate data sources: ingest data from multiple, differing data sources into a single location with one normalized schema for standardized downstream use
  • Security of data: protect data at all times while using ETL pipelines
  • DevOps: learn how to use a production pipeline to handle the data layer
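To give a flavor of the first scenario, here is a minimal Python sketch of normalizing two differing source formats (a CSV export and a JSON feed) into one shared schema before loading. The field names, sample feeds, and helper functions are illustrative assumptions, not part of the OpenHack materials.

```python
import csv
import io
import json

# Hypothetical target schema for the single, normalized location.
TARGET_SCHEMA = ("customer_id", "title", "rented_on")

def from_csv(text):
    """Parse a CSV export whose column headers differ from the target schema."""
    rows = csv.DictReader(io.StringIO(text))
    return [
        {"customer_id": int(r["CustID"]),
         "title": r["DVDTitle"],
         "rented_on": r["RentalDate"]}
        for r in rows
    ]

def from_json(text):
    """Parse a JSON feed that nests the fields we need."""
    return [
        {"customer_id": rec["customer"]["id"],
         "title": rec["item"],
         "rented_on": rec["date"]}
        for rec in json.loads(text)
    ]

def ingest(*batches):
    """Merge normalized batches into one list and verify the shared schema."""
    merged = [row for batch in batches for row in batch]
    assert all(tuple(row) == TARGET_SCHEMA for row in merged)
    return merged

# Illustrative sample feeds (invented data).
csv_feed = "CustID,DVDTitle,RentalDate\n7,Alien,2019-03-01\n"
json_feed = '[{"customer": {"id": 9}, "item": "Heat", "date": "2019-03-02"}]'
rentals = ingest(from_csv(csv_feed), from_json(json_feed))
```

In the OpenHack itself this kind of normalization is done at scale with Azure Data Factory and Azure Databricks rather than hand-written parsers; the sketch only shows the shape of the problem.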

Technologies

Azure Data Lake Storage, Azure Data Factory, Azure Databricks, Azure DevOps, SQL Data Warehouse

Pricing & Delivery Methods

Online training

Duration
3 days

Price
  • € 2,195
Classroom training

Duration
3 days

Price
  • Netherlands: € 2,195
  • Belgium: € 2,195

This training is currently not available in the open schedule, but there is a good chance we can still offer you a suitable solution. We would be glad to hear your specific requirements. You can reach us at 030 658 2131 or info@flane.nl. We are happy to help!