Data Engineering
Migrate on-premises MongoDB databases to Cosmos DB
You will learn about the benefits and processes of moving a MongoDB database to the Azure Cosmos DB for MongoDB API.
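Because Azure Cosmos DB for MongoDB speaks the MongoDB wire protocol, existing driver code typically needs little more than a new connection string after migration. The following is a minimal sketch assuming the pymongo driver; the connection string, database, and collection names are placeholders.

    from pymongo import MongoClient

    # Point the same application code at the Cosmos DB account instead of the
    # on-premises MongoDB server (placeholder connection string).
    client = MongoClient(
        "mongodb://<account>:<key>@<account>.mongo.cosmos.azure.com:10255/"
        "?ssl=true&retrywrites=false"
    )

    orders = client["shopdb"]["orders"]  # hypothetical database and collection
    orders.insert_one({"item": "widget", "qty": 3})
    print(orders.count_documents({"item": "widget"}))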
Migrate on-premises MySQL databases to Azure Database for MySQL
In this module, you'll learn about the benefits of migrating MySQL workloads to Azure, see how to create an Azure Database for MySQL instance, and learn how to migrate on-premises MySQL databases to Azure.
Migrate on-premises PostgreSQL databases to Azure Database for PostgreSQL
In this module, you'll learn about the benefits of migrating PostgreSQL workloads to Azure, see how to create an Azure Database for PostgreSQL instance, and learn how to migrate on-premises PostgreSQL databases to Azure.
Migrate open-source databases to Azure
Learn how to migrate open-source workloads from PostgreSQL and MySQL databases to the equivalent services in Azure. Explore the processes and tools, and learn how to validate application dependencies to support a successful migration.
Migrate to Azure Database for PostgreSQL flexible server
Azure Database for PostgreSQL Flexible Server supports data migration from PostgreSQL servers. This module covers online and offline migration tools and methods, helping you choose the right approach for your scenario.
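As an illustration of the offline method, the sketch below shells out to the standard pg_dump and pg_restore client tools from Python; the host names, database, and user are placeholders, and the online options covered in the module are not shown.

    import subprocess

    DUMP_FILE = "appdb.dump"

    # Dump the on-premises database in custom format.
    # Credentials are supplied via PGPASSWORD or ~/.pgpass.
    subprocess.run(
        ["pg_dump", "-Fc", "-h", "onprem-host", "-U", "pgadmin",
         "-d", "appdb", "-f", DUMP_FILE],
        check=True,
    )

    # Restore the dump into the flexible server instance.
    subprocess.run(
        ["pg_restore", "-h", "myserver.postgres.database.azure.com",
         "-U", "azureadmin", "-d", "appdb", "--no-owner", DUMP_FILE],
        check=True,
    )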
Migrating Cassandra and MongoDB workloads to Cosmos DB
This learning path teaches students how to migrate Cassandra and MongoDB workloads to Cosmos DB.
Migrating SSIS Packages to Azure Data Factory
Lift and shift your SSIS Packages to the cloud. In this course, you will learn the benefits of cloud migration and how to run, schedule, secure and monitor your SSIS Packages with Azure Data Factory v2.
ML Pipelines on Google Cloud
In this course, you will learn from ML engineers and trainers who work on state-of-the-art ML pipeline development at Google Cloud. The first few modules cover TensorFlow Extended (TFX).
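For orientation, here is a rough sketch of what a minimal TFX pipeline looks like when run locally, assuming the tfx package; the paths and pipeline name are placeholders, and a real pipeline would add components such as Trainer, Evaluator, and Pusher.

    from tfx import v1 as tfx

    # Ingest CSV data and compute dataset statistics.
    example_gen = tfx.components.CsvExampleGen(input_base="data/")
    statistics_gen = tfx.components.StatisticsGen(
        examples=example_gen.outputs["examples"]
    )

    # Local ML Metadata store backing the pipeline (placeholder path).
    metadata_config = tfx.orchestration.metadata.sqlite_metadata_connection_config(
        "metadata.db"
    )

    pipeline = tfx.dsl.Pipeline(
        pipeline_name="sketch_pipeline",
        pipeline_root="pipeline_root/",
        metadata_connection_config=metadata_config,
        components=[example_gen, statistics_gen],
    )

    tfx.orchestration.LocalDagRunner().run(pipeline)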
Modeling Streaming Data for Processing with Apache Beam
The Apache Beam unified model allows us to process batch as well as streaming data using the same API. Several execution backends such as Google Cloud Dataflow, Apache Spark, and Apache Flink are compatible with Beam.
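To make the unified-model claim concrete, here is a minimal Beam pipeline sketch in Python, assuming the apache-beam package and a local runner; the file names are placeholders. Replacing the bounded text source with an unbounded one such as Pub/Sub turns the same transforms into a streaming job, and the pipeline can be submitted unchanged to the Dataflow, Spark, or Flink runners.

    import apache_beam as beam

    with beam.Pipeline() as pipeline:
        (
            pipeline
            # Bounded source; an unbounded source would make this a streaming
            # pipeline without changing the transforms below.
            | "Read" >> beam.io.ReadFromText("events.txt")
            | "ExtractKey" >> beam.Map(lambda line: line.split(",")[0])
            | "Count" >> beam.combiners.Count.PerElement()
            | "Format" >> beam.MapTuple(lambda key, count: f"{key},{count}")
            | "Write" >> beam.io.WriteToText("counts")
        )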
Modeling Streaming Data for Processing with Apache Spark Structured Streaming
Streaming analytics can be difficult to implement. This course will teach you to model real-time data processing with Spark Structured Streaming.
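As a taste of that model, here is a minimal Structured Streaming sketch assuming a local PySpark installation; the built-in rate source stands in for a real stream, and a production job would write to a durable sink rather than the console.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, window

    spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

    # Unbounded input: the rate source continuously emits (timestamp, value) rows.
    events = spark.readStream.format("rate").option("rowsPerSecond", 5).load()

    # Model the stream as a windowed aggregation: row counts per 10-second window.
    counts = events.groupBy(window(col("timestamp"), "10 seconds")).count()

    query = (
        counts.writeStream
        .outputMode("complete")
        .format("console")
        .start()
    )
    query.awaitTermination()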
Modern Data Architecture at a Challenger Bank: Fireside Chat with Jason Maude, Starling Bank
In this session, Jason Maude of Starling Bank answers questions posed by Big Data LDN about how advances in technology, modern team structures, and modern processes allow challenger banks to compete on a level playing field with big banks.
Modernizing Data Lakes and Data Warehouses with GCP
The two key components of any data pipeline are data lakes and warehouses. This course highlights use cases for each type of storage and dives into the available data lake and warehouse solutions on Google Cloud Platform in technical detail.
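The split shows up directly in code. Below is a minimal sketch assuming the google-cloud-storage and google-cloud-bigquery client libraries; the bucket, dataset, and table names are hypothetical. Raw files land in a Cloud Storage data lake as-is, while structured, analysis-ready data is queried in the BigQuery warehouse.

    from google.cloud import bigquery, storage

    # Data lake: land a raw file in Cloud Storage without transforming it.
    storage_client = storage.Client()
    bucket = storage_client.bucket("my-raw-data-lake")  # hypothetical bucket
    bucket.blob("landing/events.json").upload_from_filename("events.json")

    # Data warehouse: query structured, analysis-ready data in BigQuery.
    bq_client = bigquery.Client()
    rows = bq_client.query(
        "SELECT event_type, COUNT(*) AS n "
        "FROM `my_project.analytics.events` "  # hypothetical table
        "GROUP BY event_type"
    ).result()
    for row in rows:
        print(row.event_type, row.n)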