
Codeless Data Engineering in GCP Beginner to Advanced

Genre: eLearning | MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 KHz
Language: English | Size: 2.19 GB | Duration: 4h 37m

Step-by-step guide to building four data pipelines in Google Cloud using Datastream, Data Fusion, Dataprep, Dataflow, and more

What you'll learn
How to build no-code (codeless) data pipelines in Google Cloud
You will learn to build real-world data pipelines using tools like Data Fusion, Dataprep, and Dataflow
You will learn to transform data using Data Fusion
You will acquire good data engineering skills in Google Cloud
Working with the BigQuery data warehouse in Google Cloud

Requirements
Basic understanding of cloud computing
An active Google account
A basic understanding of what a data lake and a data warehouse are is helpful but not required
Description
In this course, we will create a data lake using Google Cloud Storage and bring data warehouse capabilities to it, forming a lakehouse architecture with Google BigQuery. We will build four no-code data pipelines using services such as Datastream, Dataflow, Dataprep, Pub/Sub, Data Fusion, Cloud Storage, and BigQuery.
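
To make the architecture concrete, here is a minimal sketch of those two building blocks, assuming the google-cloud-storage and google-cloud-bigquery client libraries (the project, bucket, and dataset names are hypothetical):

    # Data lake + warehouse layer: a Cloud Storage bucket and a BigQuery dataset.
    # Requires: pip install google-cloud-storage google-cloud-bigquery
    from google.cloud import bigquery, storage

    PROJECT_ID = "my-gcp-project"  # hypothetical project ID

    # The data lake: a Cloud Storage bucket to land raw files in
    storage_client = storage.Client(project=PROJECT_ID)
    storage_client.create_bucket("my-datalake-bucket", location="US")

    # The warehouse layer on top of the lake: a BigQuery dataset
    bq_client = bigquery.Client(project=PROJECT_ID)
    dataset = bigquery.Dataset(f"{PROJECT_ID}.lakehouse")
    dataset.location = "US"
    bq_client.create_dataset(dataset, exists_ok=True)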

The course follows the logical progression of a real-world project, with hands-on experience setting up a data lake, creating data pipelines for ingestion, and transforming your data in preparation for analytics and reporting.

Chapter 1

We will set up a project in Google Cloud

Introduction to Google Cloud Storage

Introduction to Google BigQuery

Chapter 2 - Data Pipeline 1

We will create a Cloud SQL database and populate it with data before we start performing complex ETL jobs.
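
As background on what that seeding step amounts to (the course itself stays codeless), here is a minimal sketch using the Cloud SQL Python Connector; the instance name, table, and credentials are all hypothetical:

    # Connect to a Cloud SQL (MySQL) instance and seed a small table.
    # Requires: pip install "cloud-sql-python-connector[pymysql]"
    from google.cloud.sql.connector import Connector

    connector = Connector()
    conn = connector.connect(
        "my-gcp-project:us-central1:my-sql-instance",  # hypothetical instance
        "pymysql",
        user="demo",
        password="demo-password",
        db="salesdb",
    )
    with conn.cursor() as cur:
        cur.execute(
            "CREATE TABLE IF NOT EXISTS orders (id INT PRIMARY KEY, amount DECIMAL(10, 2))"
        )
        cur.execute("INSERT INTO orders VALUES (1, 19.99), (2, 42.50)")
    conn.commit()
    conn.close()
    connector.close()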

Use Datastream change data capture (CDC) to stream data from our Cloud SQL database into our data lake built on Cloud Storage

Add a Pub/Sub notification to our bucket
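
In code terms, attaching such a notification looks roughly like this sketch with the google-cloud-storage library (the bucket and topic names are hypothetical):

    # Publish object-change events from the data-lake bucket to a Pub/Sub topic.
    # Requires: pip install google-cloud-storage
    from google.cloud import storage

    client = storage.Client(project="my-gcp-project")
    bucket = client.bucket("my-datalake-bucket")
    notification = bucket.notification(
        topic_name="datalake-events",
        payload_format="JSON_API_V1",
        event_types=["OBJECT_FINALIZE"],  # fire when a new object lands
    )
    notification.create()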

Create a streaming Dataflow job to load the data into BigQuery
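
For intuition, here is what an equivalent streaming pipeline looks like in the Apache Beam Python SDK, which is what Dataflow runs under the hood (the subscription and table names are hypothetical):

    # Read JSON messages from Pub/Sub and stream them into BigQuery.
    # Requires: pip install "apache-beam[gcp]"
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Add runner="DataflowRunner", region, and temp_location to run on Dataflow.
    options = PipelineOptions(streaming=True, project="my-gcp-project")

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-gcp-project/subscriptions/datalake-events-sub")
            | "ParseJson" >> beam.Map(json.loads)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:lakehouse.orders",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )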

Chapter 3 - Data Pipeline 2

Introduction to Google Data Fusion

Author and monitor ETL jobs for transforming our data and moving it between the different zones of our data lake

We will explore the use of Wrangler in Data Fusion for profiling and understanding our data before we start performing complex ETL jobs.

Clean and normalise data

Discover and govern data using metadata in Data Fusion

Chapter 4 - Data Pipeline 3

Introduction to Google Pub/Sub

Building a .NET application for publishing data to a Pub/Sub topic
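
The course builds this publisher in .NET; for a rough idea of the same publish flow, here is a minimal sketch in Python with the google-cloud-pubsub library (the project and topic names are hypothetical):

    # Publish a JSON message to a Pub/Sub topic.
    # Requires: pip install google-cloud-pubsub
    import json

    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("my-gcp-project", "orders-topic")

    payload = json.dumps({"id": 3, "amount": 7.25}).encode("utf-8")
    future = publisher.publish(topic_path, payload)  # data must be bytes
    print(f"Published message {future.result()}")    # result() returns the message ID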

Building a real-time data pipeline for streaming messages to BigQuery
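
The pipeline itself is built without code in the course; as background on what the sink side amounts to, here is a minimal sketch of BigQuery streaming inserts (the table name and rows are hypothetical):

    # Stream rows into a BigQuery table via the streaming-insert API.
    # Requires: pip install google-cloud-bigquery
    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")
    rows = [{"id": 3, "amount": 7.25}]
    errors = client.insert_rows_json("my-gcp-project.lakehouse.orders", rows)
    if errors:
        print("Insert errors:", errors)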

Chapter 5 - Data Pipeline 4

Introduction to Cloud Dataprep

Profile, author, and monitor ETL jobs for transforming our data using Dataprep

Who this course is for
Data Engineers
Data Architects looking to design data integration solutions in Google Cloud
Data Scientists, Data Analysts, and Database Administrators
Anyone looking to start a career as a Google Cloud Data Engineer


