
Thierry Njike

Member since: 2021

Silver League

18,895 points
Introduction to Large Language Models Earned Jul 25, 2025 EDT
Serverless Data Processing with Dataflow: Foundations Earned Jul 23, 2025 EDT
Introduction to Generative AI Earned Jul 23, 2025 EDT
Prepare Data for ML APIs on Google Cloud Earned Nov 1, 2023 EDT
Build a Data Warehouse with BigQuery Earned Nov 1, 2023 EDT
Engineer Data for Predictive Modeling with BigQuery ML Earned Oct 31, 2023 EDT
Build Batch Data Pipelines on Google Cloud Earned Oct 30, 2023 EDT
Build Data Lakes and Data Warehouses on Google Cloud Earned Oct 26, 2023 EDT
Google Cloud Big Data and Machine Learning Fundamentals Earned Oct 25, 2023 EDT
Preparing for your Professional Data Engineer Journey Earned Oct 24, 2023 EDT
Getting Started with Apache Kafka and Confluent Platform on Google Cloud Earned Oct 24, 2023 EDT
Cloud Hero Infra Skills Earned Aug 6, 2022 EDT
BigQuery for Data Analysis I Earned Jul 30, 2022 EDT

This is an introductory-level microlearning course that explores what large language models (LLMs) are, the use cases where they can be applied, and how you can use prompt tuning to enhance LLM performance. It also covers Google tools that help you develop your own generative AI apps.
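One ingredient of improving LLM output that the course touches on is supplying labeled examples in the prompt (few-shot prompting, a prompt-design technique related to prompt tuning). A minimal, library-free sketch of assembling such a prompt is below; the example reviews and labels are invented for illustration:

```python
# Sketch of few-shot prompt construction. All example texts and labels
# here are invented for illustration; a real application would send the
# resulting string to an LLM API.

FEW_SHOT_EXAMPLES = [
    ("The battery died after one day.", "negative"),
    ("Setup took two minutes and it just works.", "positive"),
]

def build_prompt(query: str) -> str:
    """Assemble a sentiment-classification prompt from labeled examples."""
    lines = ["Classify the sentiment of each review as positive or negative.", ""]
    for text, label in FEW_SHOT_EXAMPLES:
        lines.append(f"Review: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    # The model is expected to complete the final "Sentiment:" line.
    lines.append(f"Review: {query}")
    lines.append("Sentiment:")
    return "\n".join(lines)

prompt = build_prompt("Great screen, terrible keyboard.")
print(prompt)
```

The key design point is that the labeled examples steer the model's output format and task framing without any model retraining.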


This course is part 1 of a 3-course series on Serverless Data Processing with Dataflow. In this first course, we start with a refresher on what Apache Beam is and its relationship to Dataflow. Next, we discuss the Apache Beam vision and the benefits of the Beam Portability framework, which realizes the vision that developers can use their favorite programming language with their preferred execution backend. We then show how Dataflow allows you to separate compute and storage while saving money, and how Identity and Access Management (IAM) tools interact with your Dataflow pipelines. Lastly, we look at how to implement the right security model for your use case on Dataflow.
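The core Beam idea the paragraph describes, a pipeline defined as a graph of transforms and executed by a separately chosen runner, can be illustrated with a toy, Beam-inspired sketch in plain Python. This is not the real `apache_beam` API; the `Pipeline` and `DirectRunner` classes here are invented for illustration:

```python
# Toy illustration of the Beam model: a pipeline is a declarative chain of
# transforms, and an execution backend (e.g. Dataflow) is chosen separately.
# NOT the apache_beam API; all names are invented for illustration.

class Pipeline:
    def __init__(self):
        self.transforms = []

    def apply(self, fn):
        """Record a transform without executing it."""
        self.transforms.append(fn)
        return self

class DirectRunner:
    """Stand-in for an execution backend such as Dataflow."""
    def run(self, pipeline, data):
        for fn in pipeline.transforms:
            data = fn(data)
        return list(data)

p = Pipeline()
p.apply(lambda xs: (x.split() for x in xs)) \
 .apply(lambda xss: (w for xs in xss for w in xs)) \
 .apply(lambda ws: (w.upper() for w in ws))

result = DirectRunner().run(p, ["hello beam", "hello dataflow"])
print(result)  # ['HELLO', 'BEAM', 'HELLO', 'DATAFLOW']
```

Because the pipeline is just data describing transforms, a different runner class could execute the same graph on a different backend, which is the portability property the course highlights.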


This is an introductory-level microlearning course that explains what generative AI is, how it is used, and how it differs from traditional machine learning methods. It also covers Google tools that help you develop your own generative AI apps.


Complete the introductory Prepare Data for ML APIs on Google Cloud skill badge to demonstrate skills in the following: cleaning data with Dataprep by Trifacta, running data pipelines in Dataflow, creating clusters and running Apache Spark jobs in Dataproc, and calling ML APIs including the Cloud Natural Language API, Google Cloud Speech-to-Text API, and Video Intelligence API.


Complete the intermediate Build a Data Warehouse with BigQuery skill badge course to demonstrate skills in the following: joining data to create new tables, troubleshooting joins, appending data with unions, creating date-partitioned tables, and working with JSON, arrays, and structs in BigQuery.


Complete the intermediate Engineer Data for Predictive Modeling with BigQuery ML skill badge to demonstrate skills in the following: building data transformation pipelines to BigQuery using Dataprep by Trifacta; using Cloud Storage, Dataflow, and BigQuery to build extract, transform, and load (ETL) workflows; and building machine learning models using BigQuery ML.


In this intermediate course, you will learn to design, build, and optimize robust batch data pipelines on Google Cloud. Moving beyond fundamental data handling, you will explore large-scale data transformations and efficient workflow orchestration, essential for timely business intelligence and critical reporting. Get hands-on practice using Dataflow (for Apache Beam) and Dataproc Serverless (for Apache Spark), and tackle crucial considerations for data quality, monitoring, and alerting to ensure pipeline reliability and operational excellence. Basic knowledge of data warehousing, ETL/ELT, SQL, Python, and Google Cloud concepts is recommended.


While the traditional approaches of using data lakes and data warehouses can be effective, they have shortcomings, particularly in large enterprise environments. This course introduces the concept of a data lakehouse and the Google Cloud products used to create one. A lakehouse architecture uses open-standard data sources and combines the best features of data lakes and data warehouses, which addresses many of their shortcomings.


This course introduces the Google Cloud big data and machine learning products and services that support the data-to-AI lifecycle. It explores the processes, challenges, and benefits of building a big data pipeline and machine learning models with Vertex AI on Google Cloud.


This course helps learners create a study plan for the Professional Data Engineer (PDE) certification exam. They explore the breadth and scope of the domains covered in the exam, assess their exam readiness, and create an individual study plan.


Organizations around the world rely on Apache Kafka to integrate existing systems in real time and build a new class of event streaming applications that unlock new business opportunities. Google and Confluent have partnered to deliver the best event streaming service based on Apache Kafka and to build event-driven applications and big data pipelines on Google Cloud Platform. In this course, you will first learn how to deploy and create a streaming data pipeline with Apache Kafka, then try out the different capabilities of the Confluent Platform.


Get hands-on practice with Google Cloud! You will compete with your peers to see who can finish this game with the most points. Speed and accuracy are used to calculate your score: earn points by completing the labs accurately, plus bonus points for speed! Be sure to click "End" when you're done with each lab to be awarded your points.


Welcome, gamers! Learn BigQuery and Cloud SQL, all while having fun! You will compete to see who can finish the game with the highest score. Earn points by completing the steps in each lab, and get bonus points for speed! Be sure to click "End" when you're done with each lab to get the maximum points. All players will be awarded the game badge.
