Mauricio Caussin
Subscription date: 2025
Gold League
4346 points
In this course, you’ll take a comprehensive journey through the storage solutions available on Google Cloud, specifically tailored for AI and high-performance computing (HPC) workloads. You’ll learn how to choose the right storage for each stage of the ML lifecycle. You’ll explore how to optimize for I/O performance during training, manage massive datasets for data preparation, and serve model artifacts with low latency. Through practical examples and demonstrations, you’ll gain the expertise to design robust storage solutions that accelerate your AI innovation.
Complete the intermediate Perform Predictive Data Analysis in BigQuery skill badge course to demonstrate skills in the following: creating datasets in BigQuery by importing CSV and JSON files; harnessing the power of BigQuery with sophisticated SQL analytical concepts, including using BigQuery ML to train an expected goals model on soccer event data and evaluate the impressiveness of World Cup goals.
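The badge above centers on training a model with BigQuery ML. As a hedged sketch of the kind of statement involved: the dataset, table, and column names below (`soccer.events`, `is_goal`, `shot_distance`, `shot_angle`) are illustrative placeholders, not the course's actual schema.

```python
# Illustrative BigQuery ML statement for an expected-goals (xG) style model.
# All identifiers here are assumptions, not the course's real schema.
train_xg_model_sql = """
CREATE OR REPLACE MODEL `soccer.xg_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['is_goal']) AS
SELECT
  is_goal,        -- label: did the shot result in a goal?
  shot_distance,  -- feature: distance from the goal
  shot_angle      -- feature: angle to the goal mouth
FROM `soccer.events`
WHERE event_type = 'shot'
"""

# With the google-cloud-bigquery package installed and credentials
# configured, the model could be trained with something like:
#   from google.cloud import bigquery
#   bigquery.Client().query(train_xg_model_sql).result()
print(train_xg_model_sql.strip().splitlines()[0])
```

Logistic regression is a common starting point for xG models because the label (goal or no goal) is binary and the predicted probability is directly interpretable.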
This course provides a comprehensive guide to deploying, managing, and optimizing AI and high-performance computing (HPC) workloads on Google Cloud. Through a series of lessons and practical demonstrations, you’ll explore diverse deployment strategies, ranging from highly customizable environments using Google Compute Engine (GCE) to managed solutions like Google Kubernetes Engine (GKE). Specifically, you’ll learn how to create clusters and deploy inference workloads on GKE.
Welcome to the Cloud TPUs course. We'll explore the advantages and disadvantages of TPUs in various scenarios and compare different TPU accelerators to help you choose the right fit. You'll learn strategies to maximize performance and efficiency for your AI models and understand the significance of GPU/TPU interoperability for flexible machine learning workflows. Through engaging content and practical demos, we'll guide you step-by-step in leveraging TPUs effectively.
Complete the intermediate Mitigate Threats and Vulnerabilities with Security Command Center skill badge course to demonstrate skills in the following: preventing and managing environment threats, identifying and mitigating application vulnerabilities, and responding to security anomalies.
Curious about the powerful hardware behind AI? This module breaks down performance-optimized AI computers, showing you why they're so important. We'll explore how CPUs, GPUs, and TPUs make AI tasks super fast, what makes each one unique, and how AI software gets the most out of them. By the end, you'll know exactly how to pick the right GPU for your AI projects, helping you make smart choices for your AI workloads.
In this course, you learn how Gemini, a generative AI-powered collaborator from Google Cloud, helps analyze customer data and predict product sales. You also learn how to identify, categorize, and develop new customers using customer data in BigQuery. Using hands-on labs, you experience how Gemini improves data analysis and machine learning workflows. Note: Duet AI has been renamed Gemini, Google's next-generation model.
Ready to get started with AI Hypercomputers? This course makes it easy! We'll cover the basics of what they are and how they accelerate AI workloads. You'll learn about the different components inside a hypercomputer, like GPUs, TPUs, and CPUs, and discover how to pick the right deployment approach for your needs.
As the use of enterprise Artificial Intelligence and Machine Learning continues to grow, so too does the importance of building it responsibly. A challenge for many is that talking about responsible AI can be easier than putting it into practice. If you’re interested in learning how to operationalize responsible AI in your organization, this course is for you. In this course, you will learn how Google Cloud does this today, together with best practices and lessons learned, to serve as a framework for you to build your own responsible AI approach.
Complete the introductory Prompt Design in Vertex AI skill badge to demonstrate skills in the following: prompt engineering, image analysis, and multimodal generative techniques within Vertex AI. Discover how to craft effective prompts, guide generative AI output, and apply Gemini models to real-world marketing scenarios.
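As a hedged sketch of the prompt-design idea the badge covers: the function below composes a structured prompt for a marketing scenario. The helper name, wording, and the model name in the comment are all illustrative assumptions; a real run would need the Vertex AI SDK and configured credentials.

```python
# Illustrative prompt-design helper for a marketing use case.
# Everything here is a placeholder sketch, not the course's own code.
def build_marketing_prompt(product: str, audience: str) -> str:
    """Compose a structured prompt that guides generative AI output
    by fixing a role, a task, a length, and a tone."""
    return (
        "You are a marketing copywriter.\n"
        f"Write a two-sentence tagline for {product}, "
        f"aimed at {audience}. Keep the tone upbeat."
    )

prompt = build_marketing_prompt("a reusable water bottle", "college students")

# With Vertex AI configured, the prompt could be sent with something like:
#   from vertexai.generative_models import GenerativeModel
#   response = GenerativeModel("gemini-1.5-pro").generate_content(prompt)
print(prompt)
```

Pinning the role, output length, and tone in the prompt is the basic prompt-engineering move: it narrows the model's output space before any parameters are tuned.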
This is an introductory-level microlearning course aimed at explaining what responsible AI is, why it's important, and how Google implements responsible AI in their products. It also introduces Google's 3 AI principles.
This is an introductory-level microlearning course that explores what large language models (LLMs) are, the use cases where they can be applied, and how you can use prompt tuning to enhance LLM performance. It also covers Google tools to help you develop your own Gen AI apps.
This is an introductory-level microlearning course aimed at explaining what generative AI is, how it is used, and how it differs from traditional machine learning methods. It also covers Google tools to help you develop your own Gen AI apps.