Vertex AI is a managed machine learning platform that helps you build, deploy, and scale machine learning models faster and easier.
GitHub: https://github.com/TrilokiDA/Vertex-AI/tree/main
Vertex AI: Pipelines for your MLOps workflows by Márton Kodok
In recent years, one of the biggest trends in application development has been the rise of machine learning solutions, tools, and managed platforms. Vertex AI is a managed, unified ML platform for all your AI workloads. On the MLOps side, Vertex AI Pipelines lets you adopt experiment pipelining beyond the classic build, train, evaluate, and deploy cycle of a model. It is engineered for data scientists and data engineers, and it’s a tremendous help for teams that don’t have DevOps or sysadmin engineers, as infrastructure management overhead has been almost completely eliminated.
Based on practical examples, we will demonstrate how Vertex AI Pipelines scores high in terms of developer experience, how it fits custom ML needs, and how to analyze the results. It’s a toolset for a fully-fledged machine learning workflow: a sequence of steps in the model development and deployment cycle, such as data preparation/validation, model training, hyperparameter tuning, model validation, and model deployment. Vertex AI comes with all standard resources plus an ML metadata store, a fully managed feature store, and a fully managed pipelines runner.
Vertex AI Pipelines is a managed serverless toolkit, which means you don't have to fiddle with infrastructure or back-end resources to run workflows.
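The stages the abstract lists (data preparation/validation, training, model validation, gated deployment) can be illustrated with a framework-free toy in plain Python. This is a sketch of the control flow only, not the Vertex AI SDK: every step body is a placeholder, and on Vertex AI each step would be a pipeline component running on managed infrastructure.

```python
# Toy pipeline mirroring the stages described above:
# data validation -> training -> evaluation -> gated deployment.

def validate_data(rows):
    """Fail fast if the input is empty or malformed."""
    if not rows or any(len(r) != 2 for r in rows):
        raise ValueError("invalid training data")
    return rows

def train(rows):
    """Stand-in 'model': predict the mean label."""
    labels = [y for _, y in rows]
    return {"mean": sum(labels) / len(labels)}

def evaluate(model, rows):
    """Mean absolute error of the stand-in model."""
    return sum(abs(y - model["mean"]) for _, y in rows) / len(rows)

def run_pipeline(rows, deploy_threshold=1.0):
    data = validate_data(rows)
    model = train(data)
    mae = evaluate(model, data)
    # Deployment is gated on the evaluation metric, just as a real
    # pipeline would gate the deploy step on model validation.
    deployed = mae <= deploy_threshold
    return {"model": model, "mae": mae, "deployed": deployed}

result = run_pipeline([("a", 1.0), ("b", 2.0), ("c", 3.0)])
print(result["deployed"])  # → True (MAE of 2/3 is under the threshold)
```

In a real Vertex AI pipeline each of these functions would become a containerized component, and the runner would handle scheduling, caching, and lineage tracking via the ML metadata store.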
Intro to Vertex AI, unified MLOps platform for Data Scientists & ML Engineers by Daniel Zivkovic
#MLOps is a hot buzzword, just like #DevOps before it. It sparked a gold rush for software vendors, so it's hard to choose the best tool for your needs. Vertex AI is a unified MLOps platform for the entire #AI #workflow on #GoogleCloud. It is the 3rd iteration of the Google Cloud #ML platform (since its original launch), and we think they did it right (this time).
That's why #ServerlessTO invited 2 AI/ML gurus from #GCP (Jarek Kazmierczak & Brian Kang) to introduce #VertexAI to you.
The lecture recording with Q&A is at https://youtu.be/X1S7360ip-k
MEETUP "CODE-ALONG" RESOURCES
Vertex AI Workbench - Managed and User-managed Notebooks
https://cloud.google.com/vertex-ai/docs/workbench/managed/quickstarts
Example that the training code was based on - Fashion MNIST dataset
https://www.tensorflow.org/tutorials/keras/classification
Hyperparameter tuning codelab
https://codelabs.developers.google.com/vertex_hyperparameter_tuning
Vertex pipeline codelabs
https://codelabs.developers.google.com/vertex-pipelines-intro
https://codelabs.developers.google.com/vertex-pipelines-custom-model
CI/CD slides
https://github.com/shivajid/MLOpsCICD/blob/master/presentation/AI%20Workshop%20Day4.pdf
CI/CD GitHub example
https://github.com/shivajid/MLOpsCICD
Model monitoring example
https://github.com/GoogleCloudPlatform/vertex-ai-samples/blob/master/notebooks/official/model_monitoring/model_monitoring.ipynb
Best practices for MLOps
https://cloud.google.com/architecture/mlops-continuous-delivery-and-automation-pipelines-in-machine-learning
https://cloud.google.com/resources/mlops-whitepaper
Official Vertex AI GitHub repository
https://github.com/GoogleCloudPlatform/vertex-ai-samples/
MEETUP CHAT LINKS
https://github.com/GoogleCloudPlatform/vertex-ai-samples/blob/master/notebooks/notebook_template.ipynb
https://github.com/GoogleCloudPlatform/vertex-ai-samples/tree/master/notebooks/official/custom
https://github.com/GoogleCloudPlatform/vertex-ai-samples/tree/master/notebooks/community/sdk
https://cloud.google.com/architecture/ml-on-gcp-best-practices#model-deployment-and-serving
https://www.youtube.com/watch?v=ntBEQdD1IeQ&list=PLd31CCJlr9FrZazLqRg1Lxq7xw9b6VNP6&index=3
Vertex AI - Unified ML Platform for the entire AI workflow on Google Cloud by Márton Kodok
Vertex AI is a managed ML platform for practitioners to accelerate experiments and deploy AI models.
Enhanced developer experience
- Build with the groundbreaking ML tools that power Google
- Approachable from the non-ML developer perspective (AutoML, managed models, training)
- Eases the life of data scientists/ML engineers (feature store, managed datasets, endpoints, notebooks)
- Infrastructure management overhead has been almost completely eliminated
- Unified UI for the entire ML workflow
- End-to-end integration for data and AI, letting you build pipelines that solve complex ML tasks
- Explainable AI and TensorBoard to visualize and track ML experiments
AI and ML Series - Leveraging Generative AI and LLMs Using the UiPath Platfor... by DianaGray10
📣 AI plays a crucial role in the UiPath Business Automation Platform. In this session you will learn how the UiPath Business Automation Platform is well-suited for AI, the use of LLMs, and the integrations you can use. Topics include the following:
Introductions.
AI powered automations overview.
Discover why the UiPath Business Automation Platform is well-suited for AI.
LLM + Automation framework and integrations with LangChain.
Generative AI Automation Patterns Demonstration.
👨🏽🤝👨🏻 Speakers:
Dhruv Patel, Senior Sales Solution Architect @UiPath
Russel Alfeche, Technology Leader, RPA @qBotica and UiPath MVP
AI for an intelligent cloud and intelligent edge: Discover, deploy, and manag... by James Serra
Discover, manage, deploy, monitor – rinse and repeat. In this session we show how Azure Machine Learning can be used to create the right AI model for your challenge and then easily customize it using your development tools while relying on Azure ML to optimize them to run in hardware accelerated environments for the cloud and the edge using FPGAs and Neural Network accelerators. We then show you how to deploy the model to highly scalable web services and nimble edge applications that Azure can manage and monitor for you. Finally, we illustrate how you can leverage the model telemetry to retrain and improve your content.
🔹How will AI-based content-generating tools change your mission and products?
🔹This complimentary webinar [ON-DEMAND] explores multiple use cases that drive adoption in their early adopter customer base to provide product leaders with insights into the future of generative AI-powered businesses, and the potential generative AI holds for driving innovation and improving business processes.
Applying DevOps to Databricks can be a daunting task. In this talk it will be broken down into bite-size chunks. Common DevOps subject areas will be covered, including CI/CD (Continuous Integration/Continuous Deployment), IaC (Infrastructure as Code), and build agents.
We will explore how to apply DevOps to Databricks (in Azure), primarily using Azure DevOps tooling. As a lot of Spark/Databricks users are Python users, we will focus on the Databricks REST API (using Python) to perform our tasks.
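Driving Databricks from Python typically means posting JSON to its REST endpoints. Below is a minimal sketch that builds (but does not send) a request to the Jobs 2.1 `run-now` endpoint using only the standard library. The workspace URL, token, and job ID are placeholders; in an Azure DevOps pipeline they would come from pipeline variables and secrets.

```python
# Sketch: constructing a Databricks Jobs API 2.1 run-now request in Python.
import json
import urllib.request

DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "dapi-placeholder-token"  # never hard-code a real token

def jobs_run_now_request(job_id, notebook_params=None):
    """Build (but do not send) a POST to /api/2.1/jobs/run-now."""
    body = {"job_id": job_id}
    if notebook_params:
        body["notebook_params"] = notebook_params
    return urllib.request.Request(
        url=f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = jobs_run_now_request(42, {"env": "staging"})
# urllib.request.urlopen(req) would actually trigger the job run; omitted here.
print(req.full_url)
```

The same pattern (bearer token plus JSON body) applies to the other Databricks REST endpoints, such as the Clusters and Workspace APIs, which is what makes it easy to script from a build agent.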
MLOps and Data Quality: Deploying Reliable ML Models in Production by Provectus
Looking to build a robust machine learning infrastructure to streamline MLOps? Learn from Provectus experts how to ensure the success of your MLOps initiative by implementing Data QA components in your ML infrastructure.
For most organizations, the development of multiple machine learning models, their deployment and maintenance in production are relatively new tasks. Join Provectus as we explain how to build an end-to-end infrastructure for machine learning, with a focus on data quality and metadata management, to standardize and streamline machine learning life cycle management (MLOps).
Agenda
- Data Quality and why it matters
- Challenges and solutions of Data Testing
- Challenges and solutions of Model Testing
- MLOps pipelines and why they matter
- How to expand validation pipelines for Data Quality
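The kind of data-quality gate the agenda describes can be sketched as a validation step a pipeline runs before training. The checks and thresholds below are illustrative, not Provectus's actual tooling:

```python
# Toy data-quality gate: a pipeline step that blocks training when
# required columns exceed a null-ratio threshold.
def check_data_quality(rows, required_cols, max_null_ratio=0.05):
    """Return (passed, issues) for a list of dict records."""
    issues = []
    if not rows:
        return False, ["dataset is empty"]
    for col in required_cols:
        missing = sum(1 for r in rows if r.get(col) is None)
        ratio = missing / len(rows)
        if ratio > max_null_ratio:
            issues.append(f"{col}: {ratio:.0%} nulls exceeds threshold")
    return not issues, issues

rows = [{"age": 34, "income": 50000}, {"age": None, "income": 62000}]
passed, issues = check_data_quality(rows, ["age", "income"])
print(passed)  # → False ('age' is 50% null, well over the 5% threshold)
```

In a production MLOps pipeline a failing gate like this would halt the run and surface the issues list in the metadata store, rather than silently training on degraded data.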
Generative AI: Past, Present, and Future – A Practitioner's Perspective by Huahai Yang
As the academic realm grapples with the profound implications of generative AI and related applications like ChatGPT, I will present a grounded view from my experience as a practitioner. Starting with the origins of neural networks in the fields of logic, psychology, and computer science, I trace its history and align it within the wider context of the pursuit of artificial intelligence. This perspective will also draw parallels with historical developments in psychology. Against this backdrop, I chart a proposed trajectory for the future. Finally, I provide actionable insights for both academics and enterprising individuals in the field.
In this session, you'll get all the answers about how ChatGPT and other GPT-X models can be applied to your current or future project. First, we'll put in order all the terms – OpenAI, GPT-3, ChatGPT, Codex, DALL-E, etc. – and explain why Microsoft and Azure are often mentioned in this context. Then, we'll go through the main capabilities of Azure OpenAI and the respective use cases that might inspire you to either optimize your product or build a completely new one.
Managing and Versioning Machine Learning Models in Python by Simon Frid
Practical machine learning is becoming messy, and while there are lots of algorithms, there is still a lot of infrastructure needed to manage and organize the models and datasets. Estimators and Django-Estimators are two Python packages that can help version datasets and models for deployment and an effective workflow.
Using MLOps to Bring ML to Production/The Promise of MLOps by Weaveworks
In this final Weave Online User Group of 2019, David Aronchick asks: have you ever struggled with having different environments to build, train, and serve ML models, and how to orchestrate between them? While DevOps and GitOps have gained huge traction in recent years, many customers struggle to apply these practices to ML workloads. This talk will focus on the ways MLOps has helped to effectively infuse AI into production-grade applications through establishing practices around model reproducibility, validation, versioning/tracking, and safe/compliant deployment. We will also talk about the direction for MLOps as an industry, and how we can use it to move faster, with more stability, than ever before.
The recording of this session is on our YouTube Channel here: https://youtu.be/twsxcwgB0ZQ
Speaker: David Aronchick, Head of Open Source ML Strategy, Microsoft
Bio: David leads Open Source Machine Learning Strategy at Azure. This means he spends most of his time helping humans to convince machines to be smarter. He is only moderately successful at this. Previously, David led product management for Kubernetes at Google, launched GKE, and co-founded the Kubeflow project. David has also worked at Microsoft, Amazon and Chef and co-founded three startups.
Sign up for a free Machine Learning Ops Workshop: http://bit.ly/MLOps_Workshop_List
Weaveworks will cover concepts such as GitOps (operations by pull request), Progressive Delivery (canary, A/B, blue-green), and how to apply those approaches to your machine learning operations to mitigate risk.
Azure OpenAI Service provides REST API access to OpenAI's powerful language models, including the GPT-3, GPT-4, DALL-E, Codex, and Embeddings model series. These models can be easily adapted to any specific task, including but not limited to content generation, summarization, semantic search, translation, transformation, and code generation. Microsoft offers the accessibility of the service through REST APIs, Python or C# SDK, or the Azure OpenAI Studio.
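The raw REST shape of an Azure OpenAI call can be sketched with the standard library alone. The resource name, deployment name, API version, and key below are all placeholders; in practice you would usually reach for the official SDK or Azure OpenAI Studio, as the abstract notes.

```python
# Sketch: building (but not sending) an Azure OpenAI chat-completions request.
import json
import urllib.request

RESOURCE = "my-openai-resource"    # placeholder Azure resource name
DEPLOYMENT = "gpt-4-deployment"    # placeholder model deployment name
API_VERSION = "2024-02-01"         # check the currently supported versions
API_KEY = "azure-key-placeholder"  # read from a secret store in practice

def chat_completion_request(messages):
    """Build a POST against the deployment's chat/completions endpoint."""
    url = (
        f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
        f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
    )
    return urllib.request.Request(
        url=url,
        data=json.dumps({"messages": messages}).encode("utf-8"),
        headers={"api-key": API_KEY, "Content-Type": "application/json"},
        method="POST",
    )

req = chat_completion_request([{"role": "user", "content": "Summarize MLOps."}])
# urllib.request.urlopen(req) would call the service; omitted here.
print(req.get_method())  # → POST
```

Note that, unlike the OpenAI-hosted API, Azure OpenAI addresses a *deployment* of a model inside your own resource, which is why the deployment name appears in the URL rather than a model name in the body.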
The catalyst for the success of automobiles came not through the invention of the car but rather through the establishment of an innovative assembly line. History shows us that the ability to mass produce and distribute a product is the key to driving adoption of any innovation, and machine learning is no different. MLOps is the assembly line of Machine Learning and in this presentation we will discuss the core capabilities your organization should be focused on to implement a successful MLOps system.
The Conversational AI Journey - What to Expect by Aggregage
Understanding the work involved before and after you deploy a virtual agent makes all the difference between a poor customer experience and one that’s on par with your best live agent. What’s surprising to many is that, firstly, getting alignment from internal stakeholders may be the biggest challenge in the conversational AI journey, and, secondly, most of the work to improve virtual agent performance actually happens after going live.
Join us for this webinar as Gary Davis, SmartAction CEO, shares insights and best practices for implementing AI virtual agents in the contact center. After deploying conversational AI for more than 100 leading brands, we’ve learned a few lessons on what can make, or break, the automated customer experience.
You’ll learn:
•The internal stakeholders you need to involve and engage to make your conversational AI project a successful one
•How automating a customer service call isn’t as simple as using a script from a human interaction
•What happens after go-live, and how to monitor, fine-tune, and train your virtual agent
•The potential ROI when conversational automation is done right
PuppetConf 2017: Unlocking Azure with Puppet Enterprise - Keiran Sweet, Source... by Puppet
For the last year, Sourced has been assisting a large Canadian-based financial organization in migrating workloads to Microsoft's Azure public cloud platform. As part of this deployment, Puppet is leveraged to ensure high levels of automation and compliance across the environment. In this updated session we will walk through our approach to integrating Puppet in Azure environments to ensure that automation, security, compliance, and infrastructure as code are at the forefront.
Sitecore 8.2 Update 1 on Azure Web Apps by Rob Habraken
The slides of my presentation at the Sitecore User Group Netherlands meetup on December 7th, 2016, hosted by Colours in Den Bosch, presenting and demoing the provisioning of Sitecore into Azure using Azure Web Apps. Note that these slides do not contain the demo itself. For the demo, view the recording of the presentation or read my blog post, both accessible via https://www.robhabraken.nl
Sitecore development approach evolution – destination helix by Peter Nazarov
Sitecore officially recommended Helix as a set of overall design principles and conventions for Sitecore development around 18 months ago at SUGCON 2016, alongside an official implementation example, Habitat. Why was it necessary? What are the benefits? Has it worked in practice? Peter Nazarov will share his outlook on why and how a combination of Sitecore Helix and Habitat benefits the business and development users of Sitecore in practice.