Join Dr. Greg Loughnane and Chris Alexiuk in this exciting webinar to learn all about the tooling, processes, and team structure you need to build and operate performant, reliable, and scalable production-grade LLM applications!
Azure OpenAI Service provides REST API access to OpenAI's powerful language models, including the GPT-3, GPT-4, DALL-E, Codex, and Embeddings model series. These models can be adapted to a wide range of tasks, including content generation, summarization, semantic search, translation, transformation, and code generation. Microsoft makes the service accessible through REST APIs, the Python and C# SDKs, and Azure OpenAI Studio.
In this session, you'll get answers about how ChatGPT and other GPT-X models can be applied to your current or future project. First, we'll sort out the terminology: OpenAI, GPT-3, ChatGPT, Codex, DALL-E, etc., and explain why Microsoft and Azure are often mentioned in this context. Then, we'll go through the main capabilities of Azure OpenAI and the respective use cases that might inspire you to optimize your product or build a completely new one.
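As a rough illustration of the REST access pattern described above, the sketch below builds (but does not send) a chat-completions request for an Azure OpenAI deployment. The resource name, deployment name, and API version here are placeholders, not real values from the session.

```python
import json

def build_chat_request(resource: str, deployment: str, api_version: str,
                       user_message: str):
    """Return the REST URL and JSON body for an Azure OpenAI chat-completions call."""
    # Azure OpenAI routes requests to a named deployment of a model.
    url = (f"https://{resource}.openai.azure.com/openai/deployments/"
           f"{deployment}/chat/completions?api-version={api_version}")
    body = {
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
        "temperature": 0.2,
    }
    return url, json.dumps(body)

url, payload = build_chat_request("my-resource", "my-gpt4-deployment",
                                  "2024-02-01", "Summarize our Q3 results.")
```

In practice you would send this payload with an `api-key` header, or skip the manual REST call entirely and use the Python or C# SDK mentioned above.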
LLMOps for Your Data: Best Practices to Ensure Safety, Quality, and Cost (Aggregage)
Join Shreya Rajpal, CEO of Guardrails AI, and Travis Addair, CTO of Predibase, in this exclusive webinar to learn all about leveraging the part of AI that constitutes your IP – your data – to build a defensible AI strategy for the future!
Use Case Patterns for LLM Applications (M Waleed Kadous)
What are the "use case patterns" for deploying LLMs into production? Understanding these will allow you to spot "LLM-shaped" problems in your own industry.
Unlocking the Power of Generative AI: An Executive's Guide (PremNaraindas1)
Generative AI is here, and it can revolutionize your business. With its powerful capabilities, this technology can help companies create more efficient processes, unlock new insights from data, and drive innovation. But how do you make the most of these opportunities?
This guide will provide you with the information and resources needed to understand the ins and outs of Generative AI, so you can make informed decisions and capitalize on the potential. It covers important topics such as strategies for leveraging large language models, optimizing MLOps processes, and best practices for building with Generative AI.
AZConf 2023 - Considerations for LLMOps: Running LLMs in Production (Saradindu Sengupta)
With the recent explosion of development and interest in large language, vision, and speech models, it has become apparent that running large models in production will be a key driver of enterprise adoption of ML. Traditional MLOps, i.e. running machine learning models in production, already involves many variables, from data integrity and data drift to model optimization. Running a large model (language or vision) in production while keeping business requirements in mind is different altogether. In this talk, I will explain a general framework for LLMOps and certain considerations when designing a system for inferencing a large model.
The talk is organized into the following sub-topics:
1. Model Optimization
2. Model fine-tuning
3. Model Editing
4. Model Serving and deployment
5. Model metrics monitoring
6. Embedding and artifact management
For each sub-topic, we will also give a brief overview of the current open-source tool sets to make tool-chain selection easier.
Leveraging Generative AI & Best Practices (DianaGray10)
In this event we will cover:
- What Generative AI is and how it is being used for the future of work.
- Best practices for developing and deploying generative AI-based models in production.
- The future of Generative AI: how it is expected to evolve in the coming years.
* "Responsible AI Leadership: A Global Summit on Generative AI"
* April 2023 guide for experts and policymakers
* Covers developing and governing generative AI systems
* 100+ thought leaders and practitioners participated
* Recommendations for responsible development, open innovation, and social progress
* 30 action-oriented recommendations to help navigate AI complexities
Generative AI: Past, Present, and Future – A Practitioner's Perspective (Huahai Yang)
As the academic realm grapples with the profound implications of generative AI
and related applications like ChatGPT, I will present a grounded view from my
experience as a practitioner. Starting with the origins of neural networks in
the fields of logic, psychology, and computer science, I trace their history and
situate it within the wider context of the pursuit of artificial intelligence.
This perspective will also draw parallels with historical developments in
psychology. Against this backdrop, I chart a proposed trajectory for the future.
Finally, I provide actionable insights for both academics and enterprising
individuals in the field.
This session was presented at the AWS Community Day in Munich (September 2023). It's for builders who have heard the buzz about Generative AI but can't quite grok it yet. Useful if you are eager to connect the dots on Generative AI terminology and want a fast start to explore further and navigate the space. This session is largely product-agnostic and meant to give you the fundamentals to get started.
This presentation presents an overview of the challenges and opportunities of generative artificial intelligence in Web3. It includes a brief research history of generative AI as well as some of its immediate applications in Web3.
Retrieval Augmented Generation in Practice: Scalable GenAI Platforms with k8s... (Mihai Criveti)
Mihai is the Principal Architect for Platform Engineering and Technology Solutions at IBM, responsible for Cloud Native and AI Solutions. He is a Red Hat Certified Architect, CKA/CKS, a leader in the IBM Open Innovation community, and an advocate for open source development. Mihai is driving the development of Retrieval Augmented Generation platforms and solutions for Generative AI at IBM that leverage WatsonX, vector databases, LangChain, HuggingFace, and open source AI models.
Mihai will share lessons learned building Retrieval Augmented Generation, or “Chat with Documents”, platforms and APIs that scale and deploy on Kubernetes. His talk will cover use cases for Generative AI, limitations of Large Language Models, and the use of RAG, vector databases, and fine-tuning to overcome model limitations and build solutions that connect to your data, provide content grounding, limit hallucinations, and form the basis of explainable AI. In terms of technology, he will cover LLAMA2, HuggingFace TGIS, SentenceTransformers embedding models using Python, LangChain, and the Weaviate and ChromaDB vector databases. He’ll also share tips on writing code using LLMs, including building an agent for Ansible and containers.
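The retrieval step at the heart of RAG can be sketched in a few lines: embed the query, rank stored chunks by similarity, and ground the prompt in the top hits. The `embed()` function below is a toy bag-of-letters stand-in for a real encoder such as a SentenceTransformers model, and the chunk texts are invented examples.

```python
import math

def embed(text: str) -> list[float]:
    # Toy bag-of-letters embedding; a real system would call an embedding model.
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - 97] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    # Rank document chunks by similarity to the query embedding.
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

chunks = ["invoices are paid net 30",
          "kubernetes pods restart on failure",
          "vector databases store embeddings"]
context = retrieve("how do pods recover in kubernetes?", chunks, k=1)
prompt = "Answer using only this context:\n" + "\n".join(context)
```

A vector database such as Weaviate or ChromaDB replaces the linear scan here with an approximate nearest-neighbor index, but the grounding pattern is the same.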
Scaling factors for Large Language Model Architectures:
• Vector Database: consider sharding and High Availability
• Fine Tuning: collecting data to be used for fine tuning
• Governance and Model Benchmarking: how are you testing your model performance
over time, with different prompts, one-shot, and various parameters
• Chain of Reasoning and Agents
• Caching embeddings and responses
• Personalization and Conversational Memory Database
• Streaming Responses and optimizing performance. A fine-tuned 13B model may
perform better than a poorly tuned 70B one!
• Calling 3rd-party functions or APIs for reasoning or other types of data (e.g., LLMs are
terrible at reasoning and prediction; consider calling other models)
• Fallback techniques: fallback to a different model, or default answers
• API scaling techniques, rate limiting, etc.
• Async, streaming and parallelization, multiprocessing, GPU acceleration (including
embeddings), generating your API using OpenAPI, etc.
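Two of the items above (caching embeddings/responses, and fallback techniques) can be sketched together. The embedding and model functions below are stand-ins for real clients, not part of the talk's stack.

```python
from functools import lru_cache

@lru_cache(maxsize=4096)
def cached_embedding(text: str) -> tuple:
    # In production this body would call the embedding model once per unique
    # input; lru_cache then serves repeated inputs from memory.
    return tuple(float(ord(c)) for c in text[:8])  # placeholder vector

def answer_with_fallback(prompt: str, primary, fallback_text: str) -> str:
    """Try the primary model; on failure, fall back to a default answer."""
    try:
        return primary(prompt)
    except Exception:
        # Fallback technique: could also route to a smaller/cheaper model.
        return fallback_text

def flaky_model(prompt: str) -> str:
    raise TimeoutError("primary model unavailable")

result = answer_with_fallback("Summarize this ticket...", flaky_model,
                              "Sorry, please try again later.")
```

Caching keys on the exact input text; for responses (rather than embeddings) you would typically add a TTL and normalize prompts before hashing.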
Let's talk about GPT: A crash course in Generative AI for researchers (Steven Van Vaerenbergh)
This talk delves into the extraordinary capabilities of the emerging technology of generative AI, outlining its recent history and emphasizing its growing influence on scientific endeavors. Through a series of practical examples tailored for researchers, we will explore the transformative influence of these powerful tools on scientific tasks such as writing, coding, data wrangling and literature review.
Conversational AI and Chatbot Integrations (Cristina Vidu)
Conversational AI and Chatbots (or rather - and more extensively - Virtual Agents) offer great benefits, especially in combination with technologies like RPA or IDP. Corneliu Niculite (Presales Director - EMEA @DRUID AI) and Roman Tobler (CEO @Routinuum & UiPath MVP) are discussing Conversational AI and why Virtual Agents play a significant role in modern ways of working. Moreover, Corneliu will be displaying how to build a Workflow and showcase an Accounts Payable Use Case, integrating DRUID and UiPath Robots.
📙 Agenda:
The focus of our meetup is around the following areas - with a lot of room to discuss and share experiences:
- What is "Conversational AI" and why do we need Chatbots (Virtual Agents);
- Deep-Dive to a DRUID-UiPath Integration via an Accounts Payable Use Case;
- Discussion, Q&A
Speakers:
👨🏻💻 Corneliu Niculite, Presales Director - EMEA DRUID AI
👨🏼💻 Roman Tobler, UiPath MVP, Co-Founder & CEO Routinuum GmbH
This session streamed live on March 8, 2023, at 16:00 CET.
Check out our upcoming events at: community.uipath.com
Contact us at: community@uipath.com
Accelerating Path to Production for Generative AI-powered Applications (Hosted by Confluent)
"In this session, we will discuss some recent developments in Generative AI and how those can be leveraged to build intelligent applications. Learn how to bring the power of large language models (LLMs) to your private, real-time operational data across multiple data types. We will talk about improving the accuracy of LLMs in your applications by leveraging Retrieval Augmented Generation, which provides proprietary knowledge to the LLM.
From real-time responses to sophisticated interactions, learn how you can easily build a range of AI-driven experiences that leverage your operational data with minimal complexity.
MongoDB Atlas provides native vector search capabilities and a flexible document model all within an enterprise-ready developer data platform empowering teams to iterate quickly on applications enriched with generative AI. Coupling Atlas with Confluent makes it easier to leverage streaming data when informing LLMs with proprietary data."
Regulating Generative AI - LLMOps Pipelines with Transparency (Debmalya Biswas)
The growing adoption of Gen AI, esp. LLMs, has re-ignited the discussion around AI Regulations — to ensure that AI/ML systems are responsibly trained and deployed. Unfortunately, this effort is complicated by multiple governmental organizations and regulatory bodies releasing their own guidelines and policies with little to no agreement on the definition of terms.
Rather than trying to understand and regulate all types of AI, we recommend a different (and practical) approach in this talk based on AI Transparency: transparently outlining the capabilities of the AI system based on its training methodology, and setting realistic expectations with respect to what it can (and cannot) do.
We outline LLMOps architecture patterns and show how the proposed approach can be integrated at different stages of the LLMOps pipeline capturing the model's capabilities. In addition, the AI system provider also specifies scenarios where (they believe that) the system can make mistakes, and recommends a ‘safe’ approach with guardrails for those scenarios.
GPT-4 can pass the American state bar exam, but before you go expecting to see robot lawyers taking over the courtroom, hold your horses, cowboys – we're not quite there yet. That being said, AI is becoming increasingly human-like, and as a VC we need to start thinking about how this new wave of technology is going to affect the way we build and run businesses. What do we need to do differently? How can we make sure that our investment strategies reflect these changes? It's a brave new world out there, and we’ve got to keep the big picture in mind!
Sharing here with you what we at Cavalry Ventures found out during our Generative AI deep dive.
Presentation of the Semantic Knowledge Graph research paper at the 2016 IEEE 3rd International Conference on Data Science and Advanced Analytics (Montreal, Canada - October 18th, 2016)
Abstract—This paper describes a new kind of knowledge representation and mining system which we are calling the Semantic Knowledge Graph. At its heart, the Semantic Knowledge Graph leverages an inverted index, along with a complementary uninverted index, to represent nodes (terms) and edges (the documents within intersecting postings lists for multiple terms/nodes). This provides a layer of indirection between each pair of nodes and their corresponding edge, enabling edges to materialize dynamically from underlying corpus statistics. As a result, any combination of nodes can have edges to any other nodes materialize and be scored to reveal latent relationships between the nodes. This provides numerous benefits: the knowledge graph can be built automatically from a real-world corpus of data, new nodes - along with their combined edges - can be instantly materialized from any arbitrary combination of preexisting nodes (using set operations), and a full model of the semantic relationships between all entities within a domain can be represented and dynamically traversed using a highly compact representation of the graph. Such a system has widespread applications in areas as diverse as knowledge modeling and reasoning, natural language processing, anomaly detection, data cleansing, semantic search, analytics, data classification, root cause analysis, and recommendation systems. The main contribution of this paper is the introduction of a novel system - the Semantic Knowledge Graph - which is able to dynamically discover and score interesting relationships between any arbitrary combination of entities (words, phrases, or extracted concepts) through dynamically materializing nodes and edges from a compact graphical representation built automatically from a corpus of data representative of a knowledge domain.
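The core mechanism in the abstract (postings lists from an inverted index, with edges materialized as the intersection of two terms' documents) can be sketched as follows. The PMI-style score is an illustrative choice, not the paper's exact relatedness measure, and the tiny corpus is invented.

```python
import math
from collections import defaultdict

docs = {
    0: "solr search relevancy engine",
    1: "search engine inverted index",
    2: "knowledge graph semantic search",
    3: "graph traversal algorithms",
}

# Build the inverted index: term -> set of doc ids (a postings list).
postings = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        postings[term].add(doc_id)

def edge(term_a: str, term_b: str):
    """Materialize the edge between two term nodes: the intersecting
    documents, plus a simple PMI-style co-occurrence score."""
    shared = postings[term_a] & postings[term_b]
    if not shared:
        return shared, float("-inf")
    n = len(docs)
    p_ab = len(shared) / n
    score = math.log(p_ab / ((len(postings[term_a]) / n) *
                             (len(postings[term_b]) / n)))
    return shared, score

shared, score = edge("search", "graph")
```

Because edges are computed from set intersections on demand, no explicit edge list is ever stored, which is the compactness the paper highlights.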
How Azure helps to build better business processes and customer experiences w... (Maxim Salnikov)
Artificial Intelligence is not the future; it is now. Cloud technology empowers developers and technology leaders to benefit from AI effectively and responsibly, with the models and tools they need. In this session, we go through the portfolio of Azure AI services and run some demos to showcase how AI can improve daily life, safety, productivity, accessibility, and business outcomes.
Building and deploying LLM applications with Apache Airflow (Kaxil Naik)
Behind the growing interest in Generative AI and LLM-based enterprise applications lies an expanded set of requirements for data integration and ML orchestration. Enterprises want to use proprietary data to power LLM-based applications that create new business value, but they face challenges in moving beyond experimentation. The pipelines that power these models need to run reliably at scale, bringing together data from many sources and reacting continuously to changing conditions.
This talk focuses on the design patterns for using Apache Airflow to support LLM applications created using private enterprise data. We’ll go through a real-world example of what this looks like, as well as a proposal to improve Airflow and to add additional Airflow Providers to make it easier to interact with LLMs such as the ones from OpenAI (such as GPT4) and the ones on HuggingFace, while working with both structured and unstructured data.
In short, this shows how these Airflow patterns enable reliable, traceable, and scalable LLM applications within the enterprise.
https://airflowsummit.org/sessions/2023/keynote-llm/
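Framework details aside, the pipeline pattern these talks describe (ingest documents, embed them, index the vectors, then serve answers) can be sketched without Airflow. In Airflow, each function below would become a task (e.g. via the TaskFlow `@task` decorator) wired into a DAG; the stubs stand in for real connectors and models.

```python
def ingest() -> list[str]:
    # Stand-in for extracting documents from enterprise sources.
    return ["doc one about invoices", "doc two about contracts"]

def embed(chunks: list[str]) -> list[tuple[str, list[float]]]:
    # Stand-in embedding: [character count, word count] per chunk.
    return [(c, [float(len(c)), float(len(c.split()))]) for c in chunks]

def index(vectors: list[tuple[str, list[float]]]) -> dict:
    # Stand-in vector store; production would write to a vector database.
    return {text: vec for text, vec in vectors}

def pipeline() -> dict:
    # The dependency chain Airflow would schedule and retry for you.
    return index(embed(ingest()))

store = pipeline()
```

What Airflow adds over this plain chain is exactly what the talk stresses: reliable scheduling, retries, lineage, and reacting to new data as it arrives.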
Haystack 2018 - Algorithmic Extraction of Keywords, Concepts, and Vocabularies (Max Irwin)
Presentation as given to the Haystack Conference, which outlines research and techniques for automatic extraction of keywords, concepts, and vocabularies from text corpora.
Building Generative AI-infused apps: what's possible and how to start (Maxim Salnikov)
In this session, we'll explore different scenarios where the features of Generative AI can provide added value to an IT solution. We'll also learn how to begin developing your own application powered by AI. Using Azure OpenAI service as an illustration, we'll examine the various APIs it offers, review the best practices of Prompt Engineering, explore different ways to incorporate your own data into the process, and take a glance at several tools and resources that make the developer experience more seamless.
Tensors for topic modeling and deep learning on AWS SageMaker (Anima Anandkumar)
Tensors are higher order extensions of matrices that can incorporate multiple modalities and encode higher order relationships in data. This session will present recently developed tensor algorithms for topic modeling and deep learning with vastly improved performance over existing methods.
Topic models enable automated categorization of large document corpora, without requiring labeled data for training. They go beyond simple clustering since they allow for documents to have multiple topics. Tensor methods provide a fast and guaranteed method for training these models. They incorporate co-occurrence statistics of triplets of words in documents. We are releasing a fast and robust implementation that vastly outperforms existing solutions while providing significantly faster training times and better topic quality. Moreover, training and inference are decoupled in our algorithm, so the user can select the relevant part based on their requirements. We will present benchmarks across multiple datasets of different sizes and AWS instance types, and provide notebook examples.
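The statistic the description mentions (co-occurrence of word triplets in documents) is the raw material for the third-order moment tensor that the method decomposes. A minimal sketch of collecting those counts on an invented toy corpus, with the decomposition step itself omitted:

```python
from collections import Counter
from itertools import combinations

docs = [
    ["cat", "dog", "pet"],
    ["cat", "pet", "food"],
    ["stock", "market", "trade"],
]

# Count each unordered triple of distinct words co-occurring in a document.
triples = Counter()
for doc in docs:
    for t in combinations(sorted(set(doc)), 3):
        triples[t] += 1
```

This sparse counter corresponds to entries of the empirical third-order moment; the tensor algorithm then recovers topic-word distributions from it with provable guarantees.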
Hadoop clusters can store nearly everything in your data lake cheaply and blazingly fast. Answering questions and gaining insights from this ever-growing stream becomes the decisive part for many businesses. Increasingly, data has a natural structure as a graph, with vertices linked by edges, and many questions about the data involve graph traversals or other complex queries for which there is no a priori bound on path length.
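The unbounded-path queries described here can be illustrated with a plain breadth-first traversal over a toy adjacency list (a hypothetical graph, nothing Hadoop-specific):

```python
from collections import deque

# Hypothetical toy graph: vertices linked by directed edges.
graph = {
    "a": ["b", "c"],
    "b": ["d"],
    "c": ["d"],
    "d": ["e"],
    "e": [],
}

def reachable(graph, start):
    """Every vertex reachable from `start` -- a traversal with no
    a-priori bound on path length; it runs until the frontier empties."""
    seen = {start}
    frontier = deque([start])
    while frontier:
        v = frontier.popleft()
        for w in graph.get(v, []):
            if w not in seen:
                seen.add(w)
                frontier.append(w)
    return seen

print(sorted(reachable(graph, "a")))  # -> ['a', 'b', 'c', 'd', 'e']
```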
All in AI: LLM Landscape & RAG in 2024 with Mark Ryan (Google) & Jerry Liu (L...Daniel Zivkovic
Serverless Toronto's 6th-anniversary event helps IT pros understand and prepare for the #GenAI tsunami ahead. You'll gain situational awareness of the LLM Landscape, receive condensed insights, and actionable advice about RAG in 2024 from Google AI Lead Mark Ryan and LlamaIndex creator Jerry Liu. We chose #RAG (Retrieval-Augmented Generation) because it is the predominant paradigm for building #LLM (Large Language Model) applications in enterprises today - and that's where the jobs will be shifting. Here is the recording: https://youtu.be/P5xd1ZjD-Os?si=iq8xibj5pJsJ62oW
A changing market landscape and open source innovations are having a dramatic impact on the consumability and ease of use of data science tools. Join this session to learn about the impact these trends and changes will have on the future of data science. If you are a data scientist, or if your organization relies on cutting edge analytics, you won't want to miss this!
The need for sophistication in modern search engine implementationsBen DeMott
The need for more sophisticated search implementations is often at odds with the limited feature set available in modern, out-of-the-box open-source search engines.
This presentation discusses the challenges associated with properly modeling information within a domain and why it's critically needed.
Discovering User's Topics of Interest in Recommender SystemsGabriel Moreira
This talk introduces the main techniques of Recommender Systems and Topic Modeling.
Then, we present a case of how we've combined those techniques to build Smart Canvas (www.smartcanvas.com), a service that allows people to bring, create and curate content relevant to their organization, and also helps to tear down knowledge silos.
We present some of Smart Canvas features powered by its recommender system, such as:
- Highlight relevant content, explaining to the users which of his topics of interest have generated each recommendation.
- Associate tags to users’ profiles based on topics discovered from content they have contributed. These tags become searchable, allowing users to find experts or people with specific interests.
- Recommend people with similar interests, explaining which topics bring them together.
We give a deep dive into the design of our large-scale recommendation algorithms, giving special attention to our content-based approach, which uses topic modeling techniques (like LDA and NMF) to discover people's topics of interest from unstructured text, and to our social-based algorithms, which use a graph database connecting content, people, and teams around topics.
Our typical data pipeline includes the ingestion of millions of user events (using Google Pub/Sub and BigQuery), batch processing of the models (with PySpark, MLlib, and Scikit-learn), online recommendations (with Google App Engine, Titan Graph Database, and Elasticsearch), and data-driven evaluation of UX and algorithms through A/B testing. We also touch on the non-functional requirements of a software-as-a-service offering, such as scalability, performance, availability, reliability, and multi-tenancy, and how we addressed them in a robust architecture deployed on Google Cloud Platform.
The Relevance of the Apache Solr Semantic Knowledge GraphTrey Grainger
The Semantic Knowledge Graph is an Apache Solr plugin that discovers and ranks the relationships between arbitrary queries or terms within the search index. It is a relevancy Swiss Army knife: it can discover related terms and concepts, disambiguate different meanings of terms given their context, clean up noise in datasets, discover previously unknown relationships between entities across documents and fields, rank lists of keywords by conceptual cohesion to reduce noise, summarize documents by extracting their most significant terms, generate recommendations and personalized search, and power numerous other applications involving anomaly detection, significance/relationship discovery, and semantic search. This talk will walk you through how to set up and use this plugin in concert with other open source tools (a probabilistic query parser, SolrTextTagger for entity extraction) to parse, interpret, and model the true intent of user searches far more accurately than traditional keyword-based approaches.
Similar to LLMs in Production: Tooling, Process, and Team Structure
Generative AI Deep Dive: Advancing from Proof of Concept to ProductionAggregage
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
Unlocking Employee Potential with the Power of Continuous FeedbackAggregage
https://www.humanresourcestoday.com/frs/26832980/unlocking-employee-potential-with-the-power-of-continuous-feedback
Recent studies show that only 21% of employees feel their performance and growth are within their control. What if the answer to employee development and high performance lies elsewhere?
Enter continuous feedback. Imagine a work environment where feedback isn't a dreaded annual event, but a constant source of growth. Join us to discover how ongoing, actionable feedback empowers your team to take ownership of their performance, boosting engagement and development. After all, when surveyed, almost all employees say they want and crave timely feedback!
Objectives:
• Navigate employee challenges with feedback and equip yourself with effective delivery methods.
• Learn how to cultivate a thriving workforce through frequent feedback conversations.
• Gain practical strategies to turn you into a feedback pro, improving communication, empowering your team, and unlocking employee potential.
The Key to Sustainable Energy Optimization: A Data-Driven Approach for Manufa...Aggregage
Join us for a practical webinar, hosted by Kevin Kai Wong of Emergent Energy, where we'll explore how leveraging data-rich energy management solutions can drive operational excellence in the evolving landscape of energy intelligence and sustainability in manufacturing!
From Awareness to Action: An HR Guide to Making Accessibility AccessibleAggregage
https://www.humanresourcestoday.com/frs/26293486/from-awareness-to-action--an-hr-guide-to-making-accessibility-accessible
Making accessibility accessible for organizations of all sizes may seem complex, but it doesn’t have to be.
Prepare to broaden your understanding of Disability, Cultural Competency, and Inclusion with this insightful webinar. We’ll explore disability as a vibrant culture, understand the nuances of reasonable accommodations under the ADA, and navigate the complexities of undue hardship while challenging the status quo of accessibility practices. This session will offer practical strategies for creating a company culture of accessibility, ranging from cost-effective initiatives to moderate investments, ensuring an environment where every individual feels valued, respected, and included.
We'll cover:
• Introduction to Disability, Cultural Competency, and Inclusion
• Defining reasonable accommodation and undue hardship
• The power of intention in inclusion and how to empower employees with disabilities
• Types of accessibility
• How to create a company culture of accessibility at any size
The Path to Product Excellence: Avoiding Common Pitfalls and Enhancing Commun...Aggregage
https://www.productmanagementtoday.com/frs/26795801/the-path-to-product-excellence--avoiding-common-pitfalls-and-enhancing-communication
In the fast-paced world of digital innovation, success is often accompanied by a multitude of challenges - like the pitfalls lurking at every turn, threatening to derail the most promising projects. But fret not, this webinar is your key to effective product development!
Join us for an enlightening session to empower you to lead your team to greater heights. Through compelling storytelling and actionable insights, learn to overcome challenges like misaligned objectives, communication breakdowns, and resistance to change.
Takeaways:
• Uncover and navigate through common pitfalls that are plaguing product teams today.
• Explore proven solutions, laying the groundwork for triumphant product launches.
• Gain inspiration from real-world success examples from top digital companies, offering invaluable insights into their winning strategies.
• Discover how the symbiotic relationship between product managers, UX/UI designers, and developers can transform pitfalls into opportunities, propelling your product outcomes to unprecedented heights.
How to Leverage Behavioral Science Insights for Direct Mail SuccessAggregage
Join Neal Boornazian and Nancy Harhut to discover proven, actionable strategies to leverage behavioral science in your direct mail today, and leave this webinar with a competitive advantage that lets you easily boost your engagement and response rates!
Sales & Marketing Alignment: How to Synergize for SuccessAggregage
While many B2B organizations continue to struggle with aligning their marketing and sales teams, they can take practical steps to unify both teams and simplify their approach. In this webinar, Carlos Hidalgo, CEO of Digital Exhaust and B2B expert, will show you how to solve your company's alignment troubles to meet organizational growth objectives!
How Automation is Driving Efficiency Through the Last Mile of ReportingAggregage
https://www.corporatefinancebrief.com/frs/26690636/how-automation-is-driving-efficiency-through-the-last-mile-of-reporting
As organizations strive for agility and efficiency, it's imperative for finance leaders to embrace innovative technologies and redefine traditional processes. Join us as we explore the pivotal role of digitalization and automation in reshaping what is commonly referred to as the “last mile of reporting”.
We’ll deep-dive into why digitalization is no longer a choice, but a necessity for finance departments to stay competitive in a fast-paced environment touching on:
• 2024 trends for the Office of the CFO: A review of today’s automation revolution within the finance department as it faces evolving internal and external challenges.
• Leveraging automation for efficiency and accuracy: Learn how automation tools and technologies can streamline repetitive tasks, reduce manual errors, and free up valuable resources for more strategic initiatives.
• Enhancing transparency and stakeholder confidence: See how robust disclosure management practices contribute to increased transparency, fostering trust among stakeholders, including investors, regulators, and internal decision-makers.
• Overcoming challenges and embracing change: Gain practical strategies and best practices for overcoming common barriers to digital transformation within finance departments and learn how to effectively manage change to maximize the benefits of automation.
Planning your Restaurant's Path to ProfitabilityAggregage
Join James Kahler, COO of Full Course, in this new session all about where to spend and where to save when operating and expanding your restaurant for maximum profitability!
The Engagement Engine: Strategies for Building a High-Performance CultureAggregage
https://www.humanresourcestoday.com/frs/26766735/the-engagement-engine--strategies-for-building-a-high-performance-culture
Many companies strive for a positive culture with happy employees. But what if you could achieve more? High-performing cultures are the McLarens of the business world, leaving Camrys in the dust. They unlock exceptional results by fostering innovation, engagement, and continuous growth.
In this webinar, we'll demystify the concept and provide practical steps to kickstart the journey toward a high-performing culture in your organization. Drawing on research and real-world examples, we'll discuss the fundamental elements that contribute to such a culture, including trust, feedback loops, and fostering curiosity and growth mindsets. You'll learn how to transform your company from a reliable work environment into an engine for peak performance.
Join us to discover:
• The High-Performance Difference: We'll explore the key characteristics that set high-performing cultures apart. These cultures attract and retain top talent who crave a dynamic and stimulating work environment. Leaders set the tone by embodying company values and inspiring employees with a clear vision.
• Building the Foundation: We'll break down the essential building blocks for a high-performing culture. This includes fostering psychological safety and trust, where employees feel comfortable taking risks and learning from mistakes. Clear goals and focused roadmaps keep everyone aligned, while roadblocks are identified and removed to empower teams to thrive.
• A Culture of Growth: High-performing cultures go beyond simply measuring numbers. They embrace a growth mindset, constantly seeking to learn and improve. This includes a commitment to open and honest feedback, delivered in a way that motivates and develops employees.
Driving Business Impact for PMs with Jon HarmerAggregage
https://www.productmanagementtoday.com/frs/26551585/driving-business-impact-for-pms
Move from feature factory to customer outcomes and drive impact in your business!
This session will provide you with a comprehensive set of tools to help you develop impactful products by shifting from output-based thinking to outcome-based thinking. You will deepen your understanding of your customers and their needs, as well as identify and de-risk the different kinds of hypotheses built into your roadmap. Understand how your work contributes to your company's strategy, and learn to apply frameworks that ensure your features solve user problems that drive business impact.
Learning objectives:
• Learn how to prioritize the most impactful opportunities: Identify the most impactful opportunities using Impact Mapping and other framing techniques, shift from output orientation to outcome/impact orientation.
• Grow your user empathy skills: Better understand users and the problem space they are working in through Journey Maps that are customized for Product Managers.
• Understand the risks and hypotheses built into your roadmap: By making explicit the different hypotheses in your plan and identifying the riskiest ones, you will be able to quickly validate the riskiest assumptions and improve your outcomes.
• Create actual artifacts for your products: With the practical experience provided in this session, apply these tools to real-world product management scenarios to build journey and impact maps for actual users & products.
Strategic Project Finance Essentials: A Project Manager’s Guide to Financial ...Aggregage
Empower yourself as a project manager with insights that directly influence the financial landscape and strategic direction of your organization!
Join us for a deep dive into the world of financial strategy, as we dissect key metrics that drive CFOs and business leaders’ investment decisions. This session will equip you with the necessary tools to craft compelling business cases as well as a comprehensive understanding of the crucial distinction between capital expenditure and operational expenditure, and its profound impact on financial statements.
During this webinar, we’ll cover the following:
• Three Critical Metrics: Net Present Value (NPV), Internal Rate of Return (IRR), and Payback Period
• Why tracking capital spend is important
• How project spend classification shapes the portrayal on an income statement
• Classification of capital expenditure (CapEx) versus operational expenditure (OpEx), and its impact on financial statements and EBITDA
The Retention Ripple Effect: Nonprofit Staff and Donor DynamicsAggregage
https://www.nonprofittech.com/frs/26320757/the-real-nonprofit-retention-issue---it-s-not-what-you-think
Across the nonprofit sector, organizations invest heavily in donor retention efforts, yet the struggle of cultivating lasting relationships remains. While attracting new donors is crucial, the lack of repeat donors poses significant financial risks.
Through a comprehensive analysis of industry data, experts argue that there is a direct correlation between donor burnout, donor retention, and the talent retention crisis. By unpacking this relationship, we emphasize the importance of cultivating a dedicated workforce to enhance donor retention and drive sustainable growth. 📈
Industry experts Andrew Olsen and Kat Landa will explore:
• A data-driven analysis of the current retention crisis in the nonprofit sector 📊
• How talent retention and donor retention challenges faced by nonprofit organizations go hand in hand 🤝
• The key role of organizational leaders in addressing the retention crisis head on 🔑
• Actionable strategies to combat the retention crisis and foster long-term donor relationships 💡
Breaking the Burnout Cycle: Empowering Managers for ExcellenceAggregage
https://www.humanresourcestoday.com/frs/26375534/breaking-the-burnout-cycle--empowering-managers-for-excellence
In the fast-paced world of work, burnout has emerged as a critical issue. Alarming statistics reveal that two in five U.S. workers report feeling burned out. However, the situation is even more dire among managers, with nearly half reporting burnout, often hidden behind their responsibilities and the desire to uplift their teams. Recognizing the severity of this problem is crucial. Join Bonusly's Head of People, Adri Glover, and Sr. People Partner, Mollie Hinz, as we delve into the unique challenges faced by managers and provide actionable insights for addressing and preventing manager burnout.
We will not only explore the distinct signs of manager burnout but also how to identify the warning signals. We will share practical strategies for alleviating manager burnout and discuss how prioritizing the well-being of your managers will, in turn, enhance team performance and culture.
In this webinar you will learn:
• Recognizing Warning Signs: Understand and identify the four key warning signs of manager burnout.
• Practical Strategies for Alleviation: Gain insights into data-backed methods for managing burnout.
• Turning Burnout into Engagement: Explore how prioritizing the well-being of managers can lead to stronger team performance and company culture, turning burnout into an opportunity for growth and resilience.
Strategic CX: A Deep Dive into Voice of the Customer Insights for ClarityAggregage
In this interactive session, Nicholas Zeisler will delve into fundamental questions about VoC, and will explore why you’re doing VoC in the first place, how you can do it better, and what that means when it comes to acquiring and analyzing customer insights!
The Data Metaverse: Unpacking the Roles, Use Cases, and Tech Trends in Data a...Aggregage
https://www.productmanagementtoday.com/frs/26116444/the-data-metaverse--unpacking-the-roles--use-cases--and-tech-trends-in-data-and-ai
Embark on a transformation journey into the heart of the data ecosystem! This webinar is your gateway to a deeper comprehension of the foundations that drive the data industry and will equip you with the knowledge needed to navigate the evolving landscape. Delve into the diverse use cases where data analytics plays a pivotal role. We’ll explore how these applications are transforming with the introduction of Gen AI, and discuss the anticipated use cases for 2024 and beyond. Join us for a forward-looking exploration of the future data landscape!
Key objectives:
• Introduction to the structures and ownership dynamics of data platform, analytics and AI teams, along with an exploration of various roles in the data ecosystem.
• Delve into the distinctive roles and responsibilities of a Platform PM compared to other Product Managers.
• Examine real world use cases, both internal and external, where data analytics is applied, and understand its evolution with the introduction of Gen AI.
• Anticipated future use cases as we project into 2024 and beyond.
• Explore the array of tools and technologies driving data transformation across different stages and states, from source to destination.
How to Build an Experimentation Culture for Data-Driven Product DevelopmentAggregage
In this webinar, Margaret-Ann Seger, Head of Product at Statsig, will teach you how to build an experimentation culture from the ground-up, graduating from just getting started with data-driven development to operating at the level of a FAANG company!
Bridging the Gap: The Intersection of DEI Initiatives and Employee BenefitsAggregage
https://www.humanresourcestoday.com/frs/26116903/bridging-the-gap--the-intersection-of-dei-initiatives-and-employee-benefits
Unlock the secrets to transforming your organization’s employee benefits into a strategic tool for Diversity, Equity, and Inclusion (DEI). During this informative session, we will discuss common pitfalls in traditional benefits and then delve into the essence of DEI in employee-centric benefit offerings. This involves not only defining DEI in the workplace but also understanding the pivotal role that employee benefits play in fostering a diverse and inclusive environment.
The session will provide actionable insights into creating a robust inclusive benefits strategy, emphasizing the importance of understanding the diverse needs of employees and tailoring benefits to support different demographics, family styles, and generations. Don’t miss this opportunity to revolutionize your approach to employee benefits and make a lasting impact on your workplace culture.
Key Audience Takeaways:
• Understanding how DEI intersects with Employee Benefits
• Creating an inclusive benefits strategy
• Affordable and inclusive benefit offerings
• Overcoming resistance and challenges
• Strategies for implementation
• Measuring success and impact
Mapping Digital Transformation: Retail’s Strategic ShiftAggregage
https://www.onlineretailtoday.com/frs/26085324/mapping-digital-transformation--retail-s-strategic-shift
Digital transformation in retail is so much more than new technology. You need to get your whole organization, from entry-level workers to executives, on board with the new tech, new skills, and culture changes that digital transformation brings. Leading this mindset shift can be a daunting task… but that’s where this webinar comes in!
Join our panel of experts as they guide you through the challenges of digital transformation, preparing you to avoid common mistakes and make the most of incredible opportunities.
This session will cover:
• How to prepare your organization for the changes and challenges that come with digital transformation
• Gaining support for transformation from key stakeholders
• Creating a timeline, defining success, assessing your skills, and rallying your team for the journey ahead
• Navigating the complex, challenging, and growth-enabling path of transitioning your business to a new tech architecture
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
"Impact of front-end architecture on development cost", Viktor TurskyiFwdays
I have heard many times that architecture is not important for the front-end. I have also seen many times how developers implement front-end features just by following a framework's standard rules, believing that this is enough to launch the project successfully, and then the project fails. How can this be prevented, and which approach should you choose? I have launched dozens of complex projects, and during the talk we will analyze which approaches have worked for me and which have not.
Search and Society: Reimagining Information Access for Radical FuturesBhaskar Mitra
The field of Information Retrieval (IR) is currently undergoing a transformative shift, at least partly due to the emerging applications of generative AI to information access. In this talk, we will deliberate on the sociotechnical implications of generative AI for information access. We will argue that there is both a critical necessity and an exciting opportunity for the IR community to re-center our research agendas on societal needs, while dismantling the artificial separation between work on fairness, accountability, transparency, and ethics in IR and the rest of IR research. Instead of adopting a reactionary strategy of trying to mitigate potential social harms from emerging technologies, the community should aim to proactively set the research agenda for the kinds of systems we should build, inspired by diverse, explicitly stated sociotechnical imaginaries. The sociotechnical imaginaries that underpin the design and development of information access technologies need to be explicitly articulated, and we need to develop theories of change in the context of these diverse perspectives. Our guiding future imaginaries must be informed by other academic fields, such as democratic theory and critical theory, and should be co-developed with social science scholars, legal scholars, civil rights and social justice activists, and artists, among others.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology pushes into IT, I have been wondering, as an "infrastructure container Kubernetes guy", how this fancy AI technology gets managed from an infrastructure operations point of view. Is it possible to apply our lovely cloud-native principles as well? What benefits could the two technologies bring to each other?
Let me take these questions and guide you on a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premise strategy we may need to get it working on our own infrastructure from an enterprise perspective. I want to give an overview of infrastructure requirements and technologies, and of what could benefit or limit your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I have already got working for real.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
- See how to accelerate model training and optimize model performance with active learning
- Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
- Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, the aspects they look for in a new TV, and their TV buying preferences.
JMeter webinar - integration with InfluxDB and GrafanaRTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if sometime changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Let's dive deeper into the world of ODC! Ricardo Alves (OutSystems) will join us to tell all about the new Data Fabric. After that, Sezen de Bruijn (OutSystems) will get into the details on how to best design a sturdy architecture within ODC.
2. Have a question or comment for our panelists? Use this QR code to engage with our speakers, or visit the link in the chat!
Having an audio issue? Try dialing in by phone!
Dial: +1 312 626 6799
Webinar ID: 819 1469 6007
Passcode: 385318
Closed Captioning is available for this webinar!
3. Our Panelists
Tony Karrer, Founder & CEO TechEmpower, Founder & CTO Aggregage
Greg Loughnane, Founder & CEO of AI Makerspace
Chris Alexiuk, Co-Founder & CTO at AI Makerspace
5. BY THE END OF TODAY...
- Understand processes for building and improving production LLM applications
- Overview of industry-standard tooling
- How to leverage LangSmith
6. OVERVIEW
- LLM Ops, LLM OS, and “The New Stack”
- Leading Tooling
- Meet LangSmith
- Conclusions, Q&A
44. 🧩3 EASY PIECES TO RETRIEVAL
1. Ask a question
2. Search the database for content similar to the question
3. Return that content
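The three steps above can be sketched in a few lines of plain Python. This is a toy illustration, not library code: word overlap stands in for real embedding similarity, and the database and question are made up for the example.

```python
import re

def search(question: str, database: list[str], k: int = 2) -> list[str]:
    """1. Take a question, 2. score each stored chunk by word overlap
    with it, 3. return the k most similar chunks."""
    q_words = set(re.findall(r"\w+", question.lower()))
    scored = sorted(
        database,
        key=lambda doc: len(q_words & set(re.findall(r"\w+", doc.lower()))),
        reverse=True,
    )
    return scored[:k]

db = ["Ryan was a data scientist at Acme.",
      "The weather in Paris is mild.",
      "Ryan later founded a startup."]
results = search("Who was Ryan?", db, k=2)
print(results)
```

A real system replaces the overlap score with vector similarity, as the following slides show.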
45. 📇INDEX (THE DATABASE)
1. Split docs into chunks
2. Create embeddings for each chunk
3. Store embeddings in a vector store index
[Diagram: Raw Source Documents → Chunked Documents → Embeddings (e.g., [0.1, 0.4, -0.6, ...]) → Vector Store Index]
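The indexing steps can be sketched as follows. The embedding here is a deliberately crude stand-in (hashing words into a fixed-size vector); in practice an embedding model produces the vectors. All document text is invented for the example.

```python
import re

def chunk(text: str, size: int = 40) -> list[str]:
    # 1. Split a document into fixed-size character chunks.
    return [text[i:i + size] for i in range(0, len(text), size)]

def embed(text: str, dim: int = 16) -> list[float]:
    # 2. Toy embedding: hash each word into a bucket of a fixed-size vector.
    vec = [0.0] * dim
    for word in re.findall(r"\w+", text.lower()):
        vec[hash(word) % dim] += 1.0
    return vec

# 3. The "vector store index": a list of (embedding, chunk) pairs.
index = []
for doc in ["Ryan was a data scientist.", "Paris has mild weather."]:
    for c in chunk(doc):
        index.append((embed(c), c))
print(len(index), "chunks indexed")
```

Real vector stores add approximate-nearest-neighbor search on top of this structure, but the data layout is the same idea: one vector per chunk, stored alongside the chunk it came from.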
46. 🐕RETRIEVERS
[Diagram: a Query (INPUT) is embedded (e.g., [0.1, 0.4, -0.6, ...]), the Vector Store Index of chunked-document embeddings is searched for nearest neighbors, and matching context is returned from each source.]
47. [Diagram: the Query ("Query...") enters the App Logic as INPUT, the Embedding Model converts it to a vector (e.g., [0.1, 0.4, -0.6, ...]), and the Vector Database finds nearest neighbours by cosine similarity, returning matches such as "Ryan was ...".]
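The nearest-neighbour step can be written out directly: cosine similarity between the query vector and each stored vector, highest score wins. The vectors and chunks below are illustrative, not real embeddings.

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity: dot product of the vectors over the
    # product of their magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def nearest(query_vec, store, k=1):
    # store: list of (vector, chunk) pairs; rank by similarity to the query.
    return sorted(store, key=lambda item: cosine(query_vec, item[0]),
                  reverse=True)[:k]

store = [([0.1, 0.4, -0.6], "Ryan was a data scientist."),
         ([0.2, 0.3, -0.4], "Paris has mild weather."),
         ([0.8, 0.3, -0.1], "The index holds embeddings.")]
print(nearest([0.1, 0.4, -0.6], store, k=1)[0][1])
```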
48. [Diagram: as in slide 47, the Query is embedded ([0.1, 0.4, -0.6, ...]) and nearest neighbours are found by cosine similarity in the Vector Database; Prompt Templates then combine the results with the query before the Chat Model.]
The prompt template:
Use the provided context to answer the user's query.
You may not answer the user's query unless there is specific
context in the following text.
If you do not know the answer, or cannot answer, please respond
with "I don't know".
Context:
{context}
User Query:
{user_query}
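Filling that template is just string substitution; plain `str.format` is used here as a stand-in for a prompt-template library. The context and query values are invented for the example.

```python
# The prompt template from the slide, with {context} and {user_query} slots.
PROMPT = """Use the provided context to answer the user's query.
You may not answer the user's query unless there is specific
context in the following text.
If you do not know the answer, or cannot answer, please respond
with "I don't know".

Context:
{context}

User Query:
{user_query}"""

# Substitute the retrieved context and the user's question into the slots.
filled = PROMPT.format(
    context="Ryan was a data scientist at Acme.",
    user_query="What did Ryan do?",
)
print(filled)
```

The "I don't know" instruction is what keeps the chat model from answering outside the retrieved context.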
49. [Diagram: the Query ("Query", INPUT) flows through App Logic to the Embedding Model ([0.1, 0.4, -0.6, ...]); the Vector Database finds nearest neighbours (cosine similarity), the Vector Store returns the matching document(s) ("Ryan was ...", Context refs 1–4), and the Prompt Templates insert them as {context} alongside the {user_query} for the Chat Model, using the same prompt template as slide 48.]
50. [Diagram: the same flow as slide 49, now completed: the Chat Model produces an Answer as OUTPUT.]
51. [Diagram: the same complete flow, now labeled: the retrieval half (Embedding Model → Vector Database → Vector Store) is Dense Vector Retrieval, and the prompt-plus-chat-model half is In-Context Learning.]
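The two halves of the diagram, Dense Vector Retrieval feeding In-Context Learning, can be wired together in one short sketch. This is a toy end-to-end illustration: the vectors and chunks are made up, and the chat model is a stub that just returns its prompt so the flow can be inspected.

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def rag_answer(query_vec, store, user_query, chat_model):
    # Dense Vector Retrieval: pick the chunk nearest the query embedding.
    context = max(store, key=lambda item: cosine(query_vec, item[0]))[1]
    # In-Context Learning: stuff the retrieved chunk into the prompt.
    prompt = ("Use the provided context to answer the user's query.\n"
              f"Context:\n{context}\n\nUser Query:\n{user_query}")
    return chat_model(prompt)

store = [([0.1, 0.4, -0.6], "Ryan was a data scientist."),
         ([0.8, 0.3, -0.1], "Paris has mild weather.")]
# Stub model: a real chat model would generate an answer from this prompt.
echo_model = lambda prompt: prompt
out = rag_answer([0.1, 0.4, -0.6], store, "What did Ryan do?", echo_model)
print(out)
```

Swapping the stub for a real chat-model call (and the toy store for a real vector database) turns this skeleton into the full pipeline on the slide.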
65. OpenAI RAG Flow
1. Search the OpenAI blog for top-k resources, rerank
2. Ask specific questions related to the content
3. Return answers to the questions with sources
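The "search then rerank" first step can be sketched as a two-stage ranker. This is hypothetical illustration code: the real flow searches the OpenAI blog and would use a reranking model, while here a cheap word-overlap pass is followed by a finer tie-breaking pass over the invented documents below.

```python
import re

def first_pass(query: str, docs: list[str], k: int = 3) -> list[str]:
    # Stage 1: cheap recall-oriented search returning the top-k candidates.
    q = set(re.findall(r"\w+", query.lower()))
    return sorted(docs,
                  key=lambda d: len(q & set(re.findall(r"\w+", d.lower()))),
                  reverse=True)[:k]

def rerank(query: str, candidates: list[str]) -> list[str]:
    # Stage 2: stand-in reranker. It re-scores only the candidates,
    # breaking overlap ties by preferring shorter (denser) documents,
    # mimicking a cross-encoder's more precise second pass.
    q = set(re.findall(r"\w+", query.lower()))
    return sorted(candidates,
                  key=lambda d: (-len(q & set(re.findall(r"\w+", d.lower()))),
                                 len(d)))

docs = ["Embeddings map text to vectors.",
        "Embeddings power semantic search over text and vectors in practice.",
        "GPT-4 is a chat model."]
top = rerank("text embeddings", first_pass("text embeddings", docs, k=2))
print(top[0])
```

The point of the two stages is cost: the first pass is cheap and runs over everything; the expensive reranker only sees the top-k survivors.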
74. THE AGE OF THE AI ENGINEER
“A wide range of AI tasks that used to take 5 years and a research team to accomplish in 2013, now just require API docs and a spare afternoon in 2023.”
It is now possible to build what used to take months in a single day!
76. CONCLUSIONS
- Best-practice tools are out there! LangSmith-like tooling is the most comprehensive.
- Building: Prompt Engineering, RAG, Fine-Tuning
- Improvement: depends on Building! Eval varies.
- Lots of work for data scientists and AI Engineers!
77. Q&A
Tony Karrer, Founder & CEO TechEmpower, Founder & CTO Aggregage: /in/tonykarrer/ | aggregage.com
Dr. Greg Loughnane, Founder & CEO of AI Makerspace: /in/gregloughnane/ | aimakerspace.io
Chris Alexiuk, Co-Founder & CTO at AI Makerspace: /in/csalexiuk/ | aimakerspace.io
Tara Dwyer, Webinar Manager: /in/taradwyer/ | artificialintelligencezone.com
JOIN THE GENERATIVE AI FOR TECHNOLOGY LEADERS LINKEDIN GROUP FOR THOUGHTFUL DISCUSSION AND Q&A! VISIT THE LINK OR SCAN THE QR CODE!
bit.ly/genaitechleaders