This training aims to give an overview of what Entity Framework is and to provide tips, tricks, and links to help you improve the way you work with it.
Microsoft Entity Framework is an object-relational mapper that bridges the gap between object-oriented programming languages and relational databases. The presentation introduced Entity Framework, discussed its architecture including the conceptual data model and entity data model, and demonstrated CRUD operations and other core functionality. It also provided an overview of Entity Framework's history and versions.
This is an introduction session about Microsoft Entity Framework 4.0 (year 2011); since then the technology has evolved and matured in many ways, and some of the limitations have been mitigated.
This document provides an overview of Entity Framework 4 and how to use it in different scenarios. It discusses Code First, Model First, and Database First approaches. For Code First, it demonstrates how to create data classes and contexts and add a connection string. For Model First, it shows how to create an entity data model, generate a database from the model, and use the model in an MVC project. It also discusses how to generate models from an existing database. Finally, it demonstrates integrating Entity Framework with an MVC 3 and Razor project, including adding controllers and views to display and edit data.
VSUG Day 2010 Summer - Using ADO.NET Entity Framework (Atsushi Fukui)
This presentation slide describes the Microsoft .NET Framework 4 ADO.NET Entity Framework and was used for VSUG Day 2010 Summer in Japan.
VSUG (Visual Studio Users Group)
The document provides an overview of Entity Framework 4.0. It discusses the history of data access frameworks leading up to EF4, key features of EF4 like the Entity Data Model (EDM), and patterns for developing with EF4 such as repositories, unit of work, and POCO objects. It also covers querying EF4 models using LINQ to Entities, testing with EF4 through interfaces like IObjectSet, and enabling lazy loading for related entities.
This document discusses the evolution of data access from 1990 to 2010, focusing on object-relational mapping (ORM) techniques. It provides an overview of ORM as an abstraction technique for working with relational data as objects. The document outlines several ORM options available for .NET developers and describes Microsoft's strategic ORM technologies - LINQ to SQL and the ADO.NET Entity Framework. It provides details on Entity Framework's Entity Data Model and how to consume an EDM to query and manage data.
Getting started with Entity Framework 6 Code First using MVC 5 (Ehtsham Khan)
This document summarizes steps for creating an ASP.NET MVC 5 application using Entity Framework 6 Code First to access data. It describes creating a data model with Student, Enrollment, and Course entities, a database context, and test data initialization. It also covers creating a Student controller and views to display and manage student data, adding basic CRUD functionality, sorting, filtering, paging, and connection resiliency.
ADO.NET is a data access technology that allows applications to connect to and manipulate data from various data sources. It provides a common object model for data access that can be used across different database systems through data providers. The core objects in ADO.NET include the Connection, Command, DataReader, DataAdapter and DataSet. Data can be accessed in ADO.NET using either a connected or disconnected model. The disconnected model uses a DataSet to cache data locally, while the connected model directly executes commands against an open connection.
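A minimal sketch of the two models described above, assuming SQL Server and a hypothetical Products table (the connection string and column names are illustrative, not from the deck):

```csharp
// Sketch of connected vs disconnected data access in ADO.NET.
using System;
using System.Data;
using System.Data.SqlClient;

class AdoNetDemo
{
    const string ConnStr = "Server=.;Database=Shop;Integrated Security=true";

    // Connected model: a DataReader streams rows over an open connection.
    static void ReadConnected()
    {
        using (var conn = new SqlConnection(ConnStr))
        using (var cmd = new SqlCommand("SELECT Id, Name FROM Products", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine($"{reader.GetInt32(0)}: {reader.GetString(1)}");
            } // the connection stays open only while reading
        }
    }

    // Disconnected model: a DataAdapter fills a DataSet, then the connection closes.
    static DataSet ReadDisconnected()
    {
        var ds = new DataSet();
        using (var adapter = new SqlDataAdapter("SELECT Id, Name FROM Products", ConnStr))
        {
            adapter.Fill(ds, "Products"); // opens and closes the connection itself
        }
        return ds;                        // data is now cached locally in memory
    }
}
```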
ADO.NET provides a set of classes for working with data in .NET applications. It offers improvements over ADO such as support for disconnected data access, XML transport of data, and a programming model designed for modern applications. The core classes of ADO.NET include the Connection class for establishing a connection to a data source, the Command class for executing queries and stored procedures, the DataReader class for sequential access to query results, and the DataAdapter class for populating a DataSet and updating data in the data source. Developers use ADO.NET to connect to databases, retrieve data using DataAdapters, generate DataSets to store and manipulate the data, and display it using list-bound controls such as DropDownLists.
This document provides an overview of Entity Framework (EF), an object-relational mapping (ORM) framework that allows .NET applications to access and manipulate relational data as objects. It discusses EF concepts like the DbContext class, entity classes, associations, and change tracking. It demonstrates basic EF workflows and shows how to perform CRUD operations, execute LINQ queries, extend entity classes, and attach/detach objects. The document also provides homework assignments related to using EF with the Northwind sample database.
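A minimal sketch of the workflow such a deck demonstrates, assuming EF6 and a hypothetical Product entity (the context, entity, and connection are illustrative):

```csharp
// EF sketch: a DbContext, an entity class, CRUD, and a LINQ query.
using System;
using System.Data.Entity; // EF6
using System.Linq;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Product> Products { get; set; }
}

class Program
{
    static void Main()
    {
        using (var db = new ShopContext())
        {
            // Create
            db.Products.Add(new Product { Name = "Keyboard", Price = 29m });
            db.SaveChanges();

            // Read with LINQ to Entities (translated to SQL by EF)
            var cheap = db.Products.Where(p => p.Price < 50m)
                                   .OrderBy(p => p.Name)
                                   .ToList();

            // Update: change tracking detects the modified property
            cheap.First().Price = 25m;
            db.SaveChanges();

            // Delete
            db.Products.Remove(cheap.First());
            db.SaveChanges();
        }
    }
}
```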
Entity Framework and Domain Driven Design (Julie Lerman)
Given at Oredev 2013 (Nov 2013 in Malmo, Sweden). This presentation is about the intersection of Entity Framework (EF) and Domain Driven Design (DDD) and gives pointers about *not* worrying about EF when implementing your domain in code, and what you can expect when it's time to implement the persistence layer. There is a video of me giving this presentation on Vimeo at http://vimeo.com/78893724
Gatorade Case Study: Fueling fast and future-focused brand insights through a... (ZappiStore)
Having sponsored several major sporting events over the past year, including the NBA playoffs, the team behind the Gatorade sports drink needed to reaffirm that their strategy of targeting competitive athletes at sports events was an effective route to reach the general market.
To answer these research questions, the Snapshot tool deployed by Added Value and ZappiStore is presented.
Check out how they used this tool to make strategic decisions.
If you want to learn more about this case study, check this article: http://goo.gl/MPIrHw
More information on the product : https://www.zappistore.com
This document provides an overview of ADO.NET, which is a set of classes in the .NET Framework that allows developers to access and manipulate data. It discusses the connected and disconnected architectures in ADO.NET using connection, command, data reader, data adapter, and dataset objects. The connected architecture relies on an open connection to the database, while the disconnected architecture allows caching data in memory for offline access using datasets.
This document discusses how data is represented in computer systems. It covers basic units of data like bits and bytes and larger units like kilobytes and megabytes. It also explains binary and hexadecimal number systems. Additionally, it discusses how other data types like characters, images, sound, and computer instructions are represented and stored in binary format. Key concepts covered include character sets, pixels, metadata, sample rates, bit rates, opcodes, and operands.
This document covers key concepts related to computer communications and networking. It defines standalone computers and networks, and describes local area networks (LANs) and wide area networks (WANs). The document outlines common network hardware, topologies like bus, ring and star, and addresses like IP addresses and MAC addresses. It also discusses internet hardware, protocols, security, policies, and file formats for sharing information over networks and the internet. Concepts like domains, packets, compression and HTML are introduced for understanding how data is transmitted and retrieved online.
The document provides an overview of ADO.NET, which is Microsoft's data access technology for .NET applications to connect to and manipulate data in various data stores. It discusses key ADO.NET concepts like connections, commands, data readers, data adapters, datasets and how they are used to work with different data providers like SQL Server, OLE DB, and ODBC. It also covers data binding using data grids and filtering data views.
This document provides an overview of ADO.NET compared to ADO and describes the main objects used in ADO.NET for data access like the Connection, Command, DataReader, DataAdapter, DataSet and DataView objects. It discusses how ADO.NET uses a disconnected model with the DataSet object to cache and manage data across tiers compared to ADO's coupled model. The document also includes code examples of creating a DataReader and populating a DataSet using a DataAdapter.
The document provides information about ADO.NET, which is a data access technology that enables applications to connect to data stores and manipulate data. It discusses key ADO.NET concepts like the object model, different classes like DataSet, DataAdapter, and DataReader. It also covers how to work with ADO.NET in a connected or disconnected manner, use parameters, and perform basic data operations like selecting, inserting, updating and deleting data.
This chapter covers programming with data in databases using ADO.NET. It discusses accessing and modifying database data declaratively using data source controls or programmatically using ADO.NET classes like DbConnection, DbCommand, and DbDataReader. It also covers filling DataSet objects with data from databases using DbDataAdapter and executing transactions across multiple databases.
ADO.NET is a set of libraries included with the .NET Framework that help communicate with various data stores from .NET applications. The ADO.NET libraries include classes for connecting to a data source, submitting queries, and processing results. ADO.NET also allows for disconnected data access using objects like the DataSet which allows data to be cached and edited offline. The core ADO.NET objects include connections, commands, data readers, data adapters and data sets which provide functionality similar to but also improvements over ADO.
The document discusses ADO.NET and how it provides disconnected data access through the use of datasets, data adapters, and data providers. It covers the core ADO.NET objects like connection, command, data reader, and data adapter. It provides examples of loading data from databases into datasets using data adapters and binding datasets to controls for display and editing. The .NET framework supports multiple data providers for different database systems like SQL Server, Oracle, OLE DB, and ODBC.
The document introduces Windows Presentation Foundation (WPF) as a new graphical display system for Windows applications. It discusses key WPF features like vector graphics, rich text, animation, audio/video support, styles/templates, commands, and using XAML for declarative UI definition. XAML separates the user interface definition from business logic code and allows different teams to work on UI and code simultaneously.
Answers the following questions:
What is prototyping?
What are the different types of prototypes?
What is it used for?
How do you prototype for usability testing?
The Reactive Extensions (Rx) is a library for composing asynchronous and event-based programs using observable sequences and LINQ-style query operators. Here is an overview of Rx with examples at the end.
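A minimal sketch of an Rx observable pipeline, assuming the System.Reactive NuGet package (the interval source and the chosen operators are illustrative):

```csharp
// Rx sketch: an observable sequence filtered and projected with
// LINQ-style query operators.
using System;
using System.Reactive.Linq;

class RxDemo
{
    static void Main()
    {
        IObservable<long> ticks = Observable.Interval(TimeSpan.FromMilliseconds(100));

        using (ticks.Where(i => i % 2 == 0)     // keep even ticks
                    .Select(i => $"tick {i}")   // project each value to a string
                    .Take(3)                    // complete after three items
                    .Subscribe(
                        s => Console.WriteLine(s),
                        () => Console.WriteLine("done")))
        {
            Console.ReadLine();                 // keep the app alive while subscribed
        }
    }
}
```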
Why another test framework in .NET? In this presentation, I will try to convince you to switch to xUnit. Main concepts and extensibility points are covered here. Happy testing!
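A minimal sketch of the two basic xUnit test styles, a `[Fact]` for a single case and a `[Theory]` for data-driven cases (the addition scenario is illustrative):

```csharp
// xUnit sketch: Facts run once; Theories run once per InlineData row.
using Xunit;

public class CalculatorTests
{
    [Fact]
    public void Add_ReturnsSum()
    {
        Assert.Equal(4, 2 + 2);
    }

    [Theory]
    [InlineData(1, 2, 3)]
    [InlineData(-1, 1, 0)]
    public void Add_ReturnsSum_ForManyInputs(int a, int b, int expected)
    {
        Assert.Equal(expected, a + b);
    }
}
```

Note that unlike some older frameworks, xUnit needs no `[TestFixture]`-style class attribute; any public class can hold tests.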
A really quick introduction to Microsoft Azure Storage and all of its services. It's one of the core components of Azure and it's really important to understand it if you want to "move to the cloud".
This document provides an overview of Akka.NET, an actor model framework for .NET. It discusses how Akka.NET uses message passing with immutable messages to build distributed and concurrent applications. It also covers key Akka.NET concepts like actors, actor systems, supervision strategies, and plugins for clustering, persistence and remoting.
'Scenario Driven Design' allows programmers to make more usable APIs and avoid performance issues. REST principles are often misunderstood, and programmers expose their raw data model without any logic. Think about your scenarios first!
The document provides tips for effectively managing email in Outlook. It recommends using only 3 folders - Inbox, Reference, and Personal. Categories should be set up for emails like @Read and @Waiting to help with organization. Search folders allow filtering emails in different categories. The four D's model - Do, Delegate, Defer, Delete - is presented as a decision-making framework for handling emails. Calendar, tasks, and rules are also discussed as tools for staying organized. Questions can be directed to the presenter, Clive, by email.
Performance doesn’t have the same definition for system administrators, developers, and business teams. What is performance? High CPU usage, a web site that doesn’t scale, a low business transaction rate per second, slow response times, … This presentation is about maths, code performance, load testing, web performance, best practices, … Working on performance optimization is a very broad topic. It’s important to really understand the main concepts and to have a clean and strong methodology, because it can be a very time-consuming activity. Happy reading!
Because we are not only shipping code, and we are no longer Microsoft developers but .NET developers, it's time to open your mind and see what the OSS world has to offer.
Docker is an amazing tool.
Docker popularized containers and brought a way to manage them.
OK, that seems cool, but why do developers care?
- Static application environment: we know exactly what we are running
- Repeatable, runnable artifact: we can deploy everywhere, anytime
- Loosely coupled: we can manage, isolate, and compose at environment level easily
Please have a look at this Betclic presentation and remember that the .NET CLR is coming to the GNU/Linux world!
Flyway is a lightweight database migration tool:
- It migrates the database from a list of SQL migration scripts (schemas and data).
- Each script is prefixed by a version number that determines the version of the database.
- The execution trace of the scripts is saved in a "schema_version" table.
- It automatically finds which scripts to execute to upgrade a database to a specific version.
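The versioned-script convention above can be sketched with two hypothetical migration files (the file names follow Flyway's `V<version>__<description>.sql` pattern; the table and columns are illustrative):

```sql
-- V1__create_customer.sql : the first versioned migration
CREATE TABLE customer (
    id   INT PRIMARY KEY,
    name VARCHAR(100) NOT NULL
);

-- V2__add_email.sql : Flyway applies this only to databases still at version 1
ALTER TABLE customer ADD email VARCHAR(255);
```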
NDepend is a static analysis tool for .NET managed code. This tool supports a large number of code metrics, allows for visualization of dependencies using directed graphs and dependency matrix. The tools also performs code base snapshots comparison, and validation of architectural and quality rules.
This document summarizes Jurgen Appelo's book "Management 3.0" and provides examples of management workout exercises. It discusses that Management 1.0 is bad management, while Management 2.0 tries to do the right thing but fails due to a lack of understanding of social systems. Management 3.0 does the right thing through good understanding. Seventeen management workout themes are then outlined that support engaging people, improving systems, and delighting clients.
A mixed introduction of Lean and Agile concepts targeted at business audience, presenting 3 key lean concepts (MVP, short feedback loop, cost of delay).
The document discusses features and changes in ASP.NET vNext, the future version of ASP.NET. It describes how vNext uses project.json for dependencies instead of references, allows editing code without recompiling, and merges MVC, Web API and Web Pages into a single framework. It also discusses tools for building, running and deploying vNext applications in Visual Studio 2015 and how the runtime will be more modular and cross-platform compared to previous versions of ASP.NET.
Since their introduction in C#, the async/await concepts are still misunderstood by many developers.
Async programming tries to solve three problems (offloading, concurrency, scalability) behind a single abstraction.
This presentation is a good starting point for asynchronous programming in .NET. There are many links and references, so do not hesitate to go deeper.
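A minimal sketch of the async/await style such a session introduces; the URL is illustrative, and the key point is that no thread is blocked while the I/O is in flight:

```csharp
// async/await sketch: the await yields control until the response
// arrives, instead of blocking a thread on the network call.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class AsyncDemo
{
    static async Task Main()
    {
        using (var http = new HttpClient())
        {
            // Control returns to the caller here until the download completes.
            string body = await http.GetStringAsync("https://example.com");
            Console.WriteLine($"Received {body.Length} characters");
        }
    }
}
```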
This document discusses mobile UX trends from October 2014. It covers interfaces, use of space and content, colors, pictures and effects, gestures, and animations. Specific trends mentioned include simplified interfaces focusing on key actions, use of layered and circular interface elements, infographics, blurred backgrounds, large images, swipe gestures, and animations that guide users without overusing motion effects. Examples are provided for many of these trends from apps like FIFA, Airbnb, Vine, and Google Glass. Guidelines are also referenced from Apple, Android, Windows, and other sources.
The Model View ViewModel (MVVM) is an architectural pattern originated by Microsoft as a specialization of the Presentation Model (Martin Fowler). Similar to MVC, MVVM is suitable for client applications (Xaml-based, Xamarin, SPA, ...) because it facilitates a clear separation between the UI and the Business Logic. Examples with WPF, MvvmCross, AngularJs. It also contains solutions for common use cases.
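A minimal sketch of the ViewModel side of the pattern, assuming a XAML-based view that data-binds to it (the class and property names are illustrative):

```csharp
// MVVM sketch: the ViewModel exposes state through INotifyPropertyChanged
// so the bound view refreshes itself, keeping UI and business logic apart.
using System.ComponentModel;
using System.Runtime.CompilerServices;

public class CustomerViewModel : INotifyPropertyChanged
{
    private string _name;

    public string Name
    {
        get => _name;
        set
        {
            if (_name == value) return;
            _name = value;
            OnPropertyChanged(); // any binding on Name updates in the view
        }
    }

    public event PropertyChangedEventHandler PropertyChanged;

    protected void OnPropertyChanged([CallerMemberName] string property = null)
        => PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(property));
}
```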
Recommendations are everywhere : music, movies, books, social medias, e-commerce web sites… The Web is leaving the era of search and entering one of discovery. This quick introduction will help you to understand this vast topic and why you should use it.
In one of our weekly training, we’ve talked about Git. Here is a quick overview of the main concepts, basic commands and branching strategy, how to work with Git, how to contribute to an OSS project, …
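As a minimal sketch of the basic commands and branching workflow mentioned above (the branch and file names are illustrative):

```shell
# Create a repository, commit on a topic branch, then merge it back.
git init demo && cd demo
git config user.email "demo@example.com"   # identity needed to commit
git config user.name  "Demo"
git commit --allow-empty -m "initial commit"
trunk=$(git symbolic-ref --short HEAD)     # 'master' or 'main', depending on config
git checkout -b feature/notes              # start a topic branch
echo "remember the milk" > notes.txt
git add notes.txt
git commit -m "add notes"
git checkout "$trunk"                      # back to the trunk
git merge feature/notes                    # fast-forward merge
```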
This document provides an overview of AngularJS best practices, covering topics such as file organization, naming conventions, modules, controllers, services, directives, and scope. It discusses organizing code by feature and type, using namespacing prefixes, understanding modules and their organization, defining controller, service and directive roles, communicating between components, avoiding FOUC, and thinking declaratively. Specific practices are covered for minification, services creation, directives usage, scope interfaces, and controllers versus link functions.
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slack (shyamraj55)
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Ocean Lotus Threat Actors project by John Sitima, 2024 (SitimaJohn)
Ocean Lotus cyber threat actors represent a sophisticated, persistent, and politically motivated group that poses a significant risk to organizations and individuals in the Southeast Asian region. Their continuous evolution and adaptability underscore the need for robust cybersecurity measures and international cooperation to identify and mitigate the threats posed by such advanced persistent threat groups.
Webinar: Designing a schema for a Data Warehouse (Federico Razzoli)
Are you new to data warehouses (DWH)? Do you need to check whether your data warehouse follows the best practices for a good design? In both cases, this webinar is for you.
A data warehouse is a central relational database that contains all measurements about a business or an organisation. This data comes from a variety of heterogeneous data sources, which includes databases of any type that back the applications used by the company, data files exported by some applications, or APIs provided by internal or external services.
But designing a data warehouse correctly is a hard task, which requires gathering information about the business processes that need to be analysed in the first place. These processes must be translated into so-called star schemas, that is, denormalised databases where each table represents a dimension or facts.
We will discuss these topics:
- How to gather information about a business;
- Understanding dictionaries and how to identify business entities;
- Dimensions and facts;
- Setting a table granularity;
- Types of facts;
- Types of dimensions;
- Snowflakes and how to avoid them;
- Expanding existing dimensions and facts.
UiPath Test Automation using UiPath Test Suite series, part 6 (DianaGray10)
Welcome to UiPath Test Automation using UiPath Test Suite series part 6. In this session, we will cover Test Automation with generative AI and Open AI.
The UiPath Test Automation with generative AI and Open AI webinar offers an in-depth exploration of leveraging cutting-edge technologies for test automation within the UiPath platform. Attendees will delve into the integration of generative AI, a test automation solution, with Open AI's advanced natural language processing capabilities.
Throughout the session, participants will discover how this synergy empowers testers to automate repetitive tasks, enhance testing accuracy, and expedite the software testing life cycle. Topics covered include the seamless integration process, practical use cases, and the benefits of harnessing AI-driven automation for UiPath testing initiatives. By attending this webinar, testers, and automation professionals can gain valuable insights into harnessing the power of AI to optimize their test automation workflows within the UiPath ecosystem, ultimately driving efficiency and quality in software development processes.
What will you get from this session?
1. Insights into integrating generative AI.
2. Understanding how this integration enhances test automation within the UiPath platform
3. Practical demonstrations
4. Exploration of real-world use cases illustrating the benefits of AI-driven test automation for UiPath
Topics covered:
What is generative AI
Test Automation with generative AI and Open AI.
UiPath integration with generative AI
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Taking AI to the Next Level in Manufacturing.pdfssuserfac0301
Read Taking AI to the Next Level in Manufacturing to gain insights on AI adoption in the manufacturing industry, such as:
1. How quickly AI is being implemented in manufacturing.
2. Which barriers stand in the way of AI adoption.
3. How data quality and governance form the backbone of AI.
4. Organizational processes and structures that may inhibit effective AI adoption.
6. Ideas and approaches to help build your organization's AI strategy.
Driving Business Innovation: Latest Generative AI Advancements & Success StorySafe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Introduction of Cybersecurity with OSS at Code Europe 2024Hiroshi SHIBATA
I develop the Ruby programming language, RubyGems, and Bundler, which are package managers for Ruby. Today, I will introduce how to enhance the security of your application using open-source software (OSS) examples from Ruby and RubyGems.
The first topic is CVE (Common Vulnerabilities and Exposures). I have published CVEs many times. But what exactly is a CVE? I'll provide a basic understanding of CVEs and explain how to detect and handle vulnerabilities in OSS.
Next, let's discuss package managers. Package managers play a critical role in the OSS ecosystem. I'll explain how to manage library dependencies in your application.
I'll share insights into how the Ruby and RubyGems core team works to keep our ecosystem safe. By the end of this talk, you'll have a better understanding of how to safeguard your code.
GraphRAG for Life Science to increase LLM accuracyTomaz Bratanic
GraphRAG for life science domain, where you retriever information from biomedical knowledge graphs using LLMs to increase the accuracy and performance of generated answers
Generating privacy-protected synthetic data using Secludy and MilvusZilliz
During this demo, the founders of Secludy will demonstrate how their system utilizes Milvus to store and manipulate embeddings for generating privacy-protected synthetic data. Their approach not only maintains the confidentiality of the original data but also enhances the utility and scalability of LLMs under privacy constraints. Attendees, including machine learning engineers, data scientists, and data managers, will witness first-hand how Secludy's integration with Milvus empowers organizations to harness the power of LLMs securely and efficiently.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware and post-processing.
van Emden shows how Nx can simplify the developer’s life and facilitate a rapid transition from concept to production-ready applications.He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
Salesforce Integration for Bonterra Impact Management (fka Social Solutions A...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on integration of Salesforce with Bonterra Impact Management.
Interested in deploying an integration with Salesforce for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdfChart Kalyan
A Mix Chart displays historical data of numbers in a graphical or tabular form. The Kalyan Rajdhani Mix Chart specifically shows the results of a sequence of numbers over different periods.
Digital Marketing Trends in 2024 | Guide for Staying AheadWask
https://www.wask.co/ebooks/digital-marketing-trends-in-2024
Feeling lost in the digital marketing whirlwind of 2024? Technology is changing, consumer habits are evolving, and staying ahead of the curve feels like a never-ending pursuit. This e-book is your compass. Dive into actionable insights to handle the complexities of modern marketing. From hyper-personalization to the power of user-generated content, learn how to build long-term relationships with your audience and unlock the secrets to success in the ever-shifting digital landscape.
Monitoring and Managing Anomaly Detection on OpenShift.pdfTosin Akinosho
Monitoring and Managing Anomaly Detection on OpenShift
Overview
Dive into the world of anomaly detection on edge devices with our comprehensive hands-on tutorial. This SlideShare presentation will guide you through the entire process, from data collection and model training to edge deployment and real-time monitoring. Perfect for those looking to implement robust anomaly detection systems on resource-constrained IoT/edge devices.
Key Topics Covered
1. Introduction to Anomaly Detection
- Understand the fundamentals of anomaly detection and its importance in identifying unusual behavior or failures in systems.
2. Understanding Edge (IoT)
- Learn about edge computing and IoT, and how they enable real-time data processing and decision-making at the source.
3. What is ArgoCD?
- Discover ArgoCD, a declarative, GitOps continuous delivery tool for Kubernetes, and its role in deploying applications on edge devices.
4. Deployment Using ArgoCD for Edge Devices
- Step-by-step guide on deploying anomaly detection models on edge devices using ArgoCD.
5. Introduction to Apache Kafka and S3
- Explore Apache Kafka for real-time data streaming and Amazon S3 for scalable storage solutions.
6. Viewing Kafka Messages in the Data Lake
- Learn how to view and analyze Kafka messages stored in a data lake for better insights.
7. What is Prometheus?
- Get to know Prometheus, an open-source monitoring and alerting toolkit, and its application in monitoring edge devices.
8. Monitoring Application Metrics with Prometheus
- Detailed instructions on setting up Prometheus to monitor the performance and health of your anomaly detection system.
9. What is Camel K?
- Introduction to Camel K, a lightweight integration framework built on Apache Camel, designed for Kubernetes.
10. Configuring Camel K Integrations for Data Pipelines
- Learn how to configure Camel K for seamless data pipeline integrations in your anomaly detection workflow.
11. What is a Jupyter Notebook?
- Overview of Jupyter Notebooks, an open-source web application for creating and sharing documents with live code, equations, visualizations, and narrative text.
12. Jupyter Notebooks with Code Examples
- Hands-on examples and code snippets in Jupyter Notebooks to help you implement and test anomaly detection models.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices desire to take full advantage of the features
available on those devices, but many of the features provide convenience and capability but sacrifice security. This best practices guide outlines steps the users can take to better protect personal devices and information.
2. Agenda
• A brief history…
• What’s an ORM?
• Entity Framework Architecture
• DB First / Model First and the (in)famous EDMX
• Inheritance
• Code First / Code Second
• Eager / Lazy & Explicit Loading
• Performance / Profiling
04/10/2013 - Entity Framework Training
3. A brief history…
EF releases and versioning are a bit of a mess :-)
• EF aka EF1 aka EF 3.5 was released with .NET 3.5 (VS2008)
• Basic ORM functionalities / DB first only
• EF 4 was released with .NET 4.0 (VS2010)
• POCO support / Lazy loading
• EF 4.1
• DBContext API / Code First / Nuget package
• EF 4.1.1 and then EF 4.2
• Mainly bug fixes
• EF 4.3
• Code First Migration
• EF 4.3.1
• Bug fixes / better LocalDB support
4. A brief history…
MS guys were the first to admit that this versioning was not clear, and that's why
they rationalized how they name and distribute their releases
• EF 5
• This release can be used in Visual Studio 2010 and Visual Studio 2012 to write
applications that target .NET 4.0 and .NET 4.5
• But when targeting .NET 4.5 you have enum support / table-valued functions /
performance improvements / multiple-diagrams per model
• EF 6
• Async Query and Save / Testability improvements / DbSet.AddRange &
RemoveRange / DbChangeTracker.HasChanges / Dependency Resolution / Code
First Mapping to Insert/Update/Delete Stored Procedures / …
Have a look at this post on the ADO.NET Blog where MS guys were calling for
feedback about versioning issues
7. EF Architecture
• Object Services
• This is where the DbContext lives; it represents the session of interaction between
the application and the data source. It provides the facilities for tracking changes
and managing identities, concurrency, and relationships, and for saving changes
back to the DB.
• EntityClient Data Provider
• This provider manages connections, translates entity queries into data source-
specific queries, and returns a data reader that the Entity Framework uses to
materialize entity data into objects.
• Data Providers
• This is the lowest layer, which translates LINQ to Entities queries, via a command
tree, into native SQL expressions and executes them against the specific DBMS
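Below the Object Services layer, queries ultimately flow through the EntityClient provider. As a minimal, hedged sketch of using that layer directly (the "MyEntities" connection string name and the Customers entity set are assumptions for illustration), an Entity SQL query against the conceptual model could look like this:

```csharp
using System;
using System.Data;
using System.Data.EntityClient; // System.Data.Entity.dll (EF 4.x era)

class EntityClientSketch
{
    static void Main()
    {
        // "name=MyEntities" points at a hypothetical EF connection string in app.config
        using (var connection = new EntityConnection("name=MyEntities"))
        {
            connection.Open();
            using (EntityCommand command = connection.CreateCommand())
            {
                // Entity SQL targets the conceptual model (CSDL), not the DB tables
                command.CommandText = "SELECT VALUE c FROM MyEntities.Customers AS c";
                // EntityCommand readers must be opened with SequentialAccess
                using (EntityDataReader reader =
                    command.ExecuteReader(CommandBehavior.SequentialAccess))
                {
                    while (reader.Read())
                    {
                        Console.WriteLine(reader["CompanyName"]);
                    }
                }
            }
        }
    }
}
```

In practice the EntityClient layer is rarely used directly; DbContext/ObjectContext sit on top of it and do this plumbing for you.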
9. DB model VS Conceptual model
• A DB model is designed to address storage problems. It's also optimized for
performance while fetching data.
• A Business object model is designed to handle business needs.
• They should not look like each other!
• Entity Framework allows you to keep them that way. Whichever approach you
choose, you'll be able to isolate both models and handle the mapping, well… in the
mapping layer ;-)
10. DB model VS Conceptual model
Please read this pretty interesting article where Rowan Miller and José A. Blakeley
answer the question: “Do we still have an impedance mismatch problem?”
To conclude, I’d like to quote that comment I found on an article about impedance
mismatch, that somehow summarizes some useless debates:
“RDBMS has its advantages, OOP has its advantages, and SOA also has its
advantages. These things aren't just useless extra "layers", they're tools which all
serve different purposes (RDBMS for working with large datasets, SOA for
implementing complicated business rules, OOP for writing more type-safe and
maintainable code).
If you stop viewing these concepts as being at war with each other and start
viewing them as different perspectives of the same solution, all the long-winded
arguments about OOP and ORM and whatever start to look a lot more like lame
excuses for ordinary human inertia (i.e. the overwhelming urge to keep doing
things the way you're doing them now, at any cost).” ;-)
11. The (in)famous EDMX
An EDMX file is just an XML file.
Just give that a try: right-click an EDMX file in Visual Studio, choose the “Open
with…” menu option, and select the XML (Text) Editor entry:
12. The (in)famous EDMX
As I said, EDMX is just an XML file:
An EDMX is a combination of 3 different parts that make up the whole thing:
• <edmx:StorageModels> aka the SSDL
• <edmx:ConceptualModels> aka the CSDL
• <edmx:Mappings> aka the MSL
…you can even edit those sections manually, but at your own risk! :-)
13. The (in)famous EDMX
By default, the EDMX is embedded in the project assembly it belongs to:
The connection string will look like this:
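As a sketch (model, catalog and server names are placeholders), the app.config entry for an embedded EDMX references the three metadata parts through res:// resource URIs:

```xml
<connectionStrings>
  <add name="MyEntities"
       connectionString="metadata=res://*/Model1.csdl|res://*/Model1.ssdl|res://*/Model1.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=.\SQLEXPRESS;Initial Catalog=MyDb;Integrated Security=True&quot;"
       providerName="System.Data.EntityClient" />
</connectionStrings>
```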
14. The (in)famous EDMX
It’s also possible to generate the CSDL, SSDL & MSL as 3 separate XML files by
changing the "Metadata Artifact Processing" property of your model to "Copy to
Output Directory".
Yes, it can be useful in some particular cases…
The connection string will then look as right below, did you notice the changes?
Last piece of advice: if you want to edit the XML files yourself, check out the
cookbook first! ;-)
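With "Copy to Output Directory", the metadata keyword points at the loose files on disk instead of embedded resources, which is the change to spot (names again are placeholders):

```xml
<connectionStrings>
  <add name="MyEntities"
       connectionString="metadata=.\Model1.csdl|.\Model1.ssdl|.\Model1.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=.\SQLEXPRESS;Initial Catalog=MyDb;Integrated Security=True&quot;"
       providerName="System.Data.EntityClient" />
</connectionStrings>
```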
15. The (in)famous EDMX
You might want to have a look at the Edmgen2 tool.
As written on the Edmgen2 homepage :
EdmGen2 is a command-line tool for the Microsoft ADO.NET Entity Framework.
EdmGen.exe can only read and write the CSDL, SSDL & MSL file formats. However,
EdmGen2.exe can read and write the EDMX file format used by the Visual Studio
design tools.
Additionally, EdmGen2.exe contains some experimental functionality not found in
EdmGen.exe, […] with the ability to identify inheritance relationships in a relational
database schema […] and construct a suitable model and mapping.
To be complete, have a look at this article to see the changes that were made to
the EDMX schema since V1 specifications to the current V3.
16. The (in)famous EDMX
The goal of this training is not to learn how to manipulate the Entity Data Model
Designer in Visual Studio, you certainly all know at least the basic features.
If you need to go into further details, have a look at the MSDN Library article called
“Modeling and Mapping with Entity Framework Designer”. It contains everything
you’ll need to map CSDL to SSDL like a chef!
By the way, now you should easily understand why the CSDL is not always updated
even when you run the “Update Model From DB Wizard” on the EDMX: it only picks
up new things added to the DB model. As I said, CSDL and SSDL must live their own
lives. Entity Framework cannot decide for you whether an entity property should be
removed when a column has been altered or deleted in the DB.
Every modification made to your DB schema should impact your MSL rather than
your CSDL.
17. The (in)famous EDMX
One thing you should know about the Entity Designer is that it has some limitations…
• Table-per-concrete class mapping.
• Unmapped abstract types. When you create an abstract entity type with the
Entity Designer, the type must be mapped to a table or view.
• Creating conditions on association mappings.
• Mapping associations directly to stored procedures. Mapping many-to-many
associations is not supported.
• Creating conditions on Function Import mappings.
• Annotations.
• Query views.
• Models that contain references to other models.
• Creating associations without corresponding navigation properties.
• Adding or editing storage model objects. (Deleting storage model objects is
supported.)
• Adding, editing, or deleting functions that are defined in the conceptual model.
But things are getting better; have a look at the J. Lerman article about the new
Entity Designer in VS2012, and more to come with VS2013 and EF6, stay tuned!
19. Inheritance
Let’s see how those mapping strategies work with the simple example from the
ADO.NET Blog.
The entities:
20. Inheritance
Table Per Hierarchy (TPH)
TPH inheritance uses one database table to maintain data for all of the entity types
in an inheritance hierarchy.
The table contains a column which I have named ‘Type’. This column acts as the
discriminator, which helps determine whether a bike is a TTBike or a
MountainBike.
21. Inheritance
Table Per Type (TPT)
In TPT there is a base table which stores all of the common information. There are
also one or more derived entities which store the attributes that are unique to that
derived entity.
The EF Designer uses the TPT strategy by default and so any inheritance in the
model will be mapped to separate tables.
22. Inheritance
Table Per Concrete Class (TPC)
Table per Concrete class creates a table for each derived (or concrete) entity and
does not create a table for the abstract base entity.
TPC is supported by the Entity Framework at runtime but is not supported by the
EF Designer. If you want to use TPC you have two options: use Code First, or
manually edit the EDMX file.
23. Inheritance
If you have a conceptual model with object inheritance, use
OfType<TResultType> to limit the query to results of a specific type.
foreach (var course in department.Courses.OfType<OnlineCourse>())
{
Console.WriteLine(" Online - {0}", course.Title);
}
Anyway, before going on and implementing one of those inheritance mapping
strategies, ask your DBA first! ;-)
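For reference, with Code First the same strategies can be chosen through the Fluent API. A minimal sketch using the bike entities from the example (class, table and column names are assumptions for illustration):

```csharp
using System.Data.Entity;

public abstract class Bike
{
    public int Id { get; set; }
    public string Name { get; set; }
}
public class TTBike : Bike { }
public class MountainBike : Bike { }

public class BikeContext : DbContext
{
    public DbSet<Bike> Bikes { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // TPH (the Code First default): one table with a 'Type' discriminator column
        modelBuilder.Entity<Bike>()
            .Map<TTBike>(m => m.Requires("Type").HasValue("TT"))
            .Map<MountainBike>(m => m.Requires("Type").HasValue("Mountain"));

        // TPT alternative: give each derived type its own table instead
        // modelBuilder.Entity<TTBike>().ToTable("TTBikes");
        // modelBuilder.Entity<MountainBike>().ToTable("MountainBikes");
    }
}
```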
24. Mapping
We just saw that it was kinda easy to map a DB model to an Object model even
when they are different (and should stay different).
And it’s even easier with the Code First / Code Second approaches! But we’ll see
that just after.
But there’s one important thing to mention: we all agree that the DB model should
not drive the way we design our Object model. But the reverse should not be the
case either!
If you modified your Object model, don’t take for granted the T-SQL scripts EF will
generate to update your DB accordingly.
Ask your DBA whether it’s ok before validating any changes in the DB schema!
They know better than EF ;-)
25. Code First / Code Second
The Code First approach lets you define your conceptual model and its mapping to
the DB model using code only. With Code First, you get rid of the EDMX!
What’s Code Second?
It’s pretty much the same, it means only that you can do it with an existing DB.
How?
Use the EF Power Tools and their Reverse Engineer Code First option!
By the way EF Power Tools offers other great features:
• Reverse Engineer Code First
• Customize Reverse Engineer Templates
• View Entity Data Model (Read-only) / XML / DDL SQL
• Generate (pre-compiled) views
Easy!
26. Code First / Code Second
Let’s see how Code First works…
Create your classes:
public class Blog
{
public int BlogId { get; set; }
public string Name { get; set; }
public virtual List<Post> Posts { get; set; }
}
public class Post
{
public int PostId { get; set; }
public string Title { get; set; }
public string Content { get; set; }
public int BlogId { get; set; }
public virtual Blog Blog { get; set; }
}
Create your context :
public class BloggingContext : DbContext
{
public DbSet<Blog> Blogs { get; set; }
public DbSet<Post> Posts { get; set; }
}
Yeah, it’s that easy!
By convention DbContext has created a database for you.
Store data
using (var db = new BloggingContext())
{
var blog = new Blog { Name = "MyBlog" };
db.Blogs.Add(blog);
db.SaveChanges();
}
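Reading the data back works the same way; a hedged sketch querying the same context with LINQ to Entities (assuming a `using System.Linq;` directive):

```csharp
using (var db = new BloggingContext())
{
    // The LINQ query is translated to SQL and runs on enumeration (ToList)
    var blogs = db.Blogs
                  .Where(b => b.Name == "MyBlog")
                  .OrderBy(b => b.Name)
                  .ToList();

    foreach (var blog in blogs)
    {
        Console.WriteLine("{0}: {1} post(s)", blog.Name, blog.Posts.Count);
    }
}
```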
27. Code First / Code Second - Conventions
EF was able to create a DB based on the model because it uses conventions!
• Primary Key Convention
• If a property on a class is named “ID” or <className>ID:
public int DepartmentID { get; set; }
• Type Discovery
• Your context exposes DbSet properties for the types that you want to be part of the model. Code
First will include these types and also will pull in any referenced types, even if the referenced
types are defined in a different assembly.
• If your types participate in an inheritance hierarchy, it is enough to define a DbSet property for
the base class, and the derived types will be automatically included, if they are in the same
assembly as the base class.
• Complex Types Convention
• If no primary key can be inferred, the type is automatically registered as a complex type
28. Code First / Code Second - Conventions
• Relationship Convention
• Any property with the same data type as the primary key and with a name like:
• <navigation property name><principal primary key property name>
• <principal class name><primary key property name>
• <principal primary key property name>
will represent a foreign key.
• Code First infers the multiplicity of the relationship based on the nullability of
the foreign key.
• If a foreign key on the dependent entity is not nullable, then Code First sets
cascade delete.

public class Department
{
    // Primary key
    public int DepartmentID { get; set; }
    public string Name { get; set; }
    // Navigation property
    public virtual ICollection<Course> Courses { get; set; }
}

public class Course
{
    // Primary key
    public int CourseID { get; set; }
    public string Title { get; set; }
    // Foreign key
    public int DepartmentID { get; set; }
    // Navigation properties
    public virtual Department Department { get; set; }
}
29. Code First / Code Second - Conventions
• Connection String Convention
• DbContext uses the namespace qualified name of your derived context class as the database
name and creates a connection string for this database using either SQL Express or LocalDb.
• Removing Conventions
• You can tell EF not to use some conventions like this:
modelBuilder.Conventions.Remove<PluralizingTableNameConvention>();
• Pluggable Conventions (EF6)
• Build your own ones!
30. Code First / Code Second - Annotations
If your classes do not follow the EF conventions, you can use attributes called
DataAnnotations.
One of the coolest things about DataAnnotations is that they’re shared with other
frameworks like ASP.NET MVC!
If you set a property as required:
[Required]
public string Title { get; set; }
The DB column will be set as “not null” and the MVC application will perform
client-side validation, even dynamically building a message using the property and
annotation names.
31. Code First / Code Second - Annotations
Here’s the list of the most common DataAnnotations:
• Key: Primary key
• Required: Not null
• MaxLength and MinLength: Obvious ;-)
• NotMapped: No need to be stored
• ComplexType: Entity without primary key
• ConcurrencyCheck: If someone has modified the data in the meantime, it will
fail and throw a DbUpdateConcurrencyException
• TimeStamp: Concurrency based on Timestamp
• Table and Column: change the name of the tables and columns
• DatabaseGenerated: computed properties
• InverseProperty and ForeignKey: Relationship attributes
For relationships, go there!
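Putting a few of these together, a sketch of an annotated entity (the names are invented for illustration; note that in EF 4.1 the Table attribute lived in System.ComponentModel.DataAnnotations, while later EF versions moved it to the .Schema sub-namespace):

```csharp
using System.ComponentModel.DataAnnotations;

[Table("tbl_Posts")]             // map to a differently named table
public class AnnotatedPost
{
    [Key]
    public int PostId { get; set; }

    [Required, MaxLength(200)]   // NOT NULL + nvarchar(200) in the DB
    public string Title { get; set; }

    [ConcurrencyCheck]           // included in the WHERE clause of UPDATEs
    public string Content { get; set; }

    [NotMapped]                  // computed at runtime, never stored
    public string Summary
    {
        get { return Title + "…"; }
    }
}
```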
32. Code First / Code Second – Fluent API
We saw that Code First allows you to work with Conventions (over configuration),
but this approach will obviously only fit small projects or POCs.
DataAnnotations push the mapping capabilities a step forward, but keep in mind
that they have limitations as well; here’s a non-exhaustive list of what they cannot do:
• The precision of a DateTime property
• The precision and scale of numeric properties
• A String or Binary property as fixed-length
• A String property as non-unicode
• The on-delete behavior of relationships
• Advanced mapping strategies
Here comes the Fluent API! (here & here)
The DataAnnotations only cover a subset of the fluent API functionality!
33. Code First / Code Second – Fluent API
The code first fluent API is most commonly accessed by overriding the
OnModelCreating method on your derived DbContext.
34. Code First / Code Second – Fluent API
Each builder can define its mappings
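A sketch of both styles, reusing the Blog/Post classes from the Code First example: the mapping can live inline in OnModelCreating, or in one EntityTypeConfiguration class per entity:

```csharp
using System.Data.Entity;
using System.Data.Entity.ModelConfiguration;

// One builder class per entity keeps EF mapping out of the business model
public class PostConfiguration : EntityTypeConfiguration<Post>
{
    public PostConfiguration()
    {
        HasKey(p => p.PostId);
        Property(p => p.Title).IsRequired().HasMaxLength(200);
        HasRequired(p => p.Blog)             // relationship mapping
            .WithMany(b => b.Posts)
            .HasForeignKey(p => p.BlogId)
            .WillCascadeOnDelete(false);     // something DataAnnotations cannot express
    }
}

public class MappedBloggingContext : DbContext
{
    public DbSet<Blog> Blogs { get; set; }
    public DbSet<Post> Posts { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Register the external configuration...
        modelBuilder.Configurations.Add(new PostConfiguration());
        // ...or configure inline:
        modelBuilder.Entity<Blog>().Property(b => b.Name).HasMaxLength(100);
    }
}
```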
35. Code First / Code Second – Fluent API
Doing it that way, you can easily separate your mapping files from your model!
The code becomes clearer and no reference is made to any EF libraries while you’re
designing your business model, which is not possible with DataAnnotations.
There are so many options that the Fluent API offers that it would take hours to
describe them.
Please go and read those 2 articles to go further in details with the Fluent API
• Configuring/Mapping Properties and Types with the Fluent API
• Configuring Relationships with the Fluent API
36. Code First – Migrations
A word on Code First Migrations.
Entity Framework Code First Migrations enable changes to your model to be
propagated to your database through code. It’s based on Active Record migrations
(the primary data access technology used by the Rails framework).
It’s an easy way to generate the scripts needed to go from one version of your DB
model to the next, or to downgrade a schema version as well.
I won’t go any further, but as it’s a really nice feature, go and see more details on
MSDN!
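For reference, the usual Package Manager Console workflow looks like this (the migration name is only an example):

```powershell
Enable-Migrations        # scaffolds a Migrations folder with a Configuration class
Add-Migration AddRating  # generates an Up()/Down() pair from the pending model changes
Update-Database          # applies pending migrations to the database
Update-Database -TargetMigration AddRating  # move the schema to a specific version (up or down)
```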
37. Eager / Lazy & Explicit Loading
Entity Framework allows you to fetch data and load related entities in many
ways.
3 of them are:
• Eager Loading
• Lazy Loading
• Explicit Loading
Once again, all of them have their pros & cons and should be used with care!
What is the best choice: multiple requests against the database, or a single request
that may contain a large payload? It may be appropriate to use eager loading in
some parts of your application and lazy loading in other parts.
We’ll try to see that…
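The three options side by side, sketched against the Blog/Post model from earlier (lazy loading relies on the virtual navigation properties; Include here uses the string overload, and `using System.Linq;` is assumed):

```csharp
using (var db = new BloggingContext())
{
    // Eager loading: one query, Posts fetched together with Blogs
    var eager = db.Blogs.Include("Posts").ToList();

    // Lazy loading: Posts are loaded on first access, one extra query per blog
    var lazyBlog = db.Blogs.First();
    var postCount = lazyBlog.Posts.Count; // the extra query fires here

    // Explicit loading: you decide exactly when the related data is fetched
    var explicitBlog = db.Blogs.First();
    db.Entry(explicitBlog).Collection(b => b.Posts).Load();
}
```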
38. Eager / Lazy & Explicit Loading
Let’s have a look at the Lazy Loading versus Eager Loading cheat sheet from MSDN
Seems that it won’t be a good option for our sites to switch on Lazy loading!
39. Eager / Lazy & Explicit Loading
One good option seems to load exactly what we need.
But this can be done in several ways as well!
While including related entities in a query is powerful, it's important to understand
what's happening under the covers. Let’s look at how .Include() works…
As stated on MSDN:
It takes a relatively long time for a query with multiple Include statements in it to
go through our internal plan compiler to produce the store command. The majority
of this time is spent trying to optimize the resulting query. The generated store
command will contain an Outer Join or Union for each Include, depending on your
mapping. Queries like this will bring in large connected graphs from your database
in a single payload, which will exacerbate any bandwidth issues, especially when
there is a lot of redundancy in the payload (i.e. with multiple levels of Include to
traverse associations in the one-to-many direction).
Customers.Include(c => c.Orders) (click if you dare!)
40. Eager / Lazy & Explicit Loading
SELECT [Project1].[C1] AS [C1],
[Project1].[CustomerID] AS [CustomerID],
[Project1].[CompanyName] AS [CompanyName],
[Project1].[ContactName] AS [ContactName],
[Project1].[ContactTitle] AS [ContactTitle],
[Project1].[Address] AS [Address],
[Project1].[City] AS [City],
[Project1].[Region] AS [Region],
[Project1].[PostalCode] AS [PostalCode],
[Project1].[Country] AS [Country],
[Project1].[Phone] AS [Phone],
[Project1].[Fax] AS [Fax],
[Project1].[C2] AS [C2],
[Project1].[OrderID] AS [OrderID],
[Project1].[CustomerID1] AS [CustomerID1],
[Project1].[EmployeeID] AS [EmployeeID],
[Project1].[OrderDate] AS [OrderDate],
[Project1].[RequiredDate] AS [RequiredDate],
[Project1].[ShippedDate] AS [ShippedDate],
[Project1].[ShipVia] AS [ShipVia],
[Project1].[Freight] AS [Freight],
[Project1].[ShipName] AS [ShipName],
[Project1].[ShipAddress] AS [ShipAddress],
[Project1].[ShipCity] AS [ShipCity],
[Project1].[ShipRegion] AS [ShipRegion],
[Project1].[ShipPostalCode] AS [ShipPostalCode],
[Project1].[ShipCountry] AS [ShipCountry]
FROM ( SELECT
[Extent1].[CustomerID] AS [CustomerID],
[Extent1].[CompanyName] AS [CompanyName],
[Extent1].[ContactName] AS [ContactName],
[Extent1].[ContactTitle] AS [ContactTitle],
[Extent1].[Address] AS [Address],
[Extent1].[City] AS [City],
[Extent1].[Region] AS [Region],
[Extent1].[PostalCode] AS [PostalCode],
[Extent1].[Country] AS [Country],
[Extent1].[Phone] AS [Phone],
[Extent1].[Fax] AS [Fax],
1 AS [C1],
[Extent2].[OrderID] AS [OrderID],
[Extent2].[CustomerID] AS [CustomerID1],
[Extent2].[EmployeeID] AS [EmployeeID],
[Extent2].[OrderDate] AS [OrderDate],
[Extent2].[RequiredDate] AS [RequiredDate],
[Extent2].[ShippedDate] AS [ShippedDate],
[Extent2].[ShipVia] AS [ShipVia],
[Extent2].[Freight] AS [Freight],
[Extent2].[ShipName] AS [ShipName],
[Extent2].[ShipAddress] AS [ShipAddress],
[Extent2].[ShipCity] AS [ShipCity],
[Extent2].[ShipRegion] AS [ShipRegion],
[Extent2].[ShipPostalCode] AS [ShipPostalCode],
[Extent2].[ShipCountry] AS [ShipCountry],
CASE WHEN ([Extent2].[OrderID] IS NULL) THEN CAST(NULL AS int) ELSE 1 END AS [C2]
FROM [dbo].[Customers] AS [Extent1]
LEFT OUTER JOIN [dbo].[Orders] AS [Extent2] ON [Extent1].[CustomerID] = [Extent2].[CustomerID]
WHERE N'UK' = [Extent1].[Country]
) AS [Project1]
ORDER BY [Project1].[CustomerID] ASC, [Project1].[C2] ASC
41. Eager / Lazy & Explicit Loading
Well, well, well… what are the options then?
• Try to reduce the number of Include statements in your query to just bring in
the data you need
• Break your query into a smaller sequence of subqueries
Rather than this (a single query with many Includes), do this (a sequence of smaller queries):
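A sketch of that contrast, using a hypothetical Customer/Order model; the split version relies on EF's relationship fix-up to wire the loaded Orders onto their already-tracked Customers:

```csharp
// Rather than this: a single store command with a join per Include…
var heavy = context.Customers
    .Include(c => c.Orders)
    .Where(c => c.Country == "UK")
    .ToList();

// …do this: two smaller queries against the same context; once both
// result sets are tracked, the Customer.Orders collections are fixed up.
var customers = context.Customers
    .Where(c => c.Country == "UK")
    .ToList();
var orders = context.Orders
    .Where(o => o.Customer.Country == "UK")
    .ToList();
```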
42. Eager / Lazy & Explicit Loading
And what is Explicit Loading exactly?
Even with lazy loading disabled, it is still possible to lazily load related entities,
but it must be done with an explicit call. To do so, you use the Load method on the
related entity’s entry.
var blog = context.Blogs.Find(1);
// Load the posts related to a given blog
context.Entry(blog).Collection(p => p.Posts).Load();
The Query method provides access to the underlying query that the Entity
Framework will use when loading related entities.
// Load the posts with the 'entity-framework' tag related to a given blog
context.Entry(blog)
.Collection(b => b.Posts)
.Query()
.Where(p => p.Tags.Contains("entity-framework"))
.Load();
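Since Query() returns a plain IQueryable, it can also be combined with an aggregate so the work stays in the database without materializing the related entities (same blog/Posts variables as in the snippet above):

```csharp
// Count the posts of the blog in the database, without
// loading the Post entities into memory.
var postCount = context.Entry(blog)
    .Collection(b => b.Posts)
    .Query()
    .Count();
```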
43. Eager / Lazy & Explicit Loading
Wanna be the king of loading entities?
Go there!
44. Performance / Profiling
ORMs are often considered by DBAs as the evil!
One of their fears is the T-SQL generated by Entity Framework. In some of the
examples we have seen, we can only prove them right!
The T-SQL can be ugly and lead to really poor performance.
Here comes the profiling!
There are a lot of tools that allow you to do so. Please read this article from Julie
Lerman where she explains how to perform profiling (see the comments as well).
Once again, do not hesitate to talk to your DBA for any kind of advice!
45. Performance / Profiling
There are also a lot of best practices for solving performance issues.
Here’s a non-exhaustive list of what can be done:
• Cold vs. Warm Query Execution
• (mapping) View generation
• Moving your model to a separate assembly
• Caching (objects, results, query plans & metadata)
• Compiled & Auto-compiled queries
• NoTracking queries
• Inheritance strategies
• Upgrade to EF5 ;-)
• Lazy vs Eager
• …
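As one example from the list, here is what a no-tracking query looks like (AsNoTracking() is the DbContext form introduced in EF 4.1; the Blog model and Rating property are hypothetical):

```csharp
// Read-only query: entities skip the change tracker entirely,
// which saves memory and CPU when you never call SaveChanges().
var readOnlyBlogs = context.Blogs
    .AsNoTracking()
    .Where(b => b.Rating > 3)
    .ToList();
```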
Go here & here & here for many more details!
46. Appendix
ADO .Net Blog
Julie Lerman’s Blog, Rowan Miller’s Blog, Arthur Vickers’ Blog, Alex James’ Blog
T4 Templates and the Entity Framework
Effort - Entity Framework Unit Testing Tool
What’s Best for Unit Testing in EF? It depends, dude!
Creating an Entity Framework Data Model for an ASP.NET MVC Application (1 of 10)
Add/Attach and Entity States
Table-Valued Functions (TVFs)
Extending And Customizing Code First Models – Part 1 Of 2
Code First Insert/Update/Delete Stored Procedure Mapping (EF6)
Me and Entity Framework on StackOverflow
48. About Betclic
• Betclic Everest Group, one of the world leaders in online gaming, has a unique portfolio
comprising various complementary international brands: Betclic, Everest Gaming, bet-at-
home.com, Expekt…
• Active in 100 countries with more than 12 million customers worldwide, the Group is
committed to promoting secure and responsible gaming and is a member of several
international professional associations including the EGBA (European Gaming and Betting
Association) and the ESSA (European Sports Security Association).
• Through our brands, Betclic Everest Group places expertise, technological know-how and
security at the heart of our strategy to deliver an on-line gaming offer attuned to the passion
of our players.