The Data Source Layer patterns introduced in lecture L08 are structural patterns. They deal with moving data to and from the database and try to solve the impedance mismatch problem. However, they have no logic for when, and under what circumstances, data should be loaded from the database and when it should be written back. Accessing the database is slow, and doing so needlessly hurts performance.
In this lecture we look at the behavioural problem: how to write only changed records back to the database, how to load the needed data only once, and how to load the data model partially.
We introduce three new design patterns: Unit of Work, Identity Map and Lazy Load.
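The interplay of two of these patterns can be sketched in a few lines of Python. Everything below (the `Person` class, the dict standing in for a database, the method names) is an invented illustration of the ideas, not code from the textbook: the Identity Map ensures each record is loaded only once, and the Unit of Work collects changes so that only modified records are written back.

```python
class Person:
    def __init__(self, person_id, name):
        self.id = person_id
        self.name = name

class Session:
    """Tracks loaded objects (Identity Map) and changed ones (Unit of Work)."""
    def __init__(self, database):
        self._db = database          # plain dict standing in for a real DB
        self._identity_map = {}      # id -> already-loaded object
        self._dirty = set()          # ids with pending changes

    def find(self, person_id):
        # Identity Map: return the cached object instead of re-reading the DB
        if person_id in self._identity_map:
            return self._identity_map[person_id]
        row = self._db[person_id]
        obj = Person(person_id, row["name"])
        self._identity_map[person_id] = obj
        return obj

    def register_dirty(self, obj):
        # Unit of Work: remember what changed; write nothing yet
        self._dirty.add(obj.id)

    def commit(self):
        # Write only the changed objects back, in one place
        for person_id in self._dirty:
            obj = self._identity_map[person_id]
            self._db[person_id] = {"name": obj.name}
        self._dirty.clear()

db = {1: {"name": "Ada"}}
session = Session(db)
p = session.find(1)
assert session.find(1) is p      # same object: loaded once
p.name = "Ada Lovelace"
session.register_dirty(p)
session.commit()                 # only now is the change written
```

Note how the database dict is touched exactly once for reading and once at commit, which is the behavioural point these patterns make.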
Visit the course home page: http://www.olafurandri.com/?page_id=2573
In this lecture we look at the patterns in chapter 18 in the textbook (Patterns of Enterprise Application Architecture). The lecture is in two parts. First we go through each of the patterns and explain each.
Then in the second part we look at a problem we have to solve and try to get the patterns to show themselves at the time they are needed.
Now that we have looked at several design patterns, from the databases to web presentation, we are ready to look at the application as a whole. In this lecture we examine the considerations we face when creating an application architecture and we look at each of the three layers.
The lecture presents one way of designing enterprise applications. The goal is to create scalable services.
We also look at the Play framework in more detail and look at REST.
This document provides an overview of different data source patterns in software design including Table Data Gateway, Row Data Gateway, Active Record, Data Mapper, and Record Set. It discusses how these patterns work, when each one should be used, and examples of implementing them. It also covers some key objectives in connecting applications to data sources like relational databases and hiding SQL from the domain layer.
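As a rough illustration of the Active Record idea described above, where an object wraps a database row and carries its own insert/update logic, here is a minimal sketch using Python's standard-library sqlite3 module. The table name, columns, and class are invented for the example and are not from the document being summarized.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT)")

class PersonRecord:
    """Active Record style: the object knows how to persist itself."""
    def __init__(self, name, person_id=None):
        self.id = person_id
        self.name = name

    def insert(self):
        cur = conn.execute("INSERT INTO person (name) VALUES (?)", (self.name,))
        self.id = cur.lastrowid      # pick up the generated primary key

    def update(self):
        conn.execute("UPDATE person SET name = ? WHERE id = ?",
                     (self.name, self.id))

    @staticmethod
    def find(person_id):
        row = conn.execute("SELECT id, name FROM person WHERE id = ?",
                           (person_id,)).fetchone()
        return PersonRecord(row[1], person_id=row[0]) if row else None

p = PersonRecord("Ada")
p.insert()
p.name = "Ada Lovelace"
p.update()
```

A Data Mapper would instead move the `insert`/`update`/`find` methods out into a separate mapper class, leaving `PersonRecord` free of any SQL.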
The document summarizes new features in .NET 3.5 SP1, including enhancements to ADO.NET Entity Framework, ADO.NET Data Services, ASP.NET routing, and ASP.NET dynamic data. It provides an overview and demonstrations of each technology. Key points covered include using Entity Framework to bridge the gap between object-oriented and relational models, consuming entity data models via LINQ queries or object services, and using data services to expose data over HTTP in a RESTful manner.
This presentation explains the basics of the ETL (Extract-Transform-Load) concept in relation to data solutions such as data warehousing, data migration, and data integration. CloverETL is presented in detail as an example of an enterprise ETL tool. The presentation also covers typical phases of data integration projects.
This document provides an overview of Entity Framework Code First, including its basic workflow, database initialization strategies, configuring domain classes using data annotations and fluent API, modeling relationships like one-to-one, one-to-many and many-to-many, and performing migrations using automated and code-based approaches. Code First allows writing classes first and generating the database, starting from EF 4.1, and supports domain-driven design principles.
Older (2008) presentation I gave internally at SunGard to educate developers on C# and LINQ. LINQ still rocks, and the concepts I cover are important language features that C# developers should be asked about in interviews even today.
Excerpt from the CloverETL Basic Training slides.
The basic course lasts 3 days and covers basic principles, CloverETL Designer walkthrough, transaction analysis, lookups, database connections, working with structured data, XML etc.
More at www.cloveretl.com/services/training
Introduction to SQL Alchemy - SyPy June 2013, by Roger Barnes
A very brief introduction to SQLAlchemy, covering the core, ORM, database concepts and a high-level comparison to the Django ORM.
IPython notebook demo content is at http://nbviewer.ipython.org/urls/dl.dropboxusercontent.com/s/90s3abt64vxd4r4/SQLAlchemy.ipynb?token_hash=AAEWmGa8Kng0qijeH29NnPtjblOCTe387vRUxLDOpbyCKg&dl=1
The document discusses different strategies for building a data warehouse - an enterprise-wide strategy that builds a comprehensive warehouse initially versus a data mart strategy that begins with a single mart and adds more over time. It also covers key aspects of building a data warehouse like extracting, transforming, and loading data from various sources, dealing with data quality issues, and the role of metadata.
The document provides an overview of Oracle for beginners, including the different editions of Oracle database, data types in Oracle such as character, numeric, date, and LOB data types. It also discusses how to create and alter Oracle tables, including adding, modifying and dropping columns, as well as renaming tables and columns. Primary keys in Oracle tables are also covered at a high level.
The DataWeave Language is a powerful template engine that allows you to transform data to and from any kind of format (XML, CSV, JSON, POJOs, Maps, etc.).
DataWeave is a new language for querying and transforming data that contains a data access layer enabling large payloads and random access without costly conversions. An example transforms a JSON file to XML using the DataWeave component in MuleSoft, which has input, DataWeave code, and output sections. The DataWeave code defines the mappings and output format, and changing the output type transforms the data to CSV or Java objects.
This document introduces Oracle9i and relational database concepts. It discusses Oracle9i features like scalability and reliability. It also explains that a relational database consists of tables related through primary and foreign keys that can be accessed using SQL. The Oracle database server allows storage and querying of data across these tables.
This document provides an overview of basic relational database management system (RDBMS) concepts. It defines key terms like tables, records, fields and relationships. It also describes the relational model, ER diagrams and SQL. Common RDBMS like MySQL, SQL Server and Oracle are introduced. Basic SQL operators for queries are shown along with examples. The document serves as an introduction to fundamental RDBMS concepts.
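The basic SQL query operators mentioned above (comparison in `WHERE`, sorting with `ORDER BY`, pattern matching with `LIKE`) can be tried out directly with Python's standard-library sqlite3 module. The table and sample rows below are invented for the demonstration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE student (id INTEGER, name TEXT, marks INTEGER)")
conn.executemany("INSERT INTO student VALUES (?, ?, ?)",
                 [(1, "Asha", 82), (2, "Ravi", 67), (3, "Meena", 91)])

# Comparison operator in WHERE, combined with ORDER BY
top = conn.execute(
    "SELECT name FROM student WHERE marks >= 80 ORDER BY marks DESC"
).fetchall()

# Pattern matching with LIKE: names starting with 'M'
m_names = conn.execute(
    "SELECT name FROM student WHERE name LIKE 'M%'"
).fetchall()
```

The same statements run unchanged (apart from connection setup and placeholder syntax) against MySQL, SQL Server, or Oracle, which is the portability point the RDBMS overview makes.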
The Road to U-SQL: Experiences in Language Design (SQL Konferenz 2017 Keynote), by Michael Rys
APL was an early language with high-dimensional arrays and nested data models. Pascal and C/C++ introduced procedural programming with structured control flow. Other influences included Lisp for functional programming and Prolog for logic programming. SQL introduced declarative expressions with procedural control flow for data processing. Modern languages combine aspects of declarative querying, imperative programming, and support for both structured and unstructured data models. Key considerations in language design include support for parallelism, distribution, extensibility, and optimization.
This document summarizes new features in SQL Server 2008 for .NET developers, including spatial data support, BLOB storage using Filestream, enhancements to T-SQL, new date/time types, improved integration with Visual Studio, and business intelligence tools like Analysis Services, Integration Services, and Reporting Services.
Software development practices like procedural coding are like training wheels: they help when we start development but are detrimental later. This presentation lists a few such practices and their alternatives.
Functional Programming With Lambdas and Streams in JDK8, by IndicThreads
The significant new language feature in Java SE 8 is the introduction of Lambda expressions, a way of defining and using anonymous functions. On its own this provides a great way to simplify situations where we would typically use an inner class today. However, Java SE 8 also introduces a range of new classes in the standard libraries that are designed specifically to take advantage of Lambdas. These are primarily included in two new packages: java.util.stream and java.util.function.
After a brief discussion of the syntax and use of Lambda expressions this session will focus on how to use Streams to greatly simplify the way bulk and aggregate operations are handled in Java. We will look at examples of how a more functional approach can be taken in Java using sources, intermediate operations and terminators. We will also discuss how this can lead to improvements in performance for many operations through the lazy evaluation of Streams and how code can easily be made parallel by changing the way the Stream is created.
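The source / intermediate-operation / terminator structure and the lazy evaluation described here have a close analogue in Python generator pipelines. This is an illustrative analogy in Python, not the Java Streams API itself:

```python
# A generator pipeline: each stage is lazy, like a Stream intermediate operation
numbers = range(1, 1_000_000)          # source
squares = (n * n for n in numbers)     # intermediate: nothing computed yet
big = (s for s in squares if s > 10)   # another lazy intermediate stage

# The "terminal operation" finally pulls values through the pipeline;
# only as many elements as needed are ever computed, even though the
# source nominally has a million elements.
first_big = next(big)
```

As with Streams, laziness means the pipeline stops as soon as the terminal operation has what it needs: here only 1, 2, 3, and 4 are ever squared.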
Session at the IndicThreads.com Conference held in Pune, India on 27-28 Feb 2015
http://www.indicthreads.com
http://pune15.indicthreads.com
Interactive Questions and Answers - London Information Retrieval Meetup, by Sease
Answers to some questions about Natural Language Search, Language Modelling (Google BERT, OpenAI GPT-3), Neural Search, and Learning to Rank raised during our London Information Retrieval Meetup (December).
ODBC (Open Database Connectivity) was Microsoft's first database access technology. It provided a C interface that allowed applications to access data from different database management systems (DBMS) using a standardized call level interface. While widely adopted, it had some drawbacks including requiring a C interface and putting a burden on drivers to emulate a relational database for non-relational data sources.
Hand Coding ETL Scenarios and Challenges, by Mark Madsen
An overview of some of the scenarios that lead one to hand-coding over tools, a description of the challenges faced, and some practices to deal with the problems.
Architectural Anti Patterns - Notes on Data Distribution and Handling Failures, by Gleicon Moraes
The document discusses several architectural anti-patterns related to data distribution and handling failures when using relational database management systems (RDBMS). It describes anti-patterns such as using tables as caches, queues, log files, or for dynamic schema creation. It also discusses abusing RDBMS features like stored procedures and triggers for application logic as well as using tables for distributed locking. The document is presented as a slide deck covering these anti-patterns to avoid when designing distributed systems that use RDBMS for data storage and access.
This is the RFC for AvocadoDB's query language. AvocadoDB is an open source NoSQL database (see www.avocadodb.org) offering a mixture of data models such as key-value pairs, documents, and graphs.
The REST API for AvocadoDB is already available and stable, and people are writing APIs using it. Awesome. But as AvocadoDB offers more complex data structures like graphs and lists, REST is not enough. We implemented a first version of a query language some time ago which is very similar to SQL and UNQL.
Then we realized that this approach was not completely satisfying, as some queries cannot be expressed very well with it, especially those over multi-valued attributes/lists. UNQL addresses this partly, but does not go far enough. Another issue is graphs: AvocadoDB supports querying graphs, but neither SQL nor UNQL offers any "natural" graph traversal facilities.
As we did not find any existing query language that addresses the problems we identified, we had to define a new one, which is presented in the presentation.
Have some feedback on this? Come to www.avocadodb.org and tell us what you think about it. :-)
A Framework for Verifying UML Behavioral Models (CAiSE Doctoral Consortium 2009), by Elena Planas
The document presents a framework for verifying UML behavioral models. The framework aims to verify correctness properties like syntactic correctness, executability, completeness, and redundancy. It uses static analysis techniques and provides corrective feedback to designers. The framework takes various UML diagrams as input and detects issues through properties defined for actions. The goal is to help designers verify behavioral specifications without simulation.
The USC Creative Media & Behavioral Health Center conducts research at the intersection of behavioral science, medicine, and public health. It develops transmedia stories and games using emerging technologies to create innovative assessment and treatment techniques for various health issues. Some of its major research initiatives investigate using entertainment and technology to improve outcomes for obesity, brain development, and motor skills. It has experience in areas like nutrition, exercise, anxiety, depression, and more. The center works across the lifespan from childhood to older age.
This document provides guidance for dealing with special behavioral problems in pupils including disrespect, avoidance of work, fighting, disobedience, and bullying. It recommends maintaining a calm and respectful manner, giving choices with consequences for avoidance, conducting investigations for fighting and reporting to administrators, talking privately and avoiding arguments for disobedience, and evaluating and reporting bullying while enhancing lessons on friendship.
This document discusses various behavioral problems in children, including habit problems, eating problems, personality problems, anti-social problems, sleep problems, speech problems, and scholastic problems. Specific problems covered in more depth include thumb sucking, breath holding spells, pica, and infantile colic. For each problem, the document discusses definition, etiology, clinical features, management, and related topics. The goal is to provide information on identifying and addressing common behavioral issues in pediatric patients.
The document outlines the research process and design chapters of a study, including introducing the theoretical framework and hypotheses, as well as describing different types of research designs and how to develop an appropriate design. It also provides details on formulating the research problem, conducting preliminary data collection and literature review, and the proper formats for citing references and sources in the study.
Hypotheses and their types
Theoretical framework vs. Conceptual Framework
Scope and Limitations
Limitations vs. Delimitations
Kinds of Variables
Assumptions
Definition of Terms
Research Methodology & Thesis Topic Proposals, by setaurisani
The document outlines Elizabeth Taurisani's directed research methodology and potential thesis topics under her professor Tom Klinkowstein. It describes a 10-step research methodology process including identifying resources, collecting research, identifying patterns, developing hypotheses, and finalizing conclusions. It then presents three potential thesis topics: the effects of ethical advertising on child consumers, motivations for baby boomers participating in gaming, and how multi-platform storytelling can impact documentary experiences among digital natives.
Research methodology for behavioral research, by rip1971
The document provides an overview of research methodology for behavioral research. It aims to introduce research methodology and multivariate data analysis to new Ph.D. students. Topics covered include conceptualization, measurement, research design, multivariate analysis, and structural equation modeling. The goal is to provide hands-on experience with techniques like LISREL for analyzing behavioral research questions.
Spark for Behavioral Analytics Research: Spark Summit East talk by John Wu, posted by Spark Summit
This presentation reports our experience using machine learning techniques in the Apache Spark ecosystem to understand user behavior in a number of applications. In this context, Spark makes the vast computing power of a large high-performance computing system available to behavioral economists without requiring the application scientists to learn about parallel computing. To illustrate the effectiveness of this approach, we focus on the compute-intensive task of establishing a baseline for studying the impact of policies on consumer behavior. The gold standard for this type of baseline is a randomized control group; however, such a group can only provide a group-level reference, not one for individual consumers. In many cases, self-selection bias along with other factors can make it extremely difficult to generate an unbiased control group. By harnessing the computing power of Spark, we are able to learn the behavior pattern of each individual user and therefore create a much more precise baseline for behavioral analysis. We will use two use cases to illustrate the approach: a residential electricity usage study and a traffic pattern prediction study.
This document defines key terms related to theoretical and conceptual frameworks, including concepts, constructs, variables, conceptual framework, and theoretical framework. It explains that a conceptual framework consists of concepts and proposed relationships between concepts, while a theoretical framework is based on existing theories. The purposes of conceptual and theoretical frameworks are to clarify concepts, propose relationships between concepts, provide context for interpreting findings, and stimulate further research and theory development.
Design patterns are methods for solving common problems. But using them is a matter of tackling the tasks that need to be solved: first come the tasks, and then we find which pattern fits given the premises we have. Often when working on solutions it is simply not possible to build all the software at once; instead, some parts have to be "hacked" together while work proceeds on others. To avoid ending up in technical debt, continuous refactoring is needed.
In this lecture we look at a problem and review the base patterns we were introduced to in L05 Design Patterns. We then look at how to solve a connection to a mail server.
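One way to picture the mail-server example is a small Gateway, one of the base patterns, that hides the mail protocol behind a plain interface. Everything below (the class names, the fake transport standing in for a real SMTP connection) is an invented sketch, not the lecture's actual code:

```python
class FakeTransport:
    """Stands in for a real SMTP connection so the sketch is self-contained."""
    def __init__(self):
        self.sent = []

    def send(self, sender, recipient, body):
        self.sent.append((sender, recipient, body))

class MailGateway:
    """Gateway pattern: domain code talks to this, never to SMTP directly."""
    def __init__(self, transport):
        self._transport = transport

    def send_welcome(self, address):
        self._transport.send("noreply@example.com", address, "Welcome!")

transport = FakeTransport()
gateway = MailGateway(transport)
gateway.send_welcome("ada@example.com")
```

Because the transport is injected, the real mail server can be swapped in later without touching the domain code, and tests can run against the fake. This is also what lets one part of the system be "hacked" temporarily while other parts are built properly.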
02._Object-Oriented_Programming_Concepts.ppt, by Yonas D. Ebren
This document discusses object-oriented programming concepts and provides an example of analyzing a software development process using object-oriented principles. It describes a typical 5-step software development process of analysis, design, implementation, testing, and maintenance. It then introduces object-oriented programming concepts like modeling a problem as a set of collaborating objects and components. As an example, it analyzes the classic game Tetris in terms of its objects like pieces and boards, their properties and capabilities.
Core & advanced Java classes in Mumbai
Best core & advanced Java classes in Mumbai with job assistance.
Our features are:
Expert guidance by IT industry professionals
Lowest fees of 5000
Practical exposure to handle projects
Well-equipped lab
Resume writing guidance after the course
1. The document introduces data structures and discusses primitive data structures like integers, booleans, and characters. It also discusses linear data structures like arrays, stacks, queues, and linked lists, as well as non-linear structures like trees and graphs.
2. Stacks are described as ordered collections that follow the LIFO principle. Basic stack operations like push, pop, full, and empty are explained along with algorithms to implement push and pop.
3. Applications of stacks include converting infix expressions to postfix and prefix notation, solving mazes using a scratch pad stack, and text editing operations like delete that utilize stacks.
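The array-based push and pop algorithms described above can be sketched as follows (a minimal Python illustration of the LIFO behaviour; the slides present the same idea as pseudocode over an array):

```python
class ArrayStack:
    """Fixed-capacity stack, mirroring the array-based push/pop algorithms."""
    def __init__(self, capacity=10):
        self.items = [None] * capacity
        self.top = -1                 # index of the current top element
        self.capacity = capacity

    def is_full(self):
        return self.top == self.capacity - 1

    def is_empty(self):
        return self.top == -1

    def push(self, value):
        if self.is_full():
            raise OverflowError("stack overflow")
        self.top += 1
        self.items[self.top] = value

    def pop(self):
        if self.is_empty():
            raise IndexError("stack underflow")
        value = self.items[self.top]
        self.top -= 1
        return value

s = ArrayStack()
s.push(1); s.push(2); s.push(3)
print(s.pop(), s.pop())  # LIFO order: 3 2
```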
The document discusses stacks and queues as linear data structures. It defines a stack as a last-in first-out (LIFO) structure where elements are inserted and deleted at one end. Stacks are commonly used to handle function calls and parameters. The document also defines queues as first-in first-out (FIFO) structures where elements are inserted at the rear and deleted from the front. Examples of stack and queue applications are given. Implementation of stacks using arrays and pointers is described, along with push and pop algorithms.
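The queue behaviour described above (insert at the rear, delete from the front) can be sketched with a circular array (a minimal Python illustration, not the document's own code):

```python
class ArrayQueue:
    """FIFO queue over a circular array: enqueue at the rear, dequeue from the front."""
    def __init__(self, capacity=8):
        self.items = [None] * capacity
        self.front = 0                # index of the element to dequeue next
        self.size = 0
        self.capacity = capacity

    def enqueue(self, value):
        if self.size == self.capacity:
            raise OverflowError("queue full")
        rear = (self.front + self.size) % self.capacity  # wrap around the array
        self.items[rear] = value
        self.size += 1

    def dequeue(self):
        if self.size == 0:
            raise IndexError("queue empty")
        value = self.items[self.front]
        self.front = (self.front + 1) % self.capacity
        self.size -= 1
        return value

q = ArrayQueue()
q.enqueue("a"); q.enqueue("b")
print(q.dequeue(), q.dequeue())  # FIFO order: a b
```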
This document provides an overview of parallel programming concepts like parallelism, threads, and concurrency. It discusses the importance of parallel programming given increasing numbers of processor cores. Key concepts covered include parallelism versus multi-processing, tasks and threads, the Java thread classes and methods, threading in Swing applications, and the new Java ForkJoin framework for parallel divide-and-conquer tasks. Examples are provided of using threads, Runnables, and SwingWorkers in Java programs.
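The lecture's examples use Java's Thread and Runnable classes; the same task/thread structure can be sketched in Python (note that CPython threads do not run CPU-bound work in parallel because of the GIL, so this illustrates the pattern rather than a speedup):

```python
import threading

def partial_sum(lo, hi, results, index):
    """Worker task (Runnable-style): sum a slice of the range and record the result."""
    results[index] = sum(range(lo, hi))

def parallel_sum(n, workers=4):
    """Split [0, n) into chunks, run one thread per chunk, then combine."""
    results = [0] * workers
    chunk = n // workers
    threads = []
    for i in range(workers):
        lo = i * chunk
        hi = n if i == workers - 1 else (i + 1) * chunk  # last chunk takes the remainder
        t = threading.Thread(target=partial_sum, args=(lo, hi, results, i))
        threads.append(t)
        t.start()
    for t in threads:
        t.join()               # wait for every worker, like Thread.join() in Java
    return sum(results)
```

The divide-combine shape here is also the essence of the ForkJoin framework mentioned in the summary: split the work, run the parts, join the results.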
This document provides an introduction to object-oriented programming concepts in Java, including data abstraction and encapsulation, inheritance, and polymorphism. It discusses how objects combine data and operations, and how data abstraction allows using an object's interface without knowing implementation details. Instance variables store an object's data, and constructors, accessors, and mutators are used to initialize, read, and modify this data. The document also covers class vs instance methods, wrappers that allow primitive types to be used like objects, and using files for input/output in Java.
This document provides an overview of object-oriented programming and Java. It defines object-oriented programming as organizing programs around objects and their interfaces rather than functions. The key concepts of OOP discussed include classes, objects, encapsulation, inheritance, polymorphism, and abstraction. It also provides details on the history and characteristics of Java, the most popular language for OOP. The document is serving as course material for a programming paradigms class focusing on OOP using Java.
This document discusses object-oriented concepts in software development. It describes the four main types of object-oriented paradigms used in the software lifecycle: object-oriented analysis, design, programming, and testing. It then explains some benefits of the object-oriented approach like modularity, reusability, and mapping to real-world entities. Key concepts like inheritance, encapsulation, and polymorphism are defined. The document also provides examples of how classes and objects are represented and compares procedural with object-oriented programming.
11.11.2020 - Unit 5-3 ACTIVITY, MENU AND SQLITE DATABASE.pptx - MugiiiReee
This document provides information about activities, menus, intents, services, broadcast receivers and SQLite database in Android. It discusses the activity lifecycle and different types of activities. It explains the concept of intents and how they are used to start activities, services and broadcast receivers. It covers the different types of menus like option menu, context menu and popup menu. It discusses services, their types and lifecycle. It provides details about broadcast receivers, how they receive and respond to broadcast messages. It also gives an overview of SQLite database and how it is used in Android applications for data storage.
Programming data access is probably one of the most common tasks when building enterprise solutions. Somehow we must store state and data, and relational databases are probably the most common form of storage. The drawback is that object-oriented programming does not map particularly well onto relational tables.
In this lecture we go over the problems that arise when designing the data layer, and how best to bridge the gap between classes in a program and tables in a database.
This document contains questions and answers related to Business Objects (BO) concepts. It discusses detail objects, the BOMain.Key file, the BO repository, domains in a basic setup, when the repository is created, having multiple domains, restricting row access, categories, universes, objects, object qualification, database size, loops and how to overcome them, joins, linked universes, alerts, filters, breaks, conditions, the difference between master-detail and breaks, metrics, sets, the use of Analysis for Decision Maker (AFD), the source for metrics, why metrics and sets are needed, issues in migration processes, the use of BO SDK, improving performance, analysis in BO, integrity checks, universe parameters
This document discusses programming concepts, data management, and technology applications. It covers topics such as programming logic, object-oriented programming, data types, file structures, databases, and database models. Specifically, it defines key terms related to programming semantics, pseudocode, sequential logic, looping logic, decision logic, and event-driven programming. It also defines terms related to objects, classes, methods, and data structures like arrays, stacks, queues, and trees.
In this educational video, we provide an introduction to data structures. You will learn what data structures are, including queues, stacks, trees, and binary search. We explore different types of data structures and dive specifically into priority queues. By watching this presentation, you will gain knowledge and have the opportunity to improve your command of data structures. Don't miss this chance to learn something new and expand your understanding.
This document introduces object-oriented programming concepts including data abstraction and encapsulation, inheritance, and polymorphism. It discusses how objects combine data and operations, and how data abstraction allows using an object without knowing implementation details. Instance variables store an object's data, and constructors, accessors, and mutators are used to initialize, read, and modify instance variables respectively. The document provides examples using a Polygon class to illustrate these concepts.
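The constructor/accessor/mutator pattern described above can be sketched as follows (a minimal Python illustration; the slides use a Polygon example, but the exact fields and method names here are assumptions):

```python
class Polygon:
    """Encapsulates a polygon's data behind constructor, accessor and mutator methods."""
    def __init__(self, num_sides, side_length):
        # Instance variables store the object's data; the leading underscore
        # signals that callers should go through the methods below.
        self._num_sides = num_sides
        self._side_length = side_length

    def get_num_sides(self):            # accessor: read without exposing internals
        return self._num_sides

    def set_side_length(self, length):  # mutator: modify with validation
        if length <= 0:
            raise ValueError("side length must be positive")
        self._side_length = length

    def perimeter(self):
        """Derived value computed from the encapsulated data."""
        return self._num_sides * self._side_length
```

Callers use only the interface (`perimeter`, `set_side_length`) and never touch the stored fields directly, which is the data-abstraction point the summary makes.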
William Jones and others presented on bringing information together across devices and applications. They proposed modeling information structure using itemMirror objects that could be accessed by different applications. This would allow information to remain where it is while being used across platforms. A spring project was proposed for students to build HTML5 apps that work with the same information through itemMirror objects. The goal is to separate information from applications and stores to avoid lock-in and allow mixing and matching of tools.
This document discusses key concepts and components in the Doctrine ORM, including:
- The EntityManager acts as the central access point and manages entity persistence. It allows persisting, flushing, merging, detaching, removing, and refreshing entities.
- The UnitOfWork maintains a list of entities affected by a transaction and coordinates writing changes to the database. It tracks entity states and maintains an identity map.
- Persisters handle writing entity and collection data while events allow reacting to state changes at specific points in the persistence process.
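Doctrine is a PHP ORM, so its real API is not shown here; the UnitOfWork/identity-map behaviour described above can still be sketched language-neutrally (this Python sketch borrows the names `find` and `flush` as an analogy, not as Doctrine code):

```python
class UnitOfWork:
    """Conceptual sketch: load each entity once (identity map) and
    detect which entities changed during the transaction (unit of work)."""
    def __init__(self, loader):
        self._loader = loader          # function (cls, id) -> fresh entity dict
        self._identity_map = {}        # (cls, id) -> the one in-memory entity
        self._clean_copies = {}        # snapshots used to detect dirty entities

    def find(self, cls, id):
        key = (cls, id)
        if key not in self._identity_map:        # hit the database only once per id
            entity = self._loader(cls, id)
            self._identity_map[key] = entity
            self._clean_copies[key] = dict(entity)
        return self._identity_map[key]

    def flush(self):
        """Return keys of entities whose state changed (those an ORM would UPDATE)."""
        dirty = [k for k, e in self._identity_map.items()
                 if e != self._clean_copies[k]]
        for k in dirty:                           # after writing, entities are clean again
            self._clean_copies[k] = dict(self._identity_map[k])
        return dirty
```

Two properties fall out of this structure: repeated `find` calls return the same object, and `flush` writes only what actually changed.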
A lecture given for Félag tölvunarfræðinga and Verkfræðingafélagið on 18 May 2022.
Innovation is a precondition for technological progress, which in turn drives development. Innovation usually starts small and needs many iterations to work. Entrepreneurs creating something new must wrestle not only with the technology and its limitations, but also with the opinions of contemporaries who do not always see the point of a new technology. In this lecture Ólafur Andri looks at innovation and the progress that has been made, and at where today's technological advances may lead us in the coming years.
Ólafur Andri Ragnarsson is an adjunct at Reykjavik University, where he teaches courses on technological development and how technological change affects companies. He holds an MSc in computer science from Oregon University in the United States. Ólafur Andri is an entrepreneur who co-founded Margmiðlun and later Betware, and he also took part in founding the games company Raw Fury AB in Stockholm.
A lecture given for the technology interest group of Stjórnvísi on 13 October 2020.
In recent decades we have seen enormous advances in technology and innovation worldwide. These advances have brought increased prosperity to all of humanity. Despite a global pandemic, progress is not slowing down; it will only accelerate in the coming years. Artificial intelligence, robots, virtual reality, the Internet of Things and much more are creating new solutions and new opportunities. The future is shrouded in mystery and can be both exciting and frightening at once. The only thing we know for certain is that the future will always be better. In this lecture Ólafur Andri Ragnarsson, a teacher at Reykjavik University, discusses the latest technology and the future.
Technology is one of the factors of change. When new disruptive technology is introduced, it can change industries. We have many examples of that, and we will start this journey with one of the most important innovations of our lifetimes: the smartphone. We will explore the impact of the smartphone and the fate of the companies that existed when the iPhone, the first smartphone as we know them, was introduced to the world.
We will also look at other examples from history. Then we look at the broader picture, past industrial revolutions and the one that we are experiencing now, the fourth industrial revolution. Specifically we look briefly at the technologies that fuel this revolution, for example artificial intelligence, robotics, drones, internet of things and more.
This document summarizes a lecture on robotics and drones. It discusses the history of robots dating back to ancient times. It also covers modern industrial robots, robotic developments in the 21st century including robots that can see, hear and sense. The document outlines Isaac Asimov's three laws of robotics. It discusses self-driving cars and their levels of automation. Finally, it covers unmanned aerial vehicles including military drones and delivery drones, and concludes that the robot revolution has only just begun.
The normal way of interacting with computers is with a keyboard and a mouse; for display, a rather small rectangular screen is used with 2D windowing systems. The mouse was invented more than 40 years ago and has been the dominant input device for 20 years. Now we are seeing new types of input devices. Multi-touch adds new dimensions and new applications. Natural user interfaces, or gesture interfaces, let people point and drag objects. Computers are also beginning to recognize people's facial expressions, so they know if you are smiling. Voice and natural language understanding are reaching a usable stage. All this calls for new types of applications.
Displays are getting bigger. What if any surface were a screen? What if you could spray a wall with screen material, or have your phone project images onto the wall?
This lecture explores some of these new types of interaction with computers and software. It makes the old mouse look dated.
Local is the "Lo" in SoLoMo, the buzzword. Local is not only about location; it is also about your digital track record. Over 70% of Netflix users watch the films recommended to them. Mining data to understand people's behaviour is becoming a huge and valuable business. Advertisers see opportunities in getting directly to their target groups. Predictive intelligence is also about where you will be at some time in the future, and where somebody you know will be.
It turns out that Facebook and Google know you better than you think you know yourself. The world is about to get really scary.
Over two billion people have signed up for Facebook, the most used site on the Internet. People are not watching TV so much anymore; they are using Facebook, YouTube, Netflix and a number of other popular websites.
Some people devote their time to working for others online. What drives people to write an article on Wikipedia? They don't get paid. Companies are enlisting people to help with innovation, and sites such as Galaxy Zoo ask people to help identify images. And why do people film themselves singing, when they cannot sing, and post the video on YouTube?
In this lecture we talk about how people are using the web to interact in new ways and to get things done together.
With the computer revolution, a vast amount of digital data has become available. With the Internet and smart connected products, the data is growing exponentially. It is estimated that every year more data is generated than in all history prior, and this has repeated for several years running.
With all this data, it becomes a platform for something new of its own. In this lecture we look at what big data is and at several examples of how to use data. There are many well-known algorithms for analysing data, such as clustering and machine learning.
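Clustering, one of the algorithms mentioned, can be illustrated with a naive one-dimensional k-means (a toy sketch for intuition; real big-data pipelines use distributed implementations):

```python
def kmeans_1d(points, centers, iterations=10):
    """Naive 1-D k-means: assign each point to its nearest centre,
    then recompute each centre as the mean of its assigned points."""
    for _ in range(iterations):
        clusters = {c: [] for c in range(len(centers))}
        for p in points:
            nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        # Empty clusters keep their old centre instead of dividing by zero.
        centers = [sum(v) / len(v) if v else centers[i]
                   for i, v in clusters.items()]
    return centers
```

On data with two obvious groups, the centres settle on the group means after a couple of iterations.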
After the computing industry got started, a new problem quickly emerged: how do you operate these machines, and how do you program them? The development of operating systems was relatively slow compared to the advances in hardware. The first systems were primitive but slowly got better as demand for computing power increased. The ideas of the graphical user interface, or GUI ("gooey"), go back to Doug Engelbart's Demo of the Century. However, this did not have much impact on the computer industry at the time. One company, though, the photocopier maker Xerox, explored these ideas at its Palo Alto Research Center. Steve Jobs of Apple and Bill Gates of Microsoft took notice, and Apple introduced first the Lisa and then the Macintosh.
In this lecture we look at lessons from the development of algorithms and software, and see how our business theories apply.
In the second part we look at where software is going, namely artificial intelligence. Recent developments in AI are causing an AI boom, and new AI applications are appearing all the time. We look at machine learning and deep learning to get an understanding of the current trends.
We are currently living in times of great transformation. Over the last couple of decades we have seen the Internet become the most powerful disrupting force in the world, connecting everyone and transforming businesses. Now everyday objects - the things we use - are getting smart with sensors and software. And they are connecting. What does this mean?
We will see the world become alive. Cars will talk to road sensors that talk to systems that guide traffic. Plants will talk to weather systems that talk to scientists who research climate change. Farming fields will talk to the farming system that talks to robots that do fertilising and harvesting. Home appliances like refrigerators, ovens, coffee machines and microwave ovens will talk to the home food and cooking system, which will inform the store that you are running out of butter, cheese, laundry detergent and coffee beans, which will inform the robot driver to get these to your house after consulting your calendar about when someone is at home.
In this lecture we explore the Internet of Things, IoT.
The Internet grew out of US efforts to build the ARPANET, a network of peer computers built during the Cold War. The two major players were the military and academia. The network was simple and required no effort on security or social responsibility; the early Internet community consisted mainly of highly educated and respectable scientists. In the early 1990s the World Wide Web, a hypertext system, was introduced, and soon browsers started to appear, leading to the commercialization of the Net. New businesses emerged, along with a technology boom known as the dot-com era.
The network, now over 40, is being stretched. Problems such as spam, viruses, antisocial behaviour, and demands for more content are prompting reinvention of the Net and threatening its neutrality. Add to this government efforts to regulate and limit the network.
In this lecture we look at the Internet and the impact of the network. We will also look at the future of the Internet.
- Mobile phones are now the most common device in the world, with over 8.5 billion connections globally as of 2017.
- The development of mobile phones was enabled by earlier innovations in electromagnetism and radio in the late 19th century, but mobile phones did not become practical until the 1980s with the invention of the microchip.
- Mobile technology has advanced through generations from analog 1G networks in the 1980s, to digital 2G networks in the 1990s incorporating texting, and 3G packet switched networks in the 2000s enabling more data and applications.
Did you know that the term "computer" once meant a profession? And what did these people, the computers, actually do? They computed mathematical problems. Some problems were tedious and error-prone, and it is not surprising that people started to develop machines to aid in the effort. The first mechanical computers were actually created to get rid of errors in human computation. Then came tabulating machines and cash registers. It was not until telephone companies were well established that computing machines became practical.
The first computers were huge mainframes, but soon minicomputers like DEC's PDP started to appear. The transistor was introduced in 1947, but its usefulness was not truly realized until 1958, when the integrated circuit was invented. This led to the invention of the microprocessor. Intel, in 1971, marketed the 4004, and the personal computer revolution started. One of the first personal computers was MITS' Altair. This was a simple device, and soon others saw the opportunities.
In this lecture we start our coverage of computing and look at some of the early machines and the impact they had.
Software is changing the way traditional businesses operate. People now have smartphones in their pockets - supercomputers 25,000 times more powerful than the minicomputers of the 1960s. This is changing people's behaviour and how they shop and use services. The organisational structure created in the 20th century cannot survive when new digital solutions are being offered. The hierarchical structure of these established companies assumes high coordination costs due to human activity. But when the coordination cost drops, that assumption no longer holds.
The organisational structure that companies established in the 20th century was based on the fact that employees needed to do all the work. The coordination cost was high due to the effort and cost of employees, housing and so on. Now we have software that can do this for us, and the coordination cost drops to close to zero. Another thing is that things become free. Consider Flickr. Anybody can sign up and use the service for free; only a fraction of the users get a pro account and pay. How can Flickr make money on that? It turns out that services like this can.
Many businesses make money by giving things away. How can that possibly work? The music business has suffered severely with digital distribution of content. Should musicians put all their songs on YouTube? What is the future business model for music?
One of the great ironies of successful companies is how easily they can fail. New companies are founded to take advantage of some new technology. They become highly successful, but when the technology shifts and something new comes along, they are unable to adapt and fail. This is the innovator's dilemma.
Then there are companies that manage to survive. For example, Kodak survived two platform shifts, only to fail at the third. IBM has survived for over 100 years. What do successful companies do differently?
History has many examples of great innovators who had a difficult time convincing their contemporaries of new technology. Even incumbent and powerful companies regarded new technologies as inferior and dismissed them as "toys". Then, when disruptive technologies take off, they are often overhyped and can cause bubbles like the Internet bubble of the late 1990s.
In this lecture we look at some examples of disruptive technologies and the impact they had. We look at the Disruptive Innovation Theory of Harvard professor Clayton Christensen.
Technology evolves in big waves that we call revolutions. The first was the Industrial Revolution, which started in Britain in 1771. Since then we have seen more revolutions come, and we are now in the fifth. These revolutions follow a similar path. First there is an installation period, where the new technologies are installed and deployed, creating wealth for those who were at the right place at the right time. This is followed by a frenzy, where the financial markets want to be a part. Then there is a crash and a turning point, followed by synergy, a golden age.
In 1908 a new technological revolution started: the Age of Oil and the Automobile. The technology trigger was Henry Ford's new assembly-line technique, which allowed the manufacturing of standardized, low-cost automobiles. This created the car industry and other manufacturing companies. It also created demand for gas, thus creating the oil industry. During the Roaring Twenties stock prices rose to new levels, until a crash and the Great Depression. Only after World War II came a turning point, followed by a golden age in the post-war boom.
In this lecture we look at a framework for understanding technological revolutions. These revolutions completely change societies and replace the old with new technologies. We will explore how these revolutions take place. We should now be in the golden-age phase.
We also look at generations.
In the early days of product development, the technology is inferior and lacking in performance. The focus is very much on the technology itself. The users are enthusiasts who like the idea of the product, find use for it, and accept the lack of performance. Then, as the product becomes more mature, other factors become important, such as price, design, features and portability. The product moves from being a technology to becoming a consumer item, and even a community.
In this lecture we explore the change from technology focus to consumer focus, and look at why people stand in line overnight to buy the latest gadgets.
This document summarizes a lecture about the diffusion of innovation. It discusses how new ideas are developed through collaboration and exchange. It also discusses how innovations diffuse slowly at first, gaining momentum over time as they are adopted by pragmatists and conservatives seeking convenient solutions. The rate of adoption follows an S-curve, with innovators and enthusiasts driving early adoption and the mass market adopting later. Customers' motivations for adoption change over time, initially valuing the innovation's benefits and later valuing its functionality. Factors like network effects, convenience, and compatibility influence adoption rates.
Climate Impact of Software Testing at Nordic Testing Days - Kari Kakkonen
My slides at Nordic Testing Days 6.6.2024
The climate impact and sustainability of software testing are discussed in the talk. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be extended with sustainability and then measured continuously. Test environments can be used less, at smaller scale and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0! - SOFTTECHHUB
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware and post-processing.
van Emden shows how Nx can simplify the developer's life and facilitate a rapid transition from concept to production-ready applications. He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
Pushing the limits of ePRTC: 100ns holdover for 100 days - Adtran
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor... - Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
UiPath Test Automation using UiPath Test Suite series, part 5 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series, part 5. In this session, we will cover CI/CD with DevOps.
Topics covered:
CI/CD within UiPath
End-to-end overview of a CI/CD pipeline with Azure DevOps
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf - Malak Abu Hammad
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
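Under the hood, vector search ranks documents by the similarity of their embedding vectors to a query vector. A minimal sketch of that idea (illustrative only; MongoDB Atlas exposes this through its own query API, and the field names here are assumptions):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def vector_search(query, docs, k=2):
    """Return the titles of the k documents whose embeddings are closest to the query."""
    ranked = sorted(docs, key=lambda d: cosine(query, d["embedding"]), reverse=True)
    return [d["title"] for d in ranked[:k]]
```

A production system replaces the linear scan with an approximate nearest-neighbour index, but the ranking principle is the same.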
#MongoDB #VectorSearch #AI #SemanticSearch #TechInnovation #DataScience #LLM #MachineLearning #SearchTechnology
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slack - shyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Introducing Milvus Lite: Easy-to-Install, Easy-to-Use vector database for you... - Zilliz
Join us to introduce Milvus Lite, a vector database that can run on notebooks and laptops, share the same API with Milvus, and integrate with every popular GenAI framework. This webinar is perfect for developers seeking easy-to-use, well-integrated vector databases for their GenAI apps.
Essentials of Automations: The Art of Triggers and Actions in FME - Safe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ... - James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Enchancing adoption of Open Source Libraries. A case study on Albumentations.AIVladimir Iglovikov, Ph.D.
Presented by Vladimir Iglovikov:
- https://www.linkedin.com/in/iglovikov/
- https://x.com/viglovikov
- https://www.instagram.com/ternaus/
This presentation delves into the journey of Albumentations.ai, a highly successful open-source library for data augmentation.
Created out of a necessity for superior performance in Kaggle competitions, Albumentations has grown to become a widely used tool among data scientists and machine learning practitioners.
This case study covers various aspects, including:
People: The contributors and community that have supported Albumentations.
Metrics: The success indicators such as downloads, daily active users, GitHub stars, and financial contributions.
Challenges: The hurdles in monetizing open-source projects and measuring user engagement.
Development Practices: Best practices for creating, maintaining, and scaling open-source libraries, including code hygiene, CI/CD, and fast iteration.
Community Building: Strategies for making adoption easy, iterating quickly, and fostering a vibrant, engaged community.
Marketing: Both online and offline marketing tactics, focusing on real, impactful interactions and collaborations.
Mental Health: Maintaining balance and not feeling pressured by user demands.
Key insights include the importance of automation, making the adoption process seamless, and leveraging offline interactions for marketing. The presentation also emphasizes the need for continuous small improvements and building a friendly, inclusive community that contributes to the project's growth.
Vladimir Iglovikov brings his extensive experience as a Kaggle Grandmaster, ex-Staff ML Engineer at Lyft, sharing valuable lessons and practical advice for anyone looking to enhance the adoption of their open-source projects.
Explore more about Albumentations and join the community at:
GitHub: https://github.com/albumentations-team/albumentations
Website: https://albumentations.ai/
LinkedIn: https://www.linkedin.com/company/100504475
Twitter: https://x.com/albumentations
2. Reading
Fowler 3: Mapping to Relational Databases
– The Behavioral Problem
Fowler 12: Object-Relational Behavioral
Patterns
– Unit of work
– Identity Map
– Lazy Load
3. Agenda
Error handling
The Behavioral Problem
Object-Relational Behavioral Patterns
– Unit of work
– Identity Map
– Lazy Load
Object-Relational Mapping
5. Error handling
Important aspect of programming
– Programming the best case is usually easy
– Making programs robust is another matter
Empty catch-blocks are usually not acceptable
– Can make things worse, since the error gets lost
– System.out.println is usually not practical
How to handle an exception
– Log the exception
– Create a new exception and throw it
– Ignore it and have upper layers handle the exception
6. Some guidelines
If you cannot handle an exception, don’t catch it
If you catch an exception, don’t eat it
If you need to handle an exception
– Log with useful information
– Catch it where you can do something with it
Use domain-specific exceptions
– Removes dependencies
– Example: Should SQLException be handled in the web
layer if there is a duplicate row in the database?
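As a sketch, a domain-specific exception like the RuDuplicateDataException used in the later example can be as small as this (the class body here is an assumption, not taken from the slides):

```java
// Hypothetical domain-specific unchecked exception. Wrapping the
// low-level exception keeps upper layers free of JDBC/SQL dependencies
// while preserving the original cause for logging.
class RuDuplicateDataException extends RuntimeException {
    RuDuplicateDataException(String message, Throwable cause) {
        super(message, cause);
    }
}
```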
7. Exception Handling
Exceptions flow through layers
– Catch exception at the source and throw a domain specific
exception
– Upper layers will handle the error
Example: Add User
– Table Data Gateway add method catches a duplicate
database exception
– Throw domain specific exception
– Each layer will ignore the exception, just pass it through
– Web layer decides to display message to user saying the
username chosen is already taken
9. Types of Exceptions
Unchecked
– Can occur at any time
– For example
• OutOfMemoryError, NullPointerException
Checked
– Part of the declaration; must be handled or explicitly
passed on to the caller
public static String readFirstLine(String filename)
throws IOException
{ ...
10. Unexpected Exceptions
Problem with checked exceptions
– Too much code – unnecessary try-catch blocks
– Hard-to-read code – difficult to see the real code
– The real error can get lost
– Dependencies
Guidelines
– Use checked exception if caller must deal with the
problem, the exception has direct consequences to the
computation
– In layered systems, if the calling layer will not be able to
do anything, log and throw an unchecked exception
– The layer controlling the flow will handle it
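A minimal sketch of this guideline, using a hypothetical readConfig helper (not from the slides): the checked IOException is wrapped in an unchecked exception so intermediate layers need no throws clauses, and the layer controlling the flow handles the failure.

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Paths;

class ConfigReader {
    // The immediate caller can't do anything useful about a missing
    // config file, so the checked IOException is wrapped in an
    // unchecked one and rethrown for an upper layer to handle.
    static String readConfig(String path) {
        try {
            return new String(Files.readAllBytes(Paths.get(path)));
        } catch (IOException e) {
            throw new UncheckedIOException("Unable to read config: " + path, e);
        }
    }
}
```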
12. Example
public class UserInserter extends SqlUpdate
{
...
public int insert(User user)
{
int rows = 0;
try {
rows = update(new Object[] {
user.getUsername(), user.getName(),
user.getEmail(), user.getPassword(), });
}
catch (DataIntegrityViolationException divex)
{
String msg = "User '" + user.getUsername() +
"' is already registered.";
log.info(msg);
throw new RuDuplicateDataException(msg, divex);
}
13. Example
catch (Throwable t)
{
String msg = "Unable to access Database: cause: " +
t.getMessage();
log.severe(msg);
throw new RuDataAccessException(msg, t);
}
return rows;
}
}
14. UserDataGateway
Do not need to handle the exception
public class UserData extends RuData implements UserDataGateway
{
UserInserter userInserter = ...
public void addUser(User user)
{
userInserter.insert(user);
}
...
}
public interface UserDataGateway extends RuDataGateway
{
User findUser(int id);
Collection findByName(String name);
void addUser(User user);
void updateUser(User user);
void deleteUser(int id);
}
15. QUIZ
Which of these statements is not true?
A) Checked exceptions must always be handled by the caller
B) In layered systems, each layer must handle exceptions
✔ C) Unchecked exceptions are never handled
D) Checked exceptions require more coding
17. The Behavioral Problem
Object-Relational Mapping
– How you relate tables to objects
The Data Source Layer patterns are structural
patterns – their focus is on structure
– Row Data Gateway
– Table Data Gateway
– Active Record
– Data Mapper
They simply tell you how to load and save objects to
tables
– What if you maintain these objects in-memory?
18. The Behavioral Problem
How to get various objects to load and save
themselves to the database
– With objects in memory, how can we keep track of
modified objects?
– What if we have two copies of the same object in
memory and both are changed?
– How can we maintain consistency and data integrity?
– What if you need an object that is already in memory?
19. Keeping track of changed Objects
Simple way is to have an object that keeps track
of other objects
– Unit of Work
The idea is this
– When an object is loaded it is registered as “clean” in
the UoW
– If modified, it is marked “dirty”
– When writing objects back, write only the dirty ones
20. Keeping track of loaded Objects
What if you need an object from the database –
is it already loaded? And changed?
– Identity Map
The idea is this
– Keep all loaded objects in a map and get them from
the map when needed
– If they are not in the map, load them from the
database
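The idea above can be sketched in a few lines (class and method names are illustrative, not from the textbook); the loader function stands in for a database read:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal Identity Map sketch. find() hands back the already-loaded
// instance when one exists; otherwise it loads the object once and
// remembers it, so each key maps to exactly one in-memory object.
class IdentityMap<K, V> {
    private final Map<K, V> loaded = new HashMap<>();
    private final Function<K, V> loader; // stands in for a database read

    IdentityMap(Function<K, V> loader) { this.loader = loader; }

    V find(K key) {
        return loaded.computeIfAbsent(key, loader);
    }
}
```

Repeated calls to find(1) return the same instance and hit the "database" only once.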
21. Loading Objects
For rich data models, what about loading object
hierarchies?
– Do we need to load all linked objects?
– Lazy Load
The idea is this
– We load part of the object but maintain a placeholder
that loads the rest of the object when it is needed
22. Unit of Work
Maintains a list of objects affected by a business
transaction and coordinates the writing out of
changes and the resolution of concurrency
problems
Keeps track of objects that are moved in and out
of the database
– What has changed?
23. Unit of Work
How It Works
– Unit of Work is an object that tracks all changes to the
database
– As soon as something affects the database, tell the Unit of
Work
– The Unit of Work must know the state of objects
• Upon committing, the Unit of Work decides what to do
• Application programmers don’t have to know what to
write to the database
Two methods
– Caller registration
– Object registration
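A minimal Unit of Work sketch (class and method names are illustrative, not from the textbook). Calling registerClean/registerDirty from outside corresponds to caller registration; with object registration, the domain object's own setters would call registerDirty(this) instead:

```java
import java.util.LinkedHashSet;
import java.util.Set;

// Minimal Unit of Work sketch. Loaded objects are registered as clean;
// changed objects are marked dirty; commit() returns only the dirty
// objects - the ones a real implementation would write to the database.
class UnitOfWork {
    private final Set<Object> clean = new LinkedHashSet<>();
    private final Set<Object> dirty = new LinkedHashSet<>();

    void registerClean(Object o) { clean.add(o); }

    void registerDirty(Object o) {
        clean.remove(o);
        dirty.add(o);
    }

    // Returns the objects that need to be written back.
    Set<Object> commit() {
        Set<Object> toWrite = new LinkedHashSet<>(dirty);
        dirty.clear();
        return toWrite;
    }
}
```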
24. Unit of Work
Caller Registration
– User of the object has to remember to register the
object with the Unit of Work for changes
25. Unit of Work
Object Registration
– The object must register itself with the Unit of work
26. Unit of Work
When to Use It
– When you have in-memory objects you need to
synchronize with the database
– When you have many updates to objects and you
want to avoid unneeded calls to the database to save
the object
Benefits
– Keeps the state of object in one place
27. Identity Map
Ensures that each object gets loaded only once by
keeping every loaded object in a map. Looks up
objects using the map when referring to them
Keeps a record of all the objects that have been
read
28. Identity Map
How It Works
– Contains a map of all loaded objects
– Provides method to get the objects
Choice of Key
– Usually the primary key
Explicit or Generic
– Explicit Identity Maps have a find method per type of
object
• Person findPerson(1)
– Generic Identity Maps have generic objects and keys
• Object find(“person”, 1)
29. Identity Map
How Many
– One map per class or one per session
– A session map works for database-unique keys
– With multiple maps, maintain one per class or per table
Where to put them
– Identity maps need to be somewhere
– Can be part of Unit of work
– Can be in a Registry
Identity Maps can be used as cache
– Works well if objects are read-only
30. Identity Map
When to Use It
– When you need to load objects to memory and you
don’t want them duplicated
– Main benefit of Identity Map is avoiding problems
when object is updated in-memory
– For immutable objects, such as Value Objects, an
Identity Map is not needed – objects may be duplicated
Performance
– When you need caching of objects for performance
31. Lazy Load
An object that doesn’t contain all of the data you
need but knows how to get it
Load only the data that is needed
– Load the rest when it is needed
32. Lazy Load
How It Works
– Object can contain other objects and associations
– Loading all the data might be too much
– Lazy Load delays loading until the objects are
needed
Four ways to implement Lazy Load
– Lazy Initialization
– Virtual Proxy
– Value Holder
– Ghost
33. Lazy Load
Lazy Initialization
– Uses a special marker value (usually null) to indicate
a field isn't loaded
– Every access to the field checks the field for the
marker value and if unloaded, loads it
class Supplier...
public List getProducts() {
if (products == null)
products = Product.findSupplier(getId());
return products;
}
34. Lazy Load
Virtual Proxy
– An object with the same interface as the real object
– The first time one of its methods is called, it loads the
real object and then delegates
class VirtualList...
private List source;
private VirtualListLoader loader;
public VirtualList(VirtualListLoader loader) {
this.loader = loader;
}
private List getSource() {
if (source == null) source = loader.load();
return source;
}
public int size() {
return getSource().size();
}
35. Lazy Load
Value Holder
– An object with a getValue method
– Clients call getValue to get the real object, the first
call triggers the load
class SupplierVH...
private ValueHolder products;
public List getProducts() {
return (List)products.getValue();
}
class ValueHolder…
private Object value;
...
public Object getValue() {
if (value==null) value = loader.load();
return value;
}
36. Lazy Load
A ghost
– The real object without any data
– The first time you call a method the ghost loads the
full data into its fields
class DomainObject...
protected void Load() {
if(IsGhost())
DataSource.load(this);
}
Class Employee...
public String Name {
get {
Load();
return _name;
}
set {
Load();
_name = value;
}
}
String _name;
37. Lazy Load
When to Use It
– When you have complex objects with associations
with other objects
– Need to decide how much to get on a hit and how
many hits we want
– Rule might be to bring in everything you need in one
call
• The overhead of taking extra fields in the table is not that
high
– The best time to use Lazy Load is when it involves an
extra call and the data you’re calling isn’t used when
the main object is used
38. QUIZ
We are writing a business application which uses a fairly large
data set. We only need to update a few objects, and writing them
all back to the database is too expensive. What pattern can we use?
A) Lazy Load
B) Identity Map
✔ C) Unit of Work
D) Data Mapper
40. Object Relational Mapping (ORM)
Use a mapping layer to map between objects
and tables
– Mapping a data representation from an object model
to a relational data model with a SQL-based schema
Mapping requires metadata
– XML
Authoring and maintaining metadata is less work
than writing and maintaining SQL
41. Advantages of ORM
Can radically reduce the amount of code you
need to write
– Around 30% less code than with JDBC for a
server-side application
More Productivity
Applications are easier to maintain
Fosters thinking about an OO domain model
42. Disadvantages of ORM
Some loss of control over the persistence
process
May be more difficult to tune queries
Performance characteristics of the tool may
affect your application’s performance
43. When to use ORM?
Well-suited to ORM
– Read-modify-write lifecycle
– Little requirement for stored procedures
Poorly suited to ORM
– “Window on data” application
– Significant use of stored procedures
– Write centric apps, where data is seldom read
44. When to use ORM?
Typical server-side applications are fairly well
suited for ORM
– 90%-95% of applications
– But there are always some special cases
– Mix and match as needed
46. Hibernate
Object/relational mapping tool
– A persistence service that stores Java objects in
relational databases
– Provides an object oriented view of existing relational
data
Uses reflection and XML mapping files to
persist POJOs
– No changes to business domain objects
– The goal is to relieve the developer from a significant
amount of common data persistence-related
programming tasks
49. Database Properties
File
– hibernate.properties
hibernate.connection.username=andri
hibernate.connection.password=abc123
hibernate.connection.url=jdbc:jtds:sqlserver://honn.ru.is:1433
hibernate.connection.driver_class=net.sourceforge.jtds.jdbc.Driver
Contains information to access the database
– Username and password
– URL
– Database driver
Hibernate will automatically read the file from the
classpath
50. Mapping File
File
– The Hibernate mapping file (Nemandi.hbm.xml by convention)
– In the same package as the Nemandi class
<hibernate-mapping>
<class name="is.ru.honn.domain.Nemandi" table="NEMENDUR">
<id name="kennitala" column="kennitala" type="string">
</id>
<property name="nafn" column="nafn" type="string"
length="64" not-null="false"/>
<property name="netfang" column="netfang" type="string"
length="64" not-null="false"/>
<property name="hopur" column="hopur" type="string"
length="32" not-null="false" />
</class>
</hibernate-mapping>
51. Using Hibernate
Usually an application will
– Create a single Configuration
– Build a single instance of SessionFactory
– Then instantiate Session objects
Configuration cfg = new Configuration();
cfg.addClass(theClass);
SessionFactory factory = cfg.buildSessionFactory();
Session session = factory.openSession();
52. Using Hibernate
Configuration
– Allows the application to specify properties and
mapping documents to be used when creating a
SessionFactory
SessionFactory
– Factory class to create Session objects
Session
– Interface representing a conversation between the
application and the database
– The main function is to offer create, read and delete
operations for instances of mapped entity classes
53. Example
NemandiGateway
public interface NemandiGateway
{
public Nemandi findNemandi(String kennitala);
public Collection getNemendur();
public void addNemandi(Nemandi nemandi);
}
54. Example
NemandiData
– Constructor creates the configuration and the factory
– Variable factory is used when a Session is needed
public class NemandiData implements NemandiGateway
{
SessionFactory factory = null;
public NemandiData()
{
Configuration cfg = new Configuration();
cfg.addClass(Nemandi.class);
factory = cfg.buildSessionFactory();
}
55. Example
NemandiData
– findNemandi
public Nemandi findNemandi(String kennitala)
{
Session session = factory.openSession();
Nemandi nem = (Nemandi)session.get(Nemandi.class, kennitala);
session.close();
return nem;
}
56. Example
NemandiData
– getNemendur
public Collection getNemendur()
{
Session session = factory.openSession();
List l = session.createQuery(
"SELECT n FROM is.ru.honn.domain.Nemandi AS n").list();
session.close();
return l;
}
– Uses the Hibernate Query Language, HQL
58. Summary
The Behavioral Problem
– When objects are used
Object-Relational Behavioral Patterns
– Unit of work
– Identity Map
– Lazy Load
Object-Relational Mapping