The document discusses database management systems and data modeling. It begins by defining key terms like data, databases, database management systems, and data models. It then provides a brief history of database development from the 1960s to the 1980s. The rest of the document discusses database concepts in more detail, including components of a DBMS, types of database users, database administration responsibilities, data modeling techniques, and the evolution of different data models.
● Data Modeling and Data Models.
● Business Rules (Translating Business Rules into Data Model Components).
● Emerging Data Models: Big Data and NoSQL.
● Degrees of Data Abstraction (External, Conceptual, Internal and Physical model).
A database is a collection of information that is organized so that it can easily be accessed, managed, and updated. In one view, databases can be classified according to types of content: bibliographic, full-text, numeric, and images.
DBMS and its Models
1. By: Ahmad Shah Sultani
MSc CS/IT Scholar
South Asian University
New Delhi – India
August 2013
2. Data: Known facts that can be recorded and have an implicit meaning.
Database: A collection of related data.
Mini-world: Some part of the real world about which data is stored in a database. For example, student grades and transcripts at a university.
Database Management System (DBMS): A software package/system to facilitate the creation and maintenance of a computerized database.
Database System: The DBMS software together with the data itself. Sometimes, the applications are also included.
3. A DBMS consists of a group of programs that manipulate the database and provide an interface between the database, the user of the database, and other application programs.
4. 1960s – First general-purpose DBMS designed by Charles Bachman at GE: the Integrated Data Store (IDS).
Late 1960s – IBM developed its Information Management System (IMS), a hierarchical DBMS.
1970 – Codd introduced the relational model.
1980s – The relational model became popular and was accepted as the main database paradigm. SQL, ANSI SQL, etc.
1980 to 1990 – New data models, powerful query languages, etc. Popular vendors include Oracle, SQL Server, IBM's DB2, Informix, etc.
5. A company has 500 GB of data on employees, departments, products, sales, and so on.
Data is accessed concurrently by several employees.
Queries about the data must be answered quickly.
Changes made to the data by different users must be applied consistently.
Access to certain parts of the data must be restricted.
6. Data stored in operating system files
Many drawbacks!!!
500 GB of main memory is not available to hold all the data; data must be stored on secondary storage devices.
Even if 500 GB of main memory were available, with 32-bit addressing we cannot refer directly to more than 4 GB of data.
Data redundancy and inconsistency: multiple file formats, duplication of information in different files.
A special program is needed to answer each query a user may ask.
7. Many drawbacks!!!
Integrity problems:
o Integrity constraints (e.g. account balance > 0) become "buried" in program code rather than being stated explicitly.
o Hard to add new constraints or change existing ones.
We must protect the data from inconsistent changes made by different users. If application programs need to address concurrency, their complexity increases manifold.
A consistent state of the data must be restored if the system crashes while changes are being made.
The OS provides only a password mechanism for security, which is not flexible enough if users have permission to access only subsets of the data.
8. Define a database: in terms of data types, structures and constraints.
Construct or load the database on a secondary storage medium.
Manipulate the database: querying, generating reports, insertions, deletions and modifications to its content.
Concurrent processing and sharing by a set of users and programs, yet keeping all data valid and consistent.
Crash recovery
Data security and integrity
Data dictionary
Performance
9. [Architecture diagram: SQL commands arriving from web forms, front ends, and the SQL interface are processed by the SQL engine (parser, optimizer, operator evaluator, plan executor), supported by the transaction manager, lock manager, and recovery manager; beneath the engine, the files-and-access layer, buffer manager, and disk space manager read and write the data files and catalog that make up the database.]
11. Casual users
These are people who use the database occasionally.
Naive users
These are users who are constantly querying and updating the database.
E.g. reservation clerks of airlines, railways, hotels, etc.; clerks at receiving stations of courier services, insurance agencies, etc.
Sophisticated users
People who use the database for their complex requirements.
E.g. engineers, scientists, business analysts…
Standalone users
Those who maintain a database for personal use.
12. Managing resources
Creation of user accounts
Providing security and authorization
Managing poor system response time
System Recovery
Tuning the Database
13. DDL – Data Definition Language
SDL – Storage Definition Language
VDL – View Definition Language
DML – Data Manipulation Language (for data manipulations like insertion, deletion, update, retrieval, etc.)
14. Various types of data: images, text, complex queries, data mining, etc.
Enterprise Resource Planning (ERP)
Manufacturing Resource Planning (MRP)
Databases in web technologies
Banking: all transactions
Airlines: reservations, schedules
Universities: registration, grades
Current database trends:
Multimedia databases
Interactive video
Streaming data
Digital libraries
Databases touch all aspects of our lives.
15. Program-Data Independence
Insulation between programs and data: allows changing data storage structures and operations without having to change the DBMS access programs.
Efficient Data Access
The DBMS uses a variety of techniques to store and retrieve data efficiently.
Data Integrity & Security
Before inserting the salary of an employee, the DBMS can check that the department budget is not exceeded.
It enforces access controls that govern what data is visible to different classes of users.
16. Data Administration
When several users share data, centralizing the administration offers significant improvements.
Concurrent Access & Crash Recovery
The DBMS schedules concurrent access to the data in such a manner that users can think of the data as being accessed by only one user at a time.
The DBMS protects users from the ill effects of system failures.
Reduced Application Development Time
Many important tasks are handled by the DBMS.
17. What is a data model?
Why are data models important?
Basic data-modeling building blocks.
What are business rules and how do they influence database design?
How did the major data models evolve?
How can data models be classified by level of abstraction?
18. Data Model: A set of concepts to describe the structure of a database, and certain constraints that the database should obey.
Data Model Operations: Operations for specifying database retrievals and updates by referring to the concepts of the data model. Operations on the data model may include basic operations and user-defined operations.
19. Entity
◦ Anything about which data will be collected/stored
Attribute
◦ Characteristic of an entity
Relationship
◦ Describes an association among entities
One-to-one (1:1) relationship
One-to-many (1:M) relationship
Many-to-many (M:N or M:M) relationship
Constraint
◦ A restriction placed on the data
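The building blocks above map naturally onto a relational schema: entities become tables, attributes become columns, relationships become foreign keys, and constraints restrict the data. Below is a minimal sketch using Python's built-in sqlite3 module; the department/employee tables, column names, and sample values are illustrative assumptions, not taken from any specific slide.

```python
import sqlite3

# In-memory database to sketch the building blocks.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

# Entity DEPARTMENT with attributes dept_num and dept_name.
conn.execute("""
    CREATE TABLE department (
        dept_num  TEXT PRIMARY KEY,
        dept_name TEXT NOT NULL
    )""")

# Entity EMPLOYEE; the dept_num foreign key implements a 1:M relationship
# (one department, many employees). The CHECK clause is a constraint.
conn.execute("""
    CREATE TABLE employee (
        emp_no   INTEGER PRIMARY KEY,
        name     TEXT NOT NULL,
        salary   REAL CHECK (salary > 0),
        dept_num TEXT REFERENCES department(dept_num)
    )""")

conn.execute("INSERT INTO department VALUES ('2A', 'Research')")
conn.execute("INSERT INTO employee VALUES (100, 'Ahmad', 50000, '2A')")

# The constraint rejects an illegal value (a negative salary) as an error
# rather than letting bad data in.
try:
    conn.execute("INSERT INTO employee VALUES (101, 'Zobair', -1, '2A')")
    constraint_enforced = False
except sqlite3.IntegrityError:
    constraint_enforced = True
```

Running the block leaves one valid employee row; the rejected insert shows a constraint turning an illegal move into an error.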
20. Data models are representations, usually graphical, of complex real-world data structures.
They facilitate interaction among the designer, the applications programmer and the end user.
End users have different views of and needs for data.
A data model organizes data for various users.
21. Brief, precise and unambiguous descriptions of policies, procedures or principles within the organization.
They apply to any organization that stores and uses data to generate information.
Descriptions of operations that help to create and enforce actions within that organization's environment.
23. Conceptual (high-level, semantic) data models: provide concepts that are close to the way many users perceive data. (Also called entity-based or object-based data models.)
Physical (low-level, internal) data models: provide concepts that describe details of how data is stored in the computer.
Implementation (representational/external) data models: provide concepts that fall between the above two, balancing user views with some computer storage details.
24. The American National Standards Institute (ANSI) Standards Planning and Requirements Committee (SPARC) developed, in the 1970s, a framework for data modeling based on degrees of data abstraction:
External
Conceptual
Internal
Physical
25. Each end user's view of the data environment.
The modeler subdivides requirements and constraints into functional (business unit's) modules.
These can be examined within the framework of their external models.
26. Easy to identify the specific data required to support each business unit's operations.
Facilitates the designer's job by providing feedback about the model's adequacy.
Creation of external models helps to identify and ensure security constraints in the database design.
Simplifies application program development.
27. Global view of the entire database.
Representation of data as viewed by the entire organization.
Basis for identification and high-level description of main data objects, avoiding details.
28. Software and hardware independent
◦ Independent of DBMS software
◦ Independent of hardware to be used
◦ Changes in either hardware or DBMS software have no effect on the database design at the conceptual level
The most widely used conceptual model is the Entity Relationship (ER) model
◦ Provides a relatively easily understood macro-level view of the data environment
29. The database as "seen" by the DBMS
Maps the conceptual model to the DBMS
Depicts a specific representation of an internal model
Logical independence
◦ The internal model can be changed without affecting the conceptual model
30. Lowest level of abstraction
◦ Describes the way data are saved on storage media such as disks or tapes
Software and hardware dependent
◦ Requires database designers to have a detailed knowledge of the hardware and software used to implement the database design
Physical independence
◦ The physical model can be changed without affecting the internal model
32. SALIENT FEATURES
Logically represented by an upside-down tree.
Each parent can have many children.
Each child has only one parent.
The top layer is perceived as the parent of the segment directly beneath it.
The segments below other segments are the children of the segment above them.
35. PARENT (EMPLOYEE)

Emp No. | First Name | Last Name | Dept Num
100     | Ahmad      | Rashad    | 2A
101     | Zobair     | Nabizada  | 2B
102     | Chattan    | Kumar     | 2C
103     | David      | Moorey    | 2D

CHILD (EQUIPMENT)

Serial No.  | Type     | User Emp No.
3009734-4   | Computer | 100
3-23-283742 | Monitor  | 100
2-22-723423 | Monitor  | 100
232342      | Printer  | 100
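The parent-child structure above can be sketched as a tree, the defining shape of the hierarchical model: each child segment sits under exactly one parent. This is an illustrative Python sketch using plain dictionaries, not actual IMS segments; the field names mirror the EMPLOYEE/EQUIPMENT example.

```python
# One parent segment (an employee) with its child segments (equipment).
# Each child appears under exactly one parent, as the hierarchical model
# requires; field names mirror the tables above.
employee_100 = {
    "emp_no": 100, "first_name": "Ahmad", "last_name": "Rashad",
    "dept_num": "2A",
    # Child segments directly beneath the parent segment.
    "equipment": [
        {"serial_no": "3009734-4", "type": "Computer"},
        {"serial_no": "3-23-283742", "type": "Monitor"},
        {"serial_no": "2-22-723423", "type": "Monitor"},
        {"serial_no": "232342", "type": "Printer"},
    ],
}

def children_of(parent_segment):
    """Return the child segments of a parent; each child has one parent."""
    return parent_segment["equipment"]
```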
37. Complex implementation
Difficult to manage and lack of standards
Lacks structural independence
Application programming and use complexity
Implementation limitations (no M:N relationships)
39. Developed in the mid-1960s as part of the work of CODASYL (Conference on Data Systems Languages), which proposed the programming language COBOL (1966) and then the network model (1971).
The network model has greater flexibility than the hierarchical model for handling complex spatial relationships.
The objective of the network model is to separate the data structure from physical storage and to eliminate unnecessary duplication of data, with its associated errors and costs.
The network database model was created for three main purposes:
- representing complex data relationships more effectively
- improving database performance
- imposing a database standard
40. A major characteristic of this database model is that it comprises at least two record types: the owner and the member.
An owner is a record type equivalent to the parent type in the hierarchical database model, and the member record type resembles the child type in the hierarchical model.
The network database model uses a data management language that defines data characteristics and the data structure in order to manipulate the data.
41. The network model contains logical information such as connectivity relationships among nodes and links, directions of links, and costs of nodes and links.
Example with diagram
42. A node represents an object of interest.
A link represents a relationship between two nodes. Within a directed network, any link can be bidirected (that is, able to be traversed either from the start node to the end node or from the end node to the start node) or unidirected (that is, able to be traversed only from the start node to the end node). Within an undirected network, all links are bidirected.
A path is an alternating sequence of nodes and links, beginning and ending with nodes, and usually with no nodes and links appearing more than once. (Repeating nodes and links within a path are permitted, but are rare in most network applications.)
43. A network is a set of nodes and links. A network is directed if the links that it contains are directed, and undirected if the links that it contains are undirected.
A logical network contains connectivity information but no geometric information. This is the model used for network analysis. A logical network can be treated as a directed or undirected graph, depending on the application.
Cost is a non-negative numeric attribute that can be associated with links or nodes for computing the minimum-cost path.
Duration is a non-negative numeric attribute that can be associated with links or nodes to specify a duration value for the link or node.
44. A network hierarchy enables us to represent a network with multiple levels of abstraction by assigning a hierarchy level to each node.
The lowest (most detailed) level in the hierarchy is level 1, and successive higher levels are numbered 2, 3, and so on.
Nodes at adjacent levels of a network hierarchy have parent-child relationships.
Each node at the higher level can be the parent node for one or more nodes at the lower level.
45. Each node at the lower level can be a child node of one node at the higher level.
Sibling nodes are nodes that have the same parent node.
Links can also have parent-child relationships. However, because links are not assigned to a hierarchy level, there is not necessarily a relationship between link parent-child relationships and network hierarchy levels.
Sibling links are links that have the same parent link.
46. In a typical road network, the intersections of roads are nodes and the road segments between two intersections are links. An important operation on a road network is to find the path from a start point to an end point, minimizing either the travel time or the distance. There may be additional constraints on the path computation, such as having the path go through a particular landmark or avoid a particular intersection.
In a biochemical process, metabolic pathways are networks involved in enzymatic reactions, while regulatory pathways represent protein-protein interactions. In this example, a pathway is a network; genes, proteins, and chemical compounds are nodes; and reactions among nodes are links.
47. The subway network of any major city is probably best modeled as a logical network, assuming that a precise spatial representation of the stops and track lines is unimportant. Important operations on a train network include finding all stations that can be reached from a specified station, finding the number of stops between two specified stations, and finding the travel time between two stations.
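The minimum-cost path operation in these examples can be sketched with Dijkstra's algorithm over a logical network. The adjacency-list format, function name, and sample road network below are illustrative assumptions; link costs are non-negative, as the model described above requires.

```python
import heapq

def shortest_path_cost(links, start, end):
    """Minimum total link cost from start to end in a directed logical
    network (Dijkstra's algorithm). `links` maps each node to a list of
    (neighbor, cost) pairs; returns None if end is unreachable."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == end:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for neighbor, cost in links.get(node, []):
            nd = d + cost
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return None

# Hypothetical road network: intersections are nodes, road segments are
# directed links, costs are travel times in minutes.
road_network = {
    "A": [("B", 5), ("C", 10)],
    "B": [("C", 3)],
    "C": [],
}
```

With this network, the cheapest route from A to C goes through B (5 + 3) rather than taking the direct 10-minute link.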
48. Simplicity: The network model is conceptually simple and easy to design.
Ability to handle more relationship types: The network model can handle both one-to-many and many-to-many relationships.
Ease of data access: In network database terminology, a relationship is a set. Each set comprises two types of records: an owner record and a member record. In a network model, an application can access an owner record and all the member records within a set.
49. Data Integrity: In a network model, no member can exist without an owner. A user must therefore first define the owner record and then the member record. This ensures integrity.
Data Independence: The network model draws a clear line of demarcation between programs and the complex physical storage details. The application programs work independently of the data; changes made to the data characteristics do not affect the application programs.
50. System Complexity: The structure of the network model is very difficult to change, and this type of system is very complex.
Lack of Structural Independence: This model lacks structural independence; any changes made to the database structure require the application programs to be modified before they can access data. The model should therefore be used only when a flexible way of representing objects and their relationships is necessary.
52. The most widely used model.
◦ Vendors: IBM, Informix, Microsoft, Oracle, Sybase, etc.
"Legacy systems" in older models
◦ e.g., IBM's IMS
Recent competitor: the object-oriented model
◦ ObjectStore, Versant, Ontos
◦ A synthesis emerging: the object-relational model (Informix Universal Server, UniSQL, O2, Oracle, DB2)
53. A database instance, or an 'instance', is made up of the background processes needed by the database software.
These include a process monitor, session monitor, lock monitor, etc. They vary from database vendor to database vendor.
54. A SCHEMA IS NOT A DATABASE, AND A DATABASE IS NOT A SCHEMA.
A database instance controls 0 or more databases.
A database contains 0 or more database application schemas.
A database application schema is a set of database objects that apply to a specific application. The objects are relational in nature and are related to each other within a database to serve a specific functionality. Examples: payroll, purchasing, calibration, trigger, etc.
A database application schema is not a database; usually several schemas coexist in a database.
A database application is the code base to manipulate and retrieve the data stored in the database application schema.
55. Table: a set of columns that contain data. In the old days, a table was called a file.
Row: a set of columns from a table reflecting a record.
Index: an object that allows for fast retrieval of table rows. Every primary key and foreign key should have an index for retrieval speed.
Primary key: often designated pk, one or more columns in a table that make a record unique.
56. Foreign key: often designated fk, a column common between two tables that defines the relationship between those two tables.
Foreign keys are either mandatory or optional. Mandatory forces a child to have a parent by creating a NOT NULL column at the child; optional allows a child to exist without a parent, via a nullable column at the child table (not a common circumstance).
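The mandatory/optional distinction can be sketched with Python's sqlite3; the parent/child table names are illustrative, and SQLite needs the foreign_keys pragma enabled before it enforces foreign keys at all.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when on
conn.execute("CREATE TABLE parent (id INTEGER PRIMARY KEY)")

# Mandatory: NOT NULL forces every child row to reference a parent.
conn.execute("""
    CREATE TABLE child_mandatory (
        id INTEGER PRIMARY KEY,
        parent_id INTEGER NOT NULL REFERENCES parent(id)
    )""")

# Optional: a nullable column lets a child exist without a parent.
conn.execute("""
    CREATE TABLE child_optional (
        id INTEGER PRIMARY KEY,
        parent_id INTEGER REFERENCES parent(id)
    )""")

conn.execute("INSERT INTO child_optional VALUES (1, NULL)")  # allowed

# A NULL parent in the mandatory child violates NOT NULL.
try:
    conn.execute("INSERT INTO child_mandatory VALUES (1, NULL)")
    mandatory_enforced = False
except sqlite3.IntegrityError:
    mandatory_enforced = True

# Even an optional FK still rejects a dangling reference to a missing parent.
try:
    conn.execute("INSERT INTO child_optional VALUES (2, 99)")
    fk_enforced = False
except sqlite3.IntegrityError:
    fk_enforced = True
```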
59. Constraints are rules residing in the database's data dictionary, governing relationships and dictating the ways records are manipulated: what is a legal move vs. what is an illegal move.
They are of the utmost importance for a secure and consistent set of data.
60. Data Manipulation Language, or DML: SQL statements that insert, update or delete data in a database.
Data Definition Language, or DDL: SQL used to create and modify database objects used in an application schema.
61. A transaction is a logical unit of work that contains one or more SQL statements.
A transaction is an atomic unit: the effects of all the SQL statements in a transaction are either all committed (applied to the database) or all rolled back (undone from the database), ensuring data consistency.
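Atomicity can be sketched with Python's sqlite3: if a failure occurs between the two halves of a transfer, rolling back restores the previous consistent state. The account table, names, and the simulated crash are illustrative assumptions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (name TEXT PRIMARY KEY, balance REAL)")
conn.execute("INSERT INTO account VALUES ('alice', 100), ('bob', 50)")
conn.commit()

def transfer(conn, amount, fail_midway=False):
    """Move `amount` from alice to bob as one logical unit of work."""
    try:
        conn.execute(
            "UPDATE account SET balance = balance - ? WHERE name = 'alice'",
            (amount,))
        if fail_midway:
            raise RuntimeError("simulated crash between the two updates")
        conn.execute(
            "UPDATE account SET balance = balance + ? WHERE name = 'bob'",
            (amount,))
        conn.commit()      # apply both updates together
    except RuntimeError:
        conn.rollback()    # undo the partial update, restoring consistency

transfer(conn, 30, fail_midway=True)   # rolled back: balances unchanged
balances = dict(conn.execute("SELECT name, balance FROM account"))
```

After the failed transfer, neither account has changed; a successful call would commit both updates or none.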
62. A view is a selective presentation of the structure of, and data in, one or more tables (or other views).
A view is a 'virtual table', having predefined columns and joins to one or more tables, reflecting a specific facet of information.
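A minimal view sketch in Python's sqlite3; the product table echoes the Products example later in the deck, while the view name and price cut-off are arbitrary assumptions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product (name TEXT, price REAL, category TEXT)")
conn.executemany("INSERT INTO product VALUES (?, ?, ?)", [
    ("gizmo", 19.99, "gadgets"),
    ("SingleTouch", 149.99, "photography"),
])

# The view is a stored query presented as a virtual table: it has
# predefined columns and reflects one facet (inexpensive products).
conn.execute("""
    CREATE VIEW cheap_products AS
    SELECT name, price FROM product WHERE price < 100
""")

# Querying the view looks exactly like querying a table.
rows = conn.execute("SELECT name FROM cheap_products").fetchall()
```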
63. Database triggers are PL/SQL, Java, or C procedures that run implicitly whenever a table or view is modified or when some user actions or database system actions occur.
Database triggers can be used in a variety of ways for managing your database. For example, they can automate data generation, audit data modifications, enforce complex integrity constraints, and customize complex security authorizations.
Trigger methodology differs between databases.
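As noted, trigger syntax differs between databases; the sketch below uses SQLite's dialect (via Python's sqlite3) to show the auditing use mentioned above. The table and trigger names are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (emp_no INTEGER, name TEXT)")
conn.execute("CREATE TABLE audit_log (action TEXT, emp_no INTEGER)")

# The trigger runs implicitly after every insert on employee, writing an
# audit row; NEW refers to the row being inserted.
conn.execute("""
    CREATE TRIGGER employee_insert_audit
    AFTER INSERT ON employee
    BEGIN
        INSERT INTO audit_log VALUES ('insert', NEW.emp_no);
    END
""")

conn.execute("INSERT INTO employee VALUES (100, 'Ahmad')")
log = conn.execute("SELECT action, emp_no FROM audit_log").fetchall()
```

The application never touched audit_log directly; the trigger produced the audit row as a side effect of the insert.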
64. Table (relation) name: Products

Name        | Price   | Category    | Manufacturer
gizmo       | $19.99  | gadgets     | GizmoWorks
Power gizmo | $29.99  | gadgets     | GizmoWorks
SingleTouch | $149.99 | photography | Canon
MultiTouch  | $203.99 | household   | Hitachi

The column headings are attribute names; each row is a tuple (also called a row or record).
66. A relation (file, table) is a two-dimensional table.
An attribute (i.e. field or data item) is a column in the table.
Each column in the table has a unique name within that table.
Each column is homogeneous: the entries in any column are all of the same type (e.g. age, name, employee-number, etc.).
Each column has a domain, the set of possible values that can appear in that column.
A tuple (i.e. record) is a row in the table.
67. The order of the rows and columns is not important.
The values of a row all relate to some thing or portion of a thing.
Repeating groups (collections of logically related attributes that occur multiple times within one record occurrence) are not allowed.
Duplicate rows are not allowed (candidate keys are designed to prevent this).
Cells must be single-valued (but can be variable length). Single-valued means the following:
◦ A cell cannot contain multiple values such as 'A1,B2,C3'.
◦ A cell cannot contain combined values such as 'ABC-XYZ', where 'ABC' means one thing and 'XYZ' another.
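The single-valued-cell rule is why a value list like 'A1,B2,C3' is split into one row per value in a separate table, rather than crammed into one cell. A minimal sqlite3 sketch; the student/enrollment names and sample values are illustrative assumptions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT)")

# Instead of a repeating group (one cell holding 'A1,B2,C3'), each value
# gets its own row; the composite primary key also forbids duplicate rows.
conn.execute("""
    CREATE TABLE enrollment (
        student_id INTEGER REFERENCES student(id),
        course     TEXT,
        PRIMARY KEY (student_id, course)
    )""")

conn.execute("INSERT INTO student VALUES (1, 'Ahmad')")
conn.executemany("INSERT INTO enrollment VALUES (1, ?)",
                 [("A1",), ("B2",), ("C3",)])

courses = [c for (c,) in conn.execute(
    "SELECT course FROM enrollment WHERE student_id = 1 ORDER BY course")]
```

Every cell now holds a single value, and the multi-valued fact is recovered with an ordinary query instead of string parsing.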