The document discusses algorithms and their analysis. It begins by defining an algorithm and key aspects like correctness, input, and output. It then discusses two aspects of algorithm performance - time and space. Examples are provided to illustrate how to analyze the time complexity of different structures like if/else statements, simple loops, and nested loops. Big O notation is introduced to describe an algorithm's growth rate. Common time complexities like constant, linear, quadratic, and cubic functions are defined. Specific sorting algorithms like insertion sort, selection sort, bubble sort, merge sort, and quicksort are then covered in detail with examples of how they work and their time complexities.
while some elements unsorted:
Using linear search, find the location in the sorted portion where the 1st element of the unsorted portion should be inserted
Insert it there, shifting the later elements up one position
5. There are two aspects of algorithmic
performance:
Time
Space
6. First, we start to count the number of basic
operations in a particular solution to assess its
efficiency.
Then, we will express the efficiency of algorithms
using growth functions.
7. We measure an algorithm’s time requirement
as a function of the problem size.
The most important thing to learn is how
quickly the algorithm’s time requirement
grows as a function of the problem size.
An algorithm’s proportional time requirement
is known as growth rate.
We can compare the efficiency of two
algorithms by comparing their growth rates.
8. Each operation in an algorithm (or a program) has a
cost.
Each operation takes a certain amount of time.
count = count + 1; takes a certain amount of time, but it is
constant
A sequence of operations:
count = count + 1; Cost: c1
sum = sum + count; Cost: c2
Total Cost = c1 + c2
10. Example: Simple Loop
Cost Times
i = 1; c1 1
sum = 0; c2 1
while (i <= n) { c3 n+1
i = i + 1; c4 n
sum = sum + i; c5 n
}
Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*c5
The time required for this algorithm is proportional
to n
11. Example: Nested Loop
Cost Times
i=1; c1 1
sum = 0; c2 1
while (i <= n) { c3 n+1
j=1; c4 n
while (j <= n) { c5 n*(n+1)
sum = sum + i; c6 n*n
j = j + 1; c7 n*n
}
i = i +1; c8 n
}
Total Cost = c1 + c2 + (n+1)*c3 + n*c4 +
n*(n+1)*c5+n*n*c6+n*n*c7+n*c8
The time required for this algorithm is proportional to n²
13. Informal definitions:
◦ Given a complexity function f(n),
◦ O(f(n)) is the set of complexity functions that are
upper bounds on f(n)
◦ Ω(f(n)) is the set of complexity functions that are
lower bounds on f(n)
◦ Θ(f(n)) is the set of complexity functions that,
given the correct constants, correctly describe f(n)
Example: If f(n) = 17n³ + 4n − 12, then
◦ O(f(n)) contains n³, n⁴, n⁵, 2ⁿ, etc.
◦ Ω(f(n)) contains 1, n, n², n³, log n, n log n, etc.
◦ Θ(f(n)) contains n³
15. Example: Simple Loop
Cost Times
i = 1; c1 1
sum = 0; c2 1
while (i <= n) { c3 n+1
i = i + 1; c4 n
sum = sum + i; c5 n
}
Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*c5
The time required for this algorithm is proportional
to n
O(n)
16. Example: Nested Loop
Cost Times
i=1; c1 1
sum = 0; c2 1
while (i <= n) { c3 n+1
j=1; c4 n
while (j <= n) { c5 n*(n+1)
sum = sum + i; c6 n*n
j = j + 1; c7 n*n
}
i = i +1; c8 n
}
Total Cost = c1 + c2 + (n+1)*c3 + n*c4 +
n*(n+1)*c5+n*n*c6+n*n*c7+n*c8
The time required for this algorithm is proportional to n²
O(n²)
17. Function    Growth Rate Name
    c           Constant
    log N       Logarithmic
    log² N      Log-squared
    N           Linear
    N log N     Linearithmic
    N²          Quadratic
    N³          Cubic
    2ᴺ          Exponential
20. Input:
◦ A sequence of n numbers a₁, a₂, . . . , aₙ
Output:
◦ A permutation (reordering) a₁′, a₂′, . . . , aₙ′ of the
input sequence such that a₁′ ≤ a₂′ ≤ · · · ≤ aₙ′
21. In-Place Sort
◦ The amount of extra space required to sort the data
is constant with the input size.
22. Sorted on a first key:
Sort the file on a second key:
Records with key value
3 are no longer in order on
the first key!
Stable sort
◦ preserves the relative order of records with equal keys
23. Idea: like sorting a hand of playing cards
◦ Start with an empty left hand and the cards facing
down on the table.
◦ Remove one card at a time from the table, and
insert it into the correct position in the left hand
◦ The cards held in the left hand are sorted
24. To insert 12, we need to
make room for it by moving
first 36 and then 24.
28. static void insertionSort(int[] a) {
      for (int i = 1; i < a.length; i++) {
        int key = a[i];   // next element of the unsorted portion
        int pos = i;
        // shift elements greater than key one slot to the right
        while (pos > 0 && a[pos - 1] > key) {
          a[pos] = a[pos - 1];
          pos--;
        }
        a[pos] = key;     // drop key into its sorted position
      }
    }
29. O(n²), stable, in-place
O(1) extra space
Great with a small number of elements
30. Algorithm:
◦ Find the minimum value
◦ Swap it with the value in the 1st position
◦ Repeat from the 2nd position down
O(n²), in-place; note that the standard array version is
not stable, since the long-range swap can reorder equal keys
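The steps above can be sketched in Java as follows (a minimal sketch; the class and method names are mine, not from the slides):

```java
import java.util.Arrays;

public class SelectionSortDemo {
    static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            // find the index of the minimum of the unsorted suffix a[i..]
            int min = i;
            for (int j = i + 1; j < a.length; j++) {
                if (a[j] < a[min]) min = j;
            }
            // swap it into position i; note this long-range swap is
            // what makes the array version unstable for equal keys
            int tmp = a[i]; a[i] = a[min]; a[min] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] a = {29, 10, 14, 37, 13};
        selectionSort(a);
        System.out.println(Arrays.toString(a)); // [10, 13, 14, 29, 37]
    }
}
```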
31. Algorithm
◦ Traverse the collection
◦ “Bubble” the largest value to the end using pairwise
comparisons and swapping
O(n²), stable, in-place
Totally useless?
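A minimal Java sketch of the bubbling step (names are mine). The early-exit flag is a common extension not shown on the slide: it answers "totally useless?" by making bubble sort O(n) on already-sorted input.

```java
import java.util.Arrays;

public class BubbleSortDemo {
    static void bubbleSort(int[] a) {
        for (int end = a.length - 1; end > 0; end--) {
            boolean swapped = false;
            // pairwise compare-and-swap bubbles the largest value to index `end`
            for (int j = 0; j < end; j++) {
                if (a[j] > a[j + 1]) {
                    int tmp = a[j]; a[j] = a[j + 1]; a[j + 1] = tmp;
                    swapped = true;
                }
            }
            if (!swapped) break; // no swaps: the array is already sorted
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 1, 4, 2, 8};
        bubbleSort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 4, 5, 8]
    }
}
```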
32. 1. Divide: split the array in two
halves
2. Conquer: Sort recursively both
subarrays
3. Combine: merge the two sorted
subarrays into a sorted array
34. The key to Merge Sort is merging two sorted
lists into one, such that if you have two lists
X = (x₁ x₂ … xₘ) and Y = (y₁ y₂ … yₙ), the
resulting list is Z = (z₁ z₂ … zₘ₊ₙ)
Example:
L1 = { 3 8 9 } L2 = { 1 5 7 }
merge(L1, L2) = { 1 3 5 7 8 9 }
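The merging step can be sketched in Java (a minimal sketch; class and method names are mine):

```java
import java.util.Arrays;

public class MergeDemo {
    // Merge two sorted arrays x and y into one sorted array z.
    static int[] merge(int[] x, int[] y) {
        int[] z = new int[x.length + y.length];
        int i = 0, j = 0, k = 0;
        // repeatedly take the smaller front element of the two lists
        while (i < x.length && j < y.length) {
            z[k++] = (x[i] <= y[j]) ? x[i++] : y[j++]; // <= keeps the merge stable
        }
        while (i < x.length) z[k++] = x[i++]; // copy any leftovers from x
        while (j < y.length) z[k++] = y[j++]; // ... or from y
        return z;
    }

    public static void main(String[] args) {
        int[] l1 = {3, 8, 9}, l2 = {1, 5, 7}; // the slide's example
        System.out.println(Arrays.toString(merge(l1, l2))); // [1, 3, 5, 7, 8, 9]
    }
}
```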
54. Merge Sort runs in O(N log N) in all cases, because of
its divide-and-conquer approach:
T(N) = 2T(N/2) + N = O(N log N)
55. 1. Select: pick an element x
2. Divide: rearrange elements so
that x goes to its final position
• L elements less than x
• G elements greater than or equal
to x
3. Conquer: sort recursively L and G
[diagram: pivot x moves to its final position, with partition L to its left and G to its right]
58. Use the first element as pivot
◦ if the input is random, ok
◦ if the input is presorted? - shuffle in advance
Choose the pivot randomly
◦ generally safe
◦ random numbers generation can be expensive
59. Use the median of the array
◦ Partitioning always cuts the array into half
◦ An optimal quicksort (O(n log n))
◦ hard to find the exact median (chicken-egg?)
◦ Approximation to the exact median..
Median of three
◦ Compare just three elements: the leftmost, the
rightmost and the center
◦ Use the middle of the three as pivot
60. Given a pivot, partition the elements of the
array such that the resulting array consists of:
◦ One subarray that contains elements < pivot
◦ One subarray that contains elements >= pivot
The subarrays are stored in the original array
69. pivot_index = 0
40 20 10 30 60 50 7 80 100
[0] [1] [2] [3] [4] [5] [6] [7] [8]
(too_big_index scans right from the left end; too_small_index scans left from the right end)
1. while a[too_big_index] <= a[pivot_index]
   ++too_big_index
2. while a[too_small_index] > a[pivot_index]
   --too_small_index
3. if too_big_index < too_small_index
   swap a[too_big_index], a[too_small_index]
4. while too_small_index > too_big_index, go to 1.
75. After steps 1–3 swap 60 and 7:
40 20 10 30 7 50 60 80 100    pivot_index = 0
[0] [1] [2] [3] [4] [5] [6] [7] [8]
84. When too_small_index and too_big_index cross, a final step places the pivot:
5. swap a[too_small_index], a[pivot_index]
40 20 10 30 7 50 60 80 100    pivot_index = 0
[0] [1] [2] [3] [4] [5] [6] [7] [8]
85. 7 20 10 30 40 50 60 80 100    pivot_index = 4
[0] [1] [2] [3] [4] [5] [6] [7] [8]
The pivot (40) is now in its final position: everything to its
left is smaller, everything to its right is greater than or equal.
86. Running time
◦ pivot selection: constant time, i.e. O(1)
◦ partitioning: linear time, i.e. O(N)
◦ running time of the two recursive calls
T(N)=T(i)+T(N-i-1)+cN where c is a
constant
◦ i: number of elements in L
87. What will be the worst case?
◦ The pivot is the smallest element, all the time
◦ Partition is always unbalanced
88. What will be the best case?
◦ Partition is perfectly balanced.
◦ Pivot is always in the middle (median of the array)
89. Java API provides a class Arrays with several
overloaded sort methods for different array
types
Class Collections provides similar sorting
methods
90. Arrays methods:
public static void sort (int[] a)
public static void sort (Object[] a)
// requires Comparable
public static <T> void sort (T[] a,
Comparator<? super T> comp)
// uses given Comparator
91. Collections methods:
public static <T extends Comparable<T>>
void sort (List<T> list)
public static <T> void sort (List<T> l,
Comparator<? super T> comp)
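A short usage sketch of these library methods (the values and variable names are arbitrary examples of mine):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

public class SortApiDemo {
    public static void main(String[] args) {
        // Arrays.sort on a primitive array
        int[] nums = {5, 2, 8, 1};
        Arrays.sort(nums);
        System.out.println(Arrays.toString(nums)); // [1, 2, 5, 8]

        // Collections.sort with an explicit Comparator (here: descending order)
        List<String> words = new ArrayList<>(Arrays.asList("pear", "apple", "fig"));
        Collections.sort(words, Comparator.reverseOrder());
        System.out.println(words); // [pear, fig, apple]

        // Collections.sort using the elements' natural (Comparable) order
        Collections.sort(words);
        System.out.println(words); // [apple, fig, pear]
    }
}
```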
93. Given the collection and an element to
find…
Determine whether the “target”
element was found in the collection
◦ Print a message
◦ Return a value
(an index or pointer, etc.)
Don’t modify the collection in the
search!
94. A search traverses the collection until
◦ the desired element is found
◦ or the collection is exhausted
95. static int linearSearch(int[] a, int key) {
      for (int i = 0; i < a.length; i++) {
        if (a[i] == key) return i;  // found: return its index
      }
      return -1;                    // collection exhausted: not found
    }
127. Set
◦ The familiar set abstraction.
◦ No duplicates; May or may not be ordered.
List
◦ Ordered collection, also known as a sequence.
◦ Duplicates permitted; Allows positional access
Map
◦ A mapping from keys to values.
◦ Each key can map to at most one value (function).
128. Set            List        Map
     HashSet        ArrayList   HashMap
     LinkedHashSet  LinkedList  LinkedHashMap
     TreeSet        Vector      Hashtable
                                TreeMap
129. Ordered
◦ Elements are stored and accessed in a specific
order
Sorted
◦ Elements are stored and accessed in a sorted
order
Indexed
◦ Elements can be accessed using an index
Unique
◦ Collection does not allow duplicates
130. A linked list is a series of connected nodes
Each node contains at least
◦ A piece of data (any type)
◦ Pointer to the next node in the list
Head: pointer to the first node
The last node points to NULL
[diagram: Head → A → B → C → NULL; each node holds a data field and a pointer to the next node]
135. Operation                Complexity
     insert at beginning      O(1)
     insert at end            O(1)
     insert at index          O(n)
     delete at beginning      O(1)
     delete at end            O(1)
     delete at index          O(n)
     find element             O(n)
     access element by index  O(n)
139. Operation                Complexity
     insert at beginning      O(n)
     insert at end            O(1) amortized
     insert at index          O(n)
     delete at beginning      O(n)
     delete at end            O(1)
     delete at index          O(n)
     find element             O(n)
     access element by index  O(1)
140. Some collections are constrained so clients
can only use optimized operations
◦ stack: retrieves elements in reverse order as added
◦ queue: retrieves elements in same order as added
[diagram: a stack with 3 on top and 1 at the bottom, where push, pop and peek act on the top; a queue holding 1 2 3, with add at the back and remove/peek at the front]
141. stack: A collection based on the principle of
adding elements and retrieving them in the
opposite order.
basic stack operations:
◦ push: Add an element to the top.
◦ pop: Remove the top element.
◦ peek: Examine the top element.
[diagram: stack with 3 on top and 1 at the bottom; push, pop and peek all act on the top]
142. Programming languages and compilers:
◦ method call stack
Matching up related pairs of things:
◦ check correctness of brackets (){}[]
Sophisticated algorithms:
◦ undo stack
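The bracket-matching application can be sketched with java.util.Deque playing the stack (a minimal sketch; class and method names are mine):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class BracketCheck {
    // Push every opening bracket; on a closing bracket, pop and check it matches.
    static boolean balanced(String s) {
        Deque<Character> stack = new ArrayDeque<>();
        for (char c : s.toCharArray()) {
            switch (c) {
                case '(': case '[': case '{':
                    stack.push(c);
                    break;
                case ')': case ']': case '}':
                    if (stack.isEmpty()) return false; // close with nothing open
                    char open = stack.pop();
                    if ((c == ')' && open != '(') ||
                        (c == ']' && open != '[') ||
                        (c == '}' && open != '{')) return false; // wrong pair
                    break;
            }
        }
        return stack.isEmpty(); // anything left open is unbalanced
    }

    public static void main(String[] args) {
        System.out.println(balanced("a[b{c}](d)")); // true
        System.out.println(balanced("([)]"));       // false
    }
}
```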
143. queue: Retrieves elements in the order they
were added.
basic queue operations:
◦ add (enqueue): Add an element to the back.
◦ remove (dequeue): Remove the front element.
◦ peek: Examine the front element.
[diagram: queue holding 1 2 3; add at the back, remove and peek at the front]
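A brief sketch of these queue operations using java.util.ArrayDeque behind the Queue interface, whose add, remove and peek match the operations above (values are arbitrary):

```java
import java.util.ArrayDeque;
import java.util.Queue;

public class QueueDemo {
    public static void main(String[] args) {
        Queue<Integer> q = new ArrayDeque<>();
        q.add(1); // enqueue at the back
        q.add(2);
        q.add(3);
        System.out.println(q.peek());   // 1  (examine the front, not removed)
        System.out.println(q.remove()); // 1  (dequeue from the front)
        System.out.println(q);          // [2, 3]  (FIFO order preserved)
    }
}
```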
144. Operating systems:
◦ queue of print jobs to send to the printer
Programming:
◦ modeling a line of customers or clients
Real world examples:
◦ people on an escalator or waiting in a line
◦ cars at a gas station
145. A data structure optimized for a very
specific kind of search / access
In a map we access by asking “give me the
value associated with this key.”
Tuning parameters: capacity, load factor
A -> 65
149. What to do when inserting an element and
something is already present at that position?
150. Could search forward or backward for an
open space
Linear probing
◦ move forward 1 spot. Open? If not, 2 spots, 3 spots, …
Quadratic probing
◦ 1 spot, 4 spots, 9 spots, 16 spots (the i-th probe moves i² spots)
Resize when the load factor reaches some limit
151. Each element of the hash table can be another data
structure
◦ LinkedList
◦ Balanced binary tree
Resize at a given load factor or when any chain
reaches some limit
152. TreeMap implements Map
Sorted
Easy access to the smallest and biggest keys
Logarithmic put, get
Requires keys to be Comparable, or a Comparator
153. 0, 1, or 2 children per node
Binary Search Tree
◦ node.left < node.value
◦ node.right >= node.value
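A short sketch of TreeMap in use; its sorted order and logarithmic put/get come from a balanced binary search tree like the one just described (keys and values here are arbitrary examples of mine):

```java
import java.util.TreeMap;

public class TreeMapDemo {
    public static void main(String[] args) {
        TreeMap<String, Integer> ages = new TreeMap<>(); // keys kept sorted
        ages.put("carol", 31);
        ages.put("alice", 25);
        ages.put("bob", 28);
        System.out.println(ages.firstKey()); // alice  (smallest key)
        System.out.println(ages.lastKey());  // carol  (biggest key: easy access)
        System.out.println(ages.keySet());   // [alice, bob, carol]  (sorted order)
        System.out.println(ages.get("bob")); // 28  (logarithmic lookup)
    }
}
```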
154. A priority queue stores a collection of entries
Main methods of the Priority Queue ADT
◦ insert(k, x)
inserts an entry with key k and value x
◦ removeMin()
removes and returns the entry with smallest key
155. A heap can be seen as a complete binary tree (levels listed top to bottom):
16
14 10
8 7 9 3
2 4 1
156. [diagram: the same tree with placeholder nodes filling out the bottom level, showing it is a complete binary tree]
157. In practice, heaps are usually implemented as
arrays:
index:  1  2  3  4  5  6  7  8  9 10
value: 16 14 10  8  7  9  3  2  4  1
158. To represent a complete binary tree as an
array:
◦ The root node is A[1]
◦ Node i is A[i]
◦ The parent of node i is A[i/2] (note: integer divide)
◦ The left child of node i is A[2i]
◦ The right child of node i is A[2i + 1]
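The index rules above can be checked with a short Java sketch (class and method names are mine; slot 0 is left unused to keep the 1-based arithmetic from the slide):

```java
public class HeapIndexDemo {
    // Verify the max-heap property using the slide's 1-based index rules:
    // parent(i) = i/2 (integer divide), left(i) = 2i, right(i) = 2i + 1.
    static boolean isMaxHeap(int[] a) { // heap occupies a[1..a.length-1]
        for (int i = 2; i < a.length; i++) {
            if (a[i] > a[i / 2]) return false; // a child must not exceed its parent
        }
        return true;
    }

    public static void main(String[] args) {
        // the slide's heap, with a placeholder in the unused slot 0
        int[] heap = {0, 16, 14, 10, 8, 7, 9, 3, 2, 4, 1};
        System.out.println(isMaxHeap(heap)); // true
    }
}
```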
168. java.util.Collections
java.util.Arrays exports similar basic operations for an
array.
binarySearch(list, key)   Finds key in a sorted list using binary search.
sort(list)                Sorts a list into ascending order.
min(list)                 Returns the smallest value in a list.
max(list)                 Returns the largest value in a list.
reverse(list)             Reverses the order of elements in a list.
shuffle(list)             Randomly rearranges the elements in a list.
swap(list, p1, p2)        Exchanges the elements at index positions p1 and p2.
replaceAll(list, x1, x2)  Replaces all elements matching x1 with x2.