This document discusses various algorithms for sorting data, including:
- Bubble sort, which works by comparing and swapping adjacent elements until the list is fully sorted. Both regular bubble sort and a version with a sentinel are described.
- Bidirectional bubble sort (cocktail sort), which alternates passes in each direction so that small elements move toward the front and large elements toward the back in the same sweep, avoiding the slow migration of out-of-place elements when the data is almost sorted.
The document provides pseudocode examples and discusses the time complexity of different sorting algorithms, including simple algorithms like bubble sort and more sophisticated approaches. It aims to classify and explain sorting techniques.
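The pass-and-swap mechanism described above can be sketched as follows; the early-exit flag plays the role of the "sentinel" variant, stopping as soon as a full pass makes no swaps:

```python
def bubble_sort(a):
    """Bubble sort with an early-exit flag (the 'sentinel' variant)."""
    a = list(a)
    n = len(a)
    for i in range(n - 1):
        swapped = False
        # each pass bubbles the largest remaining element to the end
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:   # no swaps means the list is already sorted
            break
    return a
```

On nearly sorted input the flag lets the algorithm finish in close to one pass, which is the point of the sentinel optimisation.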
This document discusses insertion sort, including its mechanism, algorithm, runtime analysis, advantages, and disadvantages. Insertion sort works by iterating through an unsorted array and inserting each element into its sorted position by shifting other elements over. Its worst case runtime is O(n^2) when the array is reverse sorted, but it performs well on small, nearly sorted lists. While simple to implement, insertion sort is inefficient for large datasets compared to other algorithms.
From the perspective of Design and Analysis of Algorithms. I made these slides by collecting material from many sites.
I am Danish Javed, a student of BSCS (Hons.) at ITU (Information Technology University), Lahore, Punjab, Pakistan.
It is a presentation on some Searching and Sorting Techniques for Computer Science.
It consists of the following techniques:
Sequential Search
Binary Search
Selection Sort
Bubble Sort
Insertion Sort
MCA II DFS U-1: Introduction to Data Structure (Rai University)
This document provides an introduction to data structures. It defines data structures as a way of organizing and storing data in a computer so that it can be used efficiently. The document discusses different types of data structures including primitive, non-primitive, linear and non-linear structures. It provides examples of various data structures like arrays, linked lists, stacks, queues and trees. It also covers important concepts like time complexity, space complexity and Big O notation for analyzing algorithms. Common operations on data structures like search, insert and delete are also explained.
Binary search is a fast search algorithm that works on sorted data by comparing the middle element of the collection to the target value. It divides the search space in half at each step to quickly locate an element. The algorithm gets the middle element, compares it to the target, and either searches the left or right half recursively depending on if the target is less than or greater than the middle element. An example demonstrates finding the value 23 in a sorted array using this divide and conquer approach.
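The halving step described above can be written iteratively; the example mirrors the summary's search for the value 23 in a sorted array:

```python
def binary_search(a, target):
    """Return the index of target in sorted list a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1    # target is in the right half
        else:
            hi = mid - 1    # target is in the left half
    return -1

data = [2, 5, 8, 12, 16, 23, 38, 56, 72, 91]
binary_search(data, 23)   # index 5
```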
This document discusses implementing stacks and queues using linked lists. For a stack, elements are inserted and removed from the head (start) of the linked list for constant time operations. For a queue, elements are inserted at the head and removed from the tail (end) of the linked list, requiring traversing to the second last node for removal. Implementing stacks and queues with linked lists avoids size limitations of arrays and uses dynamic memory allocation.
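A minimal sketch of the linked-list layout the summary describes (stack operations at the head; queue insertion at the head and removal at the tail, which requires the walk to the second-last node and so costs O(n)):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

class Stack:
    """Push and pop at the head: both O(1)."""
    def __init__(self):
        self.head = None
    def push(self, value):
        node = Node(value)
        node.next = self.head
        self.head = node
    def pop(self):
        node = self.head
        self.head = node.next
        return node.value

class Queue:
    """Enqueue at the head; dequeue walks to the second-last node, so it is O(n)."""
    def __init__(self):
        self.head = None
    def enqueue(self, value):
        node = Node(value)
        node.next = self.head
        self.head = node
    def dequeue(self):
        if self.head.next is None:       # single element
            value = self.head.value
            self.head = None
            return value
        cur = self.head
        while cur.next.next is not None: # stop at the second-last node
            cur = cur.next
        value = cur.next.value           # tail holds the oldest element
        cur.next = None
        return value
```

Keeping a separate tail pointer would make dequeue O(1) as well; the traversal version above follows the layout the summary describes.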
The document presents an overview of selection sort, including its definition, algorithm, example, advantages, and disadvantages. Selection sort works by iteratively finding the minimum element in an unsorted sublist and exchanging it with the first element. It has a time complexity of O(n^2) but performs well on small lists since it is an in-place sorting algorithm with minimal additional storage requirements. However, it is not efficient for huge datasets due to its quadratic time complexity.
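The find-minimum-and-swap step can be sketched as:

```python
def selection_sort(a):
    """Repeatedly select the minimum of the unsorted suffix and swap it into place."""
    a = list(a)
    n = len(a)
    for i in range(n - 1):
        min_idx = i
        # scan the unsorted suffix a[i:] for its minimum
        for j in range(i + 1, n):
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]   # in-place: only one swap per pass
    return a
```

The single swap per pass is what keeps the extra storage minimal, at the cost of always scanning the full suffix even when the data is sorted.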
The document discusses different types of queues including their representations, operations, and applications. It describes queues as linear data structures that follow a first-in, first-out principle. Common queue operations are insertion at the rear and deletion at the front. Queues can be represented using arrays or linked lists. Circular queues and priority queues are also described as variants that address limitations of standard queues. Real-world and technical applications of queues include CPU scheduling, cashier lines, and data transfer between processes.
Insertion sort works by iterating through an array, inserting each element into its sorted position by shifting other elements over. It finds the location where each element should be inserted into the sorted portion using a linear search, moving larger elements out of the way to make room. This sorting algorithm is most effective for small data sets and can be implemented recursively or iteratively through comparisons and shifts.
The document discusses bubble sort, a simple sorting algorithm where each pair of adjacent elements is compared and swapped if out of order. It gets its name because elements "bubble" to their correct positions like bubbles rising in a glass of soda. The algorithm makes multiple passes through the list, swapping elements on each pass until the list is fully sorted. While simple to implement, bubble sort has a slow running time of O(n^2), making it inefficient for large data sets.
The document discusses insertion sort, a simple sorting algorithm that builds a sorted output list from an input one element at a time. It is less efficient on large lists than more advanced algorithms. Insertion sort iterates through the input, at each step removing an element and inserting it into the correct position in the sorted output list. The best case for insertion sort is an already sorted array, while the worst is a reverse sorted array.
Queues
a. Concept and Definition
b. Queue as an ADT
c. Implementation of Insert and Delete operation of:
• Linear Queue
• Circular Queue
For More:
https://github.com/ashim888/dataStructureAndAlgorithm
http://www.ashimlamichhane.com.np/
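The insert and delete operations outlined above can be sketched for the circular variant; this is an array-backed sketch with illustrative names, where front and rear wrap around using modulo arithmetic:

```python
class CircularQueue:
    """Array-backed circular queue; indices wrap with modulo to reuse freed slots."""
    def __init__(self, capacity):
        self.data = [None] * capacity
        self.capacity = capacity
        self.front = 0
        self.size = 0

    def insert(self, value):
        if self.size == self.capacity:
            raise OverflowError("queue is full")
        rear = (self.front + self.size) % self.capacity
        self.data[rear] = value
        self.size += 1

    def delete(self):
        if self.size == 0:
            raise IndexError("queue is empty")
        value = self.data[self.front]
        self.front = (self.front + 1) % self.capacity
        self.size -= 1
        return value
```

The wrap-around is what fixes the limitation of a linear array queue, where slots freed at the front could never be reused.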
This document discusses various sorting algorithms and their complexities. It begins by defining an algorithm and complexity measures like time and space complexity. It then defines sorting and common sorting algorithms like bubble sort, selection sort, insertion sort, quicksort, and mergesort. For each algorithm, it provides a high-level overview of the approach and time complexity. It also covers sorting algorithm concepts like stable and unstable sorting. The document concludes by discussing future directions for sorting algorithms and their applications.
The document discusses the divide and conquer algorithm design technique. It begins by explaining the basic approach of divide and conquer which is to (1) divide the problem into subproblems, (2) conquer the subproblems by solving them recursively, and (3) combine the solutions to the subproblems into a solution for the original problem. It then provides merge sort as a specific example of a divide and conquer algorithm for sorting a sequence. It explains that merge sort divides the sequence in half recursively until individual elements remain, then combines the sorted halves back together to produce the fully sorted sequence.
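The three-step divide/conquer/combine structure maps directly onto merge sort:

```python
def merge_sort(a):
    """Divide in half, sort each half recursively, merge the sorted halves."""
    if len(a) <= 1:                  # base case: a single element is sorted
        return list(a)
    mid = len(a) // 2
    left = merge_sort(a[:mid])       # divide + conquer
    right = merge_sort(a[mid:])
    # combine: merge two sorted halves into one sorted list
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Each level of recursion does O(n) merging work across O(log n) levels, which is where the O(n log n) runtime comes from.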
The document describes insertion sort, a sorting algorithm. It lists the group members who researched insertion sort and provides an introduction. It then explains how insertion sort works by example, showing how it iterates through an array and inserts elements into the sorted portion. Pseudocode and analysis of insertion sort's runtime is provided. Comparisons are made between insertion sort and other algorithms like bubble sort, selection sort, and merge sort, analyzing their time complexities in best, average, and worst cases.
This document discusses priority queues. It defines a priority queue as a queue where insertion and deletion are based on some priority property. Items with higher priority are removed before lower priority items. There are two main types: ascending priority queues remove the smallest item, while descending priority queues remove the largest item. Priority queues are useful for scheduling jobs in operating systems, where real-time jobs have highest priority and are scheduled first. They are also used in network communication to manage limited bandwidth.
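An ascending priority queue (smallest item removed first, as in the summary) is a thin wrapper over Python's binary-heap module; the job names here are illustrative, not from the original slides:

```python
import heapq

class AscendingPriorityQueue:
    """Ascending priority queue: delete removes the item with the smallest priority."""
    def __init__(self):
        self._heap = []

    def insert(self, priority, item):
        heapq.heappush(self._heap, (priority, item))

    def delete(self):
        return heapq.heappop(self._heap)[1]

pq = AscendingPriorityQueue()
pq.insert(2, "batch job")
pq.insert(0, "real-time job")
pq.insert(1, "interactive job")
pq.delete()   # "real-time job" comes out first
```

A descending priority queue can reuse the same structure by negating priorities on insert.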
Quicksort is a sorting algorithm that works by partitioning an array around a pivot value, and then recursively sorting the sub-partitions. It chooses a pivot element and partitions the array based on whether elements are less than or greater than the pivot. Elements are swapped so that those less than the pivot are moved left and those greater are moved right. The process recursively partitions the sub-arrays until the entire array is sorted.
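The swap-based partitioning described above can be sketched with a last-element pivot (one common choice; the Lomuto scheme):

```python
def partition(a, lo, hi):
    """Partition a[lo..hi] around a[hi]; return the pivot's final index."""
    pivot = a[hi]
    i = lo - 1
    for j in range(lo, hi):
        if a[j] <= pivot:            # swap elements <= pivot to the left side
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[hi] = a[hi], a[i + 1]   # place the pivot between the two sides
    return i + 1

def quicksort(a, lo=0, hi=None):
    """Sort a copy at the top level, then recurse on the sub-partitions in place."""
    if hi is None:
        a = list(a)
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quicksort(a, lo, p - 1)      # recurse on elements left of the pivot
        quicksort(a, p + 1, hi)      # recurse on elements right of the pivot
    return a
```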
The document discusses different sorting algorithms including merge sort and quicksort. Merge sort has a divide and conquer approach where an array is divided into halves and the halves are merged back together in sorted order. This results in a runtime of O(n log n). Quicksort uses a partitioning approach, choosing a pivot element and partitioning the array into subarrays of elements less than or greater than the pivot. In the best case, this partitions the array in half at each step, resulting in a runtime of O(n log n). In the average case, the runtime is also O(n log n). In the worst case (for example, an already sorted array with a first- or last-element pivot), the partitions are maximally unbalanced, resulting in a quadratic runtime of O(n^2).
The document discusses parallel algorithms and parallel computing. It begins by defining parallelism in computers as performing more than one task at the same time. Examples of parallelism include I/O chips and pipelining of instructions. Common terms for parallelism are defined, including concurrent processing, distributed processing, and parallel processing. Issues in parallel programming such as task decomposition and synchronization are outlined. Performance issues like scalability and load balancing are also discussed. Different types of parallel machines and their classification are described.
The document discusses sorting algorithms. It begins by defining sorting as arranging data in logical order based on a key. It then discusses internal and external sorting methods. For internal sorting, all data fits in memory, while external sorting handles data too large for memory. The document covers stability, efficiency, and time complexity of various sorting algorithms like bubble sort, selection sort, insertion sort, and merge sort. Merge sort uses a divide-and-conquer approach to sort arrays with a time complexity of O(n log n).
Binary Search - Design & Analysis of Algorithms (Drishti Bhalla)
Binary search is an efficient algorithm for finding a target value within a sorted array. It works by repeatedly dividing the search range in half and checking the value at the midpoint. This eliminates about half of the remaining candidates in each step. The maximum number of comparisons needed is about log2(n), where n is the number of elements. This makes binary search faster than linear search, which requires checking every element. The algorithm works by first finding the middle element, then checking if it matches the target. If not, it recursively searches either the lower or upper half depending on whether the target is less than or greater than the middle element.
Selection sort is a sorting algorithm that finds the smallest element in an unsorted list and swaps it with the first element, then finds the next smallest element and swaps it with the second element, continuing in this way until the list is fully sorted. It works by iterating through the list, finding the minimum element, and swapping it into its correct place at each step.
The document describes insertion sort, including an example with figures, the algorithm, implementations in Java, C++ and Python, runtime performance of Θ(n^2) in the average and worst cases but Θ(n) in the best case, an example execution on sample input data, and some other notes about its properties. It is a simple sorting algorithm that works by building up a sorted sequence from left to right by inserting each element into its sorted position.
Strassen's algorithm improves on the basic matrix multiplication algorithm which runs in O(N^3) time. It achieves this by dividing the matrices into sub-matrices and performing 7 multiplications and 18 additions on the sub-matrices, rather than the 8 multiplications of the basic algorithm. This results in a runtime of O(N^2.81) using divide and conquer, providing an asymptotic improvement over the basic O(N^3) algorithm.
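The 7-multiplication scheme can be shown concretely on a 2x2 matrix; in the full algorithm the scalar entries below are themselves sub-matrices, and the recursion over them is what yields the O(N^2.81) runtime:

```python
def strassen_2x2(A, B):
    """Strassen's 7 products for a 2x2 multiply (8 products in the naive scheme)."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    p1 = a * (f - h)
    p2 = (a + b) * h
    p3 = (c + d) * e
    p4 = d * (g - e)
    p5 = (a + d) * (e + h)
    p6 = (b - d) * (g + h)
    p7 = (a - c) * (e + f)
    # recombine the 7 products into the 4 result entries
    return [[p5 + p4 - p2 + p6, p1 + p2],
            [p3 + p4,           p1 + p5 - p3 - p7]]
```

Saving one multiplication per level is what lowers the recurrence from T(N) = 8T(N/2) + O(N^2) to T(N) = 7T(N/2) + O(N^2), i.e. from O(N^3) to O(N^log2(7)) ≈ O(N^2.81).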
This document discusses the greedy algorithm approach and the knapsack problem. It defines greedy algorithms as choosing locally optimal solutions at each step in hopes of reaching a global optimum. The knapsack problem is described as packing items into a knapsack to maximize total value without exceeding weight capacity. An optimal knapsack algorithm is presented that sorts by value-to-weight ratio and fills highest ratios first. An example applies this to maximize profit of 440 by selecting full quantities of items B and A, and half of item C for a knapsack with capacity of 60.
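The ratio-first greedy can be sketched as follows. The item values and weights below are illustrative guesses (they are not given in the original slides) chosen so that, with capacity 60, the greedy takes all of A and B and half of C for a total profit of 440, matching the example's outcome:

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack: fill by descending value-to-weight ratio.

    items: list of (name, value, weight) tuples.
    Returns the maximum total value for the given capacity.
    """
    total = 0.0
    remaining = capacity
    # sort by value/weight ratio, highest first
    for name, value, weight in sorted(items, key=lambda it: it[1] / it[2],
                                      reverse=True):
        if remaining <= 0:
            break
        take = min(weight, remaining)        # take the whole item, or a fraction
        total += value * take / weight
        remaining -= take
    return total

items = [("A", 100, 10), ("B", 280, 40), ("C", 120, 20)]   # hypothetical data
fractional_knapsack(items, 60)   # 440.0
```

The greedy is optimal for the fractional variant because items are divisible; for 0/1 knapsack it can fail, which is why that variant needs dynamic programming.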
The document discusses and compares linear and binary search algorithms. Linear search sequentially checks each element of an unsorted array to find a target value, while binary search works on a sorted array by repeatedly calculating the midpoint and comparing the target to the value there to narrow the search range. It provides steps for performing a binary search, including sorting the array, calculating the midpoint, and updating the search range based on whether the target is less than, greater than, or equal to the midpoint value.
The document discusses various sorting algorithms. It begins by defining sorting as organizing a list of elements into a certain order, such as ascending or descending. It then discusses the objectives of learning about sorting algorithms like bubble sort, insertion sort, selection sort, and merge sort. The document proceeds to explain the concepts of each of these sorting algorithms at a high level through diagrams and examples.
The document discusses various sorting algorithms. It describes how sorting algorithms arrange elements of a list in a certain order. Efficient sorting is important as a subroutine for algorithms that require sorted input, such as search and merge algorithms. Common sorting algorithms covered include insertion sort, selection sort, bubble sort, merge sort, and quicksort. Quicksort is highlighted as an efficient divide and conquer algorithm that recursively partitions elements around a pivot point.
The document discusses various sorting algorithms. It begins by defining a sorting algorithm as arranging elements of a list in a certain order, such as numerical or alphabetical order. It then discusses popular sorting algorithms like insertion sort, bubble sort, merge sort, quicksort, selection sort, and heap sort. For each algorithm, it provides examples to illustrate how the algorithm works step-by-step to sort a list of numbers. Code snippets are also included for insertion sort and bubble sort.
The document discusses various searching and sorting algorithms. It describes linear search, binary search, selection sort, bubble sort, and heapsort. For each algorithm, it provides pseudocode examples and analyzes their performance in terms of the number of comparisons required in the worst case. Linear search requires N comparisons in the worst case, while binary search requires log N comparisons. Selection sort and bubble sort both require approximately N^2 comparisons, while heapsort requires about 1.5 N log N comparisons.
This document discusses different sorting algorithms including bubble sort, insertion sort, and selection sort. It provides details on each algorithm, including time complexity, code examples, and graphical examples. Bubble sort is an O(n^2) algorithm that works by repeatedly comparing and swapping adjacent elements. Insertion sort also has O(n^2) time complexity but is more efficient than bubble sort for small or partially sorted lists. Selection sort finds the minimum value and swaps it into place at each step.
This document discusses creating domain-specific languages (DSLs) using Xtext. It defines DSLs as programming languages focused on a particular domain, as opposed to general purpose languages. The document outlines how DSLs are classified and stakeholders involved. It describes benefits of DSLs like reducing mistakes and facilitating understanding for non-experts, but also drawbacks like additional development costs. Key aspects of creating DSLs with Xtext are discussed, including defining a grammar to generate a parser and IDE tools to develop models that can then be transformed to other artifacts.
Introduction to architectures based on models, models and metamodels. model d... (Vicente García Díaz)
This document provides an introduction to model-driven architecture and model-driven engineering. It discusses the motivation for MDE, including reducing software complexity and improving productivity. The key concepts of MDE are models, metamodels, and model transformations to generate code and other artifacts. MDE aims to increase abstraction levels and automate software development processes. The document uses examples like state machines and database schemas to illustrate metamodels, modeling languages and model transformations.
This document presents jBPM, a platform for modeling, executing, and administering business processes. It explains concepts such as business process modeling, BPMN, XPDL, and WS-BPEL. It then introduces jPDL, the language used to describe processes in jBPM. Finally, it details various control-flow, automatic, and event activities that can be used in jPDL to model complex processes with decisions, concurrency, tasks, and subprocesses.
This document describes two markup languages, KML and ARML, that can be used to create augmented-reality worlds in Wikitude. It explains the basic concepts of Wikitude and provides details on how to develop worlds using each language, including examples of their structure and how to test the created worlds.
This document presents the Wikitude ARchitect SDK for creating augmented-reality applications. It explains basic concepts such as the ARchitect tools, the API, the mobile viewer, and the desktop engine. It includes examples of how to insert floating elements into the camera view, use the AR context, create an overlaid circle, add and react to events, and insert images from files. The goal is to show the SDK's capabilities for developing augmented-reality applications in a straightforward way.
This document presents an introduction to model-driven engineering (MDE). It explains basic concepts such as models, metamodels, and the model-based development process. It also describes examples of applying MDE in different domains such as IP telephony, insurance, and video games. Finally, it introduces standards related to MDE, such as the Model-Driven Architecture standard and languages like UML and MOF.
This document introduces basic OpenGL concepts such as defining objects, lights, the camera, and viewing windows. It explains the OpenGL programming model and key concepts such as projections, matrices, and their use for transformations. It then covers 2D OpenGL for drawing triangles, modifying colors and textures, and inserting transformations. Finally, it briefly mentions the main differences between 2D and 3D OpenGL.
This document provides an introduction to augmented reality. It explains key concepts such as the difference between augmented reality and virtual reality, and gives examples of applications in entertainment, assistance, and commerce. It also describes methods for identifying elements, such as natural feature tracking and visual search, and covers topics such as AR browsers and geolocation.
The document provides an introduction to ARToolKit, a software library for building augmented-reality applications. It explains basic concepts such as position tracking and overlaying objects on video, and describes how the library works through marker detection. It also covers topics such as camera calibration and the development of simple applications using the main ARToolKit functions, and provides examples of its use on different systems such as Android.
This document provides an introduction to the Robot Operating System (ROS). ROS is an open-source framework commonly used to develop robotic applications. It provides tools for communication between machines, simulation, and software development for robots. The document explains key concepts such as nodes, topics, services, and the computation graph in ROS.
This document describes how to create web services that provide augmented-reality data to the Wikitude platform. It explains basic concepts such as external data storage, the Wikitude architecture, and development approaches. It also presents a PHP library for working with the ARML format and shows an example web service that connects to a MySQL database to retrieve and return points of interest.
Este documento proporciona una introducción a los aspectos básicos del procesamiento de textos con LaTeX. Explica conceptos como la estructura de un documento LaTeX, los diferentes comandos y entornos disponibles, la inserción de caracteres especiales, el formato de fuentes y estilos, y las opciones para alinear y espaciar el texto. El documento servirá como guía para aprender los fundamentos de LaTeX.
Este documento describe cómo automatizar Microsoft Word usando código. Explica que las aplicaciones de Office como Word tienen librerías que permiten acceder y manipular sus objetos como si fueran objetos de programación. Esto permite desarrollar software que controle Word usando lenguajes de programación. Luego detalla los pasos para crear un proyecto .NET, incluir las referencias a las librerías de Word, y escribir código para abrir Word, crear un documento y escribir texto en él.
Este documento trata sobre los árboles como estructura de datos. Explica conceptos básicos sobre árboles como nodos, altura, profundidad y tamaño. Luego se detalla sobre árboles binarios, árboles de búsqueda como los AVL, y árboles multicamino. Finalmente menciona bibliografía sobre el tema.
Este documento presenta conceptos básicos sobre dispersión y estructuras de datos hash. Explica protección activa mediante el uso de buenas funciones hash y protección pasiva cuando varios elementos comparten la misma posición en la tabla. Detalla métodos como tablas hash abiertas y cerradas, y técnicas de exploración lineal y cuadrática para buscar posiciones próximas cuando ocurren colisiones. El documento contiene varios ejercicios para ilustrar estos conceptos.
Este documento presenta varios algoritmos para encontrar caminos óptimos en grafos. Explica el algoritmo de búsqueda en anchura (breadth-first search) para encontrar el camino más corto sin considerar pesos. Luego describe el algoritmo de Dijkstra para encontrar el camino mínimo en grafos con pesos positivos y el algoritmo de Bellman-Ford para grafos con pesos positivos y negativos. Finalmente, proporciona ejemplos y pseudocódigo para cada algoritmo.
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...Jason Yip
The typical problem in product engineering is not bad strategy, so much as “no strategy”. This leads to confusion, lack of motivation, and incoherent action. The next time you look for a strategy and find an empty space, instead of waiting for it to be filled, I will show you how to fill it in yourself. If you’re wrong, it forces a correction. If you’re right, it helps create focus. I’ll share how I’ve approached this in the past, both what works and lessons for what didn’t work so well.
AppSec PNW: Android and iOS Application Security with MobSFAjin Abraham
Mobile Security Framework - MobSF is a free and open source automated mobile application security testing environment designed to help security engineers, researchers, developers, and penetration testers to identify security vulnerabilities, malicious behaviours and privacy concerns in mobile applications using static and dynamic analysis. It supports all the popular mobile application binaries and source code formats built for Android and iOS devices. In addition to automated security assessment, it also offers an interactive testing environment to build and execute scenario based test/fuzz cases against the application.
This talk covers:
Using MobSF for static analysis of mobile applications.
Interactive dynamic security assessment of Android and iOS applications.
Solving Mobile app CTF challenges.
Reverse engineering and runtime analysis of Mobile malware.
How to shift left and integrate MobSF/mobsfscan SAST and DAST in your build pipeline.
Taking AI to the Next Level in Manufacturing.pdfssuserfac0301
Read Taking AI to the Next Level in Manufacturing to gain insights on AI adoption in the manufacturing industry, such as:
1. How quickly AI is being implemented in manufacturing.
2. Which barriers stand in the way of AI adoption.
3. How data quality and governance form the backbone of AI.
4. Organizational processes and structures that may inhibit effective AI adoption.
6. Ideas and approaches to help build your organization's AI strategy.
Introduction of Cybersecurity with OSS at Code Europe 2024Hiroshi SHIBATA
I develop the Ruby programming language, RubyGems, and Bundler, which are package managers for Ruby. Today, I will introduce how to enhance the security of your application using open-source software (OSS) examples from Ruby and RubyGems.
The first topic is CVE (Common Vulnerabilities and Exposures). I have published CVEs many times. But what exactly is a CVE? I'll provide a basic understanding of CVEs and explain how to detect and handle vulnerabilities in OSS.
Next, let's discuss package managers. Package managers play a critical role in the OSS ecosystem. I'll explain how to manage library dependencies in your application.
I'll share insights into how the Ruby and RubyGems core team works to keep our ecosystem safe. By the end of this talk, you'll have a better understanding of how to safeguard your code.
Connector Corner: Seamlessly power UiPath Apps, GenAI with prebuilt connectorsDianaGray10
Join us to learn how UiPath Apps can directly and easily interact with prebuilt connectors via Integration Service--including Salesforce, ServiceNow, Open GenAI, and more.
The best part is you can achieve this without building a custom workflow! Say goodbye to the hassle of using separate automations to call APIs. By seamlessly integrating within App Studio, you can now easily streamline your workflow, while gaining direct access to our Connector Catalog of popular applications.
We’ll discuss and demo the benefits of UiPath Apps and connectors including:
Creating a compelling user experience for any software, without the limitations of APIs.
Accelerating the app creation process, saving time and effort
Enjoying high-performance CRUD (create, read, update, delete) operations, for
seamless data management.
Speakers:
Russell Alfeche, Technology Leader, RPA at qBotic and UiPath MVP
Charlie Greenberg, host
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
How information systems are built or acquired puts information, which is what they should be about, in a secondary place. Our language adapted accordingly, and we no longer talk about information systems but applications. Applications evolved in a way to break data into diverse fragments, tightly coupled with applications and expensive to integrate. The result is technical debt, which is re-paid by taking even bigger "loans", resulting in an ever-increasing technical debt. Software engineering and procurement practices work in sync with market forces to maintain this trend. This talk demonstrates how natural this situation is. The question is: can something be done to reverse the trend?
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/how-axelera-ai-uses-digital-compute-in-memory-to-deliver-fast-and-energy-efficient-computer-vision-a-presentation-from-axelera-ai/
Bram Verhoef, Head of Machine Learning at Axelera AI, presents the “How Axelera AI Uses Digital Compute-in-memory to Deliver Fast and Energy-efficient Computer Vision” tutorial at the May 2024 Embedded Vision Summit.
As artificial intelligence inference transitions from cloud environments to edge locations, computer vision applications achieve heightened responsiveness, reliability and privacy. This migration, however, introduces the challenge of operating within the stringent confines of resource constraints typical at the edge, including small form factors, low energy budgets and diminished memory and computational capacities. Axelera AI addresses these challenges through an innovative approach of performing digital computations within memory itself. This technique facilitates the realization of high-performance, energy-efficient and cost-effective computer vision capabilities at the thin and thick edge, extending the frontier of what is achievable with current technologies.
In this presentation, Verhoef unveils his company’s pioneering chip technology and demonstrates its capacity to deliver exceptional frames-per-second performance across a range of standard computer vision networks typical of applications in security, surveillance and the industrial sector. This shows that advanced computer vision can be accessible and efficient, even at the very edge of our technological ecosystem.
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-EfficiencyScyllaDB
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/temporal-event-neural-networks-a-more-efficient-alternative-to-the-transformer-a-presentation-from-brainchip/
Chris Jones, Director of Product Management at BrainChip , presents the “Temporal Event Neural Networks: A More Efficient Alternative to the Transformer” tutorial at the May 2024 Embedded Vision Summit.
The expansion of AI services necessitates enhanced computational capabilities on edge devices. Temporal Event Neural Networks (TENNs), developed by BrainChip, represent a novel and highly efficient state-space network. TENNs demonstrate exceptional proficiency in handling multi-dimensional streaming data, facilitating advancements in object detection, action recognition, speech enhancement and language model/sequence generation. Through the utilization of polynomial-based continuous convolutions, TENNs streamline models, expedite training processes and significantly diminish memory requirements, achieving notable reductions of up to 50x in parameters and 5,000x in energy consumption compared to prevailing methodologies like transformers.
Integration with BrainChip’s Akida neuromorphic hardware IP further enhances TENNs’ capabilities, enabling the realization of highly capable, portable and passively cooled edge devices. This presentation delves into the technical innovations underlying TENNs, presents real-world benchmarks, and elucidates how this cutting-edge approach is positioned to revolutionize edge AI across diverse applications.
Programming Foundation Models with DSPy - Meetup SlidesZilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
Driving Business Innovation: Latest Generative AI Advancements & Success StorySafe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...Alex Pruden
Folding is a recent technique for building efficient recursive SNARKs. Several elegant folding protocols have been proposed, such as Nova, Supernova, Hypernova, Protostar, and others. However, all of them rely on an additively homomorphic commitment scheme based on discrete log, and are therefore not post-quantum secure. In this work we present LatticeFold, the first lattice-based folding protocol based on the Module SIS problem. This folding protocol naturally leads to an efficient recursive lattice-based SNARK and an efficient PCD scheme. LatticeFold supports folding low-degree relations, such as R1CS, as well as high-degree relations, such as CCS. The key challenge is to construct a secure folding protocol that works with the Ajtai commitment scheme. The difficulty, is ensuring that extracted witnesses are low norm through many rounds of folding. We present a novel technique using the sumcheck protocol to ensure that extracted witnesses are always low norm no matter how many rounds of folding are used. Our evaluation of the final proof system suggests that it is as performant as Hypernova, while providing post-quantum security.
Paper Link: https://eprint.iacr.org/2024/257
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...
Sorting algorithms
1. Algorithms
Sorting
Vicente García Díaz – garciavicente@uniovi.es
University of Oviedo, 2013
2. Table of contents
1. Basic concepts
2. Sorting algorithms
▫ Simple algorithms
Sorting by direct exchange
Sorting by insertion
Sorting by selection
▫ Sophisticated algorithms
Sorting by direct exchange
Sorting by insertion
Sorting by selection
▫ Constrained algorithms
Radix
▫ External algorithms
4. Basic concepts
Introduction
• Given a set of n elements a1, a2, a3, …, an and an
order relation (≤), the sorting problem consists of
arranging those elements in increasing order
• What can influence the sorting?
▫ The type of the elements
▫ Their number
▫ The device on which they are stored
• We use integers stored in a vector (a simplification)
5. Basic concepts
Sorting integers
• In practice, the problems tend to be much more
complex
▫ However, they can be reduced to the same problem
as sorting integers (using record keys, indexes, etc.)
• In essence, it is the same to sort numbers or to sort
streets, houses, cars or any numerically quantifiable
property
6. Basic concepts
Classification criteria for algorithms
1. Total number of steps
▫ Complexity
2. Number of comparisons performed
3. Number of exchanges performed
▫ An exchange is much more expensive than a comparison
4. Stability
5. Type of memory used
▫ Internal algorithms
▫ External algorithms
7. Basic concepts
Preliminary considerations (I)
• We will sort arrays of integers using the Java
programming language
• We will make use of utility methods to facilitate
the work and make the code more readable
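The slides mention utility methods but do not list them. A minimal sketch of what such helpers might look like (the class name `SortUtils` and the method names `swap` and `isSorted` are assumptions, not the author's code):

```java
import java.util.Arrays;

// Hypothetical utility helpers of the kind the slides refer to.
public class SortUtils {

    // Exchange the elements at positions i and j of the vector.
    public static void swap(int[] v, int i, int j) {
        int tmp = v[i];
        v[i] = v[j];
        v[j] = tmp;
    }

    // True if the vector is sorted in increasing order.
    public static boolean isSorted(int[] v) {
        for (int i = 1; i < v.length; i++) {
            if (v[i - 1] > v[i]) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        int[] v = {4, 5, 6, 1, 3, 2, 7, 8};
        swap(v, 0, 3);
        System.out.println(Arrays.toString(v)); // [1, 5, 6, 4, 3, 2, 7, 8]
        System.out.println(isSorted(v));        // false
    }
}
```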
10. Sorting algorithms > Simple algorithms > Sorting by direct exchange > Bubble
• Description
▫ Based on the successive comparison and exchange of
adjacent elements
▫ At each step, each item is compared with the previous
one and, if they are out of order, they are exchanged
▫ In the first pass, the smallest element is placed in the
leftmost position, and so on…
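The description above can be sketched in Java as follows (an illustrative sketch, not the slides' own code):

```java
import java.util.Arrays;

public class Bubble {

    public static void sort(int[] v) {
        // After pass k, the k smallest elements occupy the leftmost positions.
        for (int pass = 0; pass < v.length - 1; pass++) {
            // Compare each item with the previous one, right to left, so the
            // smallest remaining element "bubbles" to the left.
            for (int i = v.length - 1; i > pass; i--) {
                if (v[i] < v[i - 1]) {
                    int tmp = v[i];
                    v[i] = v[i - 1];
                    v[i - 1] = tmp;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] v = {6, 8, 2, 5, 1, 3, 4, 7};
        sort(v);
        System.out.println(Arrays.toString(v)); // [1, 2, 3, 4, 5, 6, 7, 8]
    }
}
```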
12. Sorting algorithms > Simple algorithms > Sorting by direct exchange > Bubble
Best case: O(n²)
Worst case: O(n²)
Average case: O(n²)
• High number of exchanges
• High number of comparisons
13. Sorting algorithms > Simple algorithms > Sorting by direct exchange > Bubble with sentinel
• Description
▫ Observing the previous result, you can see that the
last passes are repeated unnecessarily
▫ To avoid this, you can add a stop condition that
detects when the vector is already sorted
If you make a pass without performing any
exchange, the vector is already sorted
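The stop condition can be sketched with a boolean flag (a sketch of the idea, not the slides' code):

```java
import java.util.Arrays;

public class BubbleSentinel {

    public static void sort(int[] v) {
        boolean exchanged = true;
        for (int pass = 0; pass < v.length - 1 && exchanged; pass++) {
            exchanged = false; // sentinel: reset before each pass
            for (int i = v.length - 1; i > pass; i--) {
                if (v[i] < v[i - 1]) {
                    int tmp = v[i];
                    v[i] = v[i - 1];
                    v[i - 1] = tmp;
                    exchanged = true;
                }
            }
            // If no exchange was made in a pass, the vector is sorted
            // and the outer loop stops early.
        }
    }

    public static void main(String[] args) {
        int[] v = {1, 2, 3, 4, 5, 6, 7, 8}; // best case: a single pass, O(n)
        sort(v);
        System.out.println(Arrays.toString(v)); // [1, 2, 3, 4, 5, 6, 7, 8]
    }
}
```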
15. Sorting algorithms > Simple algorithms > Sorting by direct exchange > Bubble with sentinel
Best case: O(n)
Worst case: O(n²)
Average case: O(n²)
• What would the complexity be if the input were 8 1 2 3 4 5 6 7?
17. Sorting algorithms > Simple algorithms > Sorting by direct exchange > Bidirectional bubble
• Description
▫ Used to prevent problems when the vector is almost
sorted
▫ It traverses the vector in both directions, alternating
forward and backward passes
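A sketch of the bidirectional (cocktail) variant, with alternating passes and a sentinel flag (illustrative; not the slides' code):

```java
import java.util.Arrays;

public class BidirectionalBubble {

    public static void sort(int[] v) {
        int left = 0, right = v.length - 1;
        boolean exchanged = true;
        while (left < right && exchanged) {
            exchanged = false;
            // Forward pass: push the largest element to the right end.
            for (int i = left; i < right; i++) {
                if (v[i] > v[i + 1]) { swap(v, i, i + 1); exchanged = true; }
            }
            right--;
            // Backward pass: push the smallest element to the left end.
            for (int i = right; i > left; i--) {
                if (v[i] < v[i - 1]) { swap(v, i, i - 1); exchanged = true; }
            }
            left++;
        }
    }

    private static void swap(int[] v, int i, int j) {
        int tmp = v[i]; v[i] = v[j]; v[j] = tmp;
    }

    public static void main(String[] args) {
        int[] v = {2, 3, 4, 5, 6, 7, 8, 1}; // almost sorted
        sort(v);
        System.out.println(Arrays.toString(v)); // [1, 2, 3, 4, 5, 6, 7, 8]
    }
}
```

On an almost-sorted input like the one above, the backward pass carries the out-of-place 1 to the front in a single sweep, which is exactly the case the plain bubble handles badly.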
19. Sorting algorithms > Simple algorithms > Sorting by direct exchange > Bidirectional bubble
Best case: O(n)
Worst case: O(n²)
Average case: O(n²)
• It still has a high number of comparisons and
exchanges, just like the other bubble methods
20. Sorting algorithms > Simple algorithms > Sorting by insertion > Direct insertion
• Description
▫ The vector is divided into two "virtual" parts: an
ordered sequence and an unordered sequence
▫ At each step we take the first element of the
unordered sequence and insert it into its
corresponding place in the ordered sequence
Example: ordered sequence 1 3 4 7, unordered sequence 6 8 2 5
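The mechanism above can be sketched in Java as follows (an illustrative sketch, not the slides' code): the left part of the vector plays the role of the ordered sequence.

```java
import java.util.Arrays;

public class DirectInsertion {

    public static void sort(int[] v) {
        for (int i = 1; i < v.length; i++) {
            int key = v[i]; // first element of the unordered sequence
            int j = i - 1;
            // Shift larger elements of the ordered sequence one place right.
            while (j >= 0 && v[j] > key) {
                v[j + 1] = v[j];
                j--;
            }
            v[j + 1] = key; // insert into its corresponding place
        }
    }

    public static void main(String[] args) {
        int[] v = {1, 3, 4, 7, 6, 8, 2, 5}; // ordered part 1 3 4 7, then 6 8 2 5
        sort(v);
        System.out.println(Arrays.toString(v)); // [1, 2, 3, 4, 5, 6, 7, 8]
    }
}
```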
23. Sorting algorithms > Simple algorithms > Sorting by insertion > Direct insertion
Best case: O(n)
Worst case: O(n²)
Average case: O(n²)
• High number of exchanges
• Fewer comparisons than the bubble method
24. Sorting algorithms > Simple algorithms > Sorting by insertion > Binary insertion
• Description
▫ Direct insertion can be improved by taking into
account that the sequence into which the elements
are inserted is already sorted
▫ We use binary search to quickly locate the suitable
position for the insertion
▫ *** Binary search was discussed in the introduction
to Algorithms
Example: ordered sequence 1 3 4 7, unordered sequence 6 8 2 5
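A sketch of binary insertion (illustrative, not the slides' code): binary search finds the insertion point in fewer comparisons, but the elements still have to be shifted one position at a time.

```java
import java.util.Arrays;

public class BinaryInsertion {

    public static void sort(int[] v) {
        for (int i = 1; i < v.length; i++) {
            int key = v[i];
            // Binary search for the insertion position within v[0..i-1].
            int lo = 0, hi = i;
            while (lo < hi) {
                int mid = (lo + hi) / 2;
                if (v[mid] <= key) lo = mid + 1;
                else hi = mid;
            }
            // Shift v[lo..i-1] one position to the right and insert the key.
            for (int j = i; j > lo; j--) {
                v[j] = v[j - 1];
            }
            v[lo] = key;
        }
    }

    public static void main(String[] args) {
        int[] v = {4, 5, 6, 1, 3, 2, 7, 8};
        sort(v);
        System.out.println(Arrays.toString(v)); // [1, 2, 3, 4, 5, 6, 7, 8]
    }
}
```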
25. Sorting algorithms > Simple algorithms > Sorting by insertion > Binary insertion
• Initial vector: 4 5 6 1 3 2 7 8
Step 1: 4 5 6 1 3 2 7 8
Step 2: 4 5 6 1 3 2 7 8
Step 3: 1 4 5 6 3 2 7 8
Step 4: 1 3 4 5 6 2 7 8
Step 5: 1 2 3 4 5 6 7 8
Step 6: 1 2 3 4 5 6 7 8
Step 7: 1 2 3 4 5 6 7 8
The result is exactly the same as with direct insertion
26. Sorting algorithms > Simple algorithms > Sorting by insertion > Binary insertion
Best case: O(n log n)
Worst case: O(n²)
Average case: O(n²)
• Fewer comparisons than direct insertion
• High number of exchanges
• INSERTING BY MOVING ELEMENTS ONE POSITION AT
A TIME IS NOT ECONOMICAL
27. Sorting algorithms > Simple algorithms > Sorting by selection > Direct selection
• Description
▫ It consists of selecting the smallest element and
exchanging it with the first element
▫ Repeat the process with the remaining elements
until only one element is left (the greatest of all)
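The selection process above can be sketched as follows (illustrative, not the slides' code); note there is at most one exchange per outer pass:

```java
import java.util.Arrays;

public class DirectSelection {

    public static void sort(int[] v) {
        for (int i = 0; i < v.length - 1; i++) {
            int min = i;
            // Find the smallest element among the remaining ones.
            for (int j = i + 1; j < v.length; j++) {
                if (v[j] < v[min]) min = j;
            }
            // Exchange it with the first unsorted element.
            int tmp = v[i]; v[i] = v[min]; v[min] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] v = {3, 4, 7, 5, 1, 2};
        sort(v);
        System.out.println(Arrays.toString(v)); // [1, 2, 3, 4, 5, 7]
    }
}
```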
29. Sorting algorithms > Simple algorithms > Sorting by selection > Direct selection
Best case: O(n²)
Worst case: O(n²)
Average case: O(n²)
• The number of exchanges is minimal
• It is predictable: the number of exchanges and
comparisons depends only on n
• The number of comparisons is very high
30. Sorting algorithms > Sophisticated algorithms > Sorting by insertion > ShellSort
• Description
▫ The idea is to divide the vector of elements into
smaller virtual vectors
▫ Each subvector is sorted using an insertion sort
algorithm
▫ The elements of each subvector lie at a fixed
distance from each other, which decreases in each
iteration (increments sequence)
31. Sorting algorithms > Sophisticated algorithms > Sorting by insertion > ShellSort
• Initial vector: 59 20 17 13 28 14 23 83 36 98 11 70 65 41 42 15
• Increments sequence: {1, 2, 4, 8} (example)
Iteration 1 (increment of 8, 8 sublists of size 2): 36 20 11 13 28 14 23 15 59 98 17 70 65 41 42 83
Iteration 2 (increment of 4, 4 sublists of size 4): 28 14 11 13 36 20 17 15 59 41 23 70 65 98 42 83
Iteration 3 (increment of 2, 2 sublists of size 8): 11 13 17 14 23 15 28 20 36 41 42 70 59 83 65 98
Iteration 4 (increment of 1, 1 sublist of size 16): 11 13 14 15 17 20 23 28 36 41 42 59 65 70 83 98
**The elements of the increments sequence should not be multiples of each other
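The iterations above can be sketched with a gapped insertion sort, parameterized by the increments sequence applied in decreasing order (an illustrative sketch, not the slides' code; `{8, 4, 2, 1}` is the slides' example sequence):

```java
import java.util.Arrays;

public class ShellSort {

    // Increments must be given in decreasing order and end with 1.
    public static void sort(int[] v, int[] increments) {
        for (int gap : increments) {
            // Insertion sort over every subvector of elements gap apart.
            for (int i = gap; i < v.length; i++) {
                int key = v[i];
                int j = i - gap;
                while (j >= 0 && v[j] > key) {
                    v[j + gap] = v[j];
                    j -= gap;
                }
                v[j + gap] = key;
            }
        }
    }

    public static void main(String[] args) {
        int[] v = {59, 20, 17, 13, 28, 14, 23, 83,
                   36, 98, 11, 70, 65, 41, 42, 15};
        sort(v, new int[]{8, 4, 2, 1}); // the slides' example increments
        System.out.println(Arrays.toString(v));
    }
}
```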
33. Sorting algorithms > Sophisticated algorithms > Sorting by insertion > ShellSort
The complexity depends on the sequence of increments
We do not know which sequence works best
Knuth recommends:
h1 = 1, hk = 3·h(k-1) + 1 → 1, 4, 13, 40, …
With hk = 2·h(k-1) + 1 → 1, 3, 7, 15, 31, … the complexity is O(n^1.2)
• Can be optimized by distributing the processing of each subvector
across multiple processors
34. Sorting algorithms > Sophisticated algorithms > Sorting by insertion > ShellSort
• Principle on which it relies
▫ THEOREM: If a sequence is sorted at intervals of n
and subsequently sorted at intervals of j, it remains
sorted at intervals of n
• The intervals should be decreasing
• The last interval must be 1
• The interval values should not be multiples of
each other (for efficiency)
35. Sorting algorithms > Sophisticated algorithms > Sorting by insertion > ShellSort
ShellSort vs. direct insertion (comparison figure omitted)
36. Sorting algorithms > Sophisticated algorithms > Sorting by selection > HeapSort
• Description
▫ The idea is to build a heap of maximums (max-heap)
in the vector
▫ Then, the first element (the maximum) is chosen and
exchanged with the final element of the unsorted
area of the vector, which becomes part of the sorted area
▫ The heap is reorganized and we again choose the
first element (the maximum) to place it in the last
unsorted position of the vector
▫ The process is repeated until all the elements of the
unsorted area are placed in the sorted area
37. Sorting algorithms > Sophisticated algorithms > Sorting by selection > HeapSort
• Idea of the algorithm
First part:
  CREATE A HEAP OF MAXIMUMS IN THE FIRST PART OF THE VECTOR: O(n)
Second part:
  REPEAT UNTIL THE HEAP IS EMPTY: O(n) iterations
    GET THE ROOT ELEMENT AND EXCHANGE IT WITH THE LAST UNSORTED ELEMENT OF THE VECTOR: O(1)
    REORGANIZE THE HEAP OF MAXIMUMS: O(log n)
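The two parts above can be sketched as follows (illustrative, not the slides' code); the `sink` helper is the heap-reorganization step:

```java
import java.util.Arrays;

public class HeapSort {

    public static void sort(int[] v) {
        int n = v.length;
        // First part: build the heap of maximums in place, O(n).
        for (int i = n / 2 - 1; i >= 0; i--) {
            sink(v, i, n);
        }
        // Second part: repeat until the heap is empty.
        for (int end = n - 1; end > 0; end--) {
            swap(v, 0, end); // root (maximum) goes to the sorted area
            sink(v, 0, end); // reorganize the heap, O(log n)
        }
    }

    // Let the element at position i sink to its place in the heap v[0..size).
    private static void sink(int[] v, int i, int size) {
        while (2 * i + 1 < size) {
            int child = 2 * i + 1;
            if (child + 1 < size && v[child + 1] > v[child]) child++;
            if (v[i] >= v[child]) break;
            swap(v, i, child);
            i = child;
        }
    }

    private static void swap(int[] v, int i, int j) {
        int tmp = v[i]; v[i] = v[j]; v[j] = tmp;
    }

    public static void main(String[] args) {
        int[] v = {2, 1, 3, 4, 5, 6, 7, 8};
        sort(v);
        System.out.println(Arrays.toString(v)); // [1, 2, 3, 4, 5, 6, 7, 8]
    }
}
```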
45. Sorting algorithms > Sophisticated algorithms > Sorting by selection > HeapSort
• Initial vector: 2 1 3 4 5 6 7 8
Step 2: exchange and restructure the heap (the element 1 sinks)
(heap diagrams omitted)
46. Sorting algorithms > Sophisticated algorithms > Sorting by selection > HeapSort
Best case: O(n log n)
Worst case: O(n log n)
Average case: O(n log n)
• Not recommended for small vectors
• Fast for large vectors
47. Sorting algorithms > Sophisticated algorithms > Sorting by selection > HeapSort
• Peculiarities
▫ First, all the big elements are moved to the left (when building the heap)
▫ Then they are moved to the right (when extracting the maximums)
▫ The best case is not necessarily the sorted vector
▫ The worst case is not necessarily the unsorted vector
48. Sorting algorithms > Sophisticated algorithms > Sorting by direct exchange > QuickSort
• Description
▫ It is based on partitioning
▫ When you partition around an item, that item ends up
in its final position
Then, the idea is to partition around all the elements
▫ You have to choose a good element to partition
around (the pivot) so that it falls near the center and
the recursion creates a tree (if it always falls at one
end, it creates a list)
▫ The implementation is recursive
49. Sorting algorithms > Sophisticated algorithms > Sorting by direct exchange > QuickSort
• Criteria for choosing a good pivot
▫ The median
  It is the ideal choice but very expensive to compute
▫ The first element
  It is "cheap" but it is a bad choice
▫ The last element
  Same problem as with the first element
▫ A random element
  Statistically it is not the worst choice, but it has a computational cost
▫ The central element
  Statistically it is not a bad choice
▫ A compromise solution (median-of-3)
  We take a sample of 3 elements: the first, the last and the central element
  We order those elements and take the one in the center as the median
  It does not guarantee anything, but it can be a good indicator
50. Sorting algorithms > Sophisticated algorithms > Sorting by direct exchange > QuickSort
• Idea of the algorithm
REPEAT UNTIL ALL THE ELEMENTS ARE SORTED: O(log n) … O(n) levels of recursion
  CHOOSE A PIVOT: O(1) using median-of-3
  PARTITION AROUND THE PIVOT USING A PARTITIONING STRATEGY: typical case O(n)
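The idea above can be sketched with a median-of-3 pivot choice and an i/j partitioning loop (an illustrative sketch; the details differ from the slides' code):

```java
import java.util.Arrays;

public class QuickSort {

    public static void sort(int[] v) {
        quicksort(v, 0, v.length - 1);
    }

    private static void quicksort(int[] v, int lo, int hi) {
        if (lo >= hi) return;
        // Median-of-3: order the first, central and last elements;
        // the median ends up in the center and is used as the pivot.
        int mid = (lo + hi) / 2;
        if (v[mid] < v[lo]) swap(v, mid, lo);
        if (v[hi] < v[lo]) swap(v, hi, lo);
        if (v[hi] < v[mid]) swap(v, hi, mid);
        int pivot = v[mid];
        // Partitioning: move i right past small elements and j left past
        // large ones, exchanging out-of-place pairs, until i and j cross.
        int i = lo, j = hi;
        while (i <= j) {
            while (v[i] < pivot) i++;
            while (v[j] > pivot) j--;
            if (i <= j) { swap(v, i, j); i++; j--; }
        }
        quicksort(v, lo, j);
        quicksort(v, i, hi);
    }

    private static void swap(int[] v, int i, int j) {
        int tmp = v[i]; v[i] = v[j]; v[j] = tmp;
    }

    public static void main(String[] args) {
        int[] v = {4, 5, 6, 1, 3, 2, 7, 8}; // the slides' example vector
        sort(v);
        System.out.println(Arrays.toString(v)); // [1, 2, 3, 4, 5, 6, 7, 8]
    }
}
```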
51. Sorting algorithms > Sophisticated algorithms > Sorting by direct exchange > QuickSort
• Initial vector: 4 5 6 1 3 2 7 8
Choice of pivot (median-of-3): order the first, central and last elements → 1 5 6 4 3 2 7 8; the pivot is 4
Partitioning strategy: search for an element[i] > pivot (moving i to the right) and an element[j] < pivot (moving j to the left) and interchange them, displacing elements < pivot to the left and elements > pivot to the right; the process ends when j and i cross, with i determining the final position of the pivot
  Start (pivot moved to the end): 1 5 6 8 3 2 7 4
  Exchange 1: 1 2 6 8 3 5 7 4
  Exchange 2: 1 2 3 8 6 5 7 4
  j and i cross; positioning of the pivot: 1 2 3 4 6 5 7 8
52. Sorting algorithms > Sophisticated algorithms > Sorting by direct exchange > QuickSort
• Initial vector: 4 5 6 1 3 2 7 8; after the first partition: 1 2 3 | 4 | 6 5 7 8
Left part (1 2 3): being 3 items or less, the elements are already sorted when computing the median of 3 (ends the recursion in that branch)
Right part (6 5 7 8): choice of pivot and partitioning strategy → 5 8 7 6; j and i cross; positioning → 5 6 7 8
53.
QuickSort
• Initial vector: 4 5 6 1 3 2 7 8
State in the right branch: 5 | 6 | 7 8

Sub-vector 5: there is only one element, so there is no need to do the
median of 3; the element is already sorted (the recursion ends in that
branch).

Sub-vector 7 8: being 3 items or less, the elements are already sorted
when searching the median of 3 (the recursion ends in that branch).
55.
QuickSort
• Very fast for values of n > 20
• Can be optimized by distributing the processing of each partition
  across different threads, in parallel
Best case: O(n log n)
Average case: O(n log n)
Worst case: O(n²)
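The parallel optimization can be sketched by sorting the two partitions in separate threads once they are independent. This is illustrative only: the recursion depth for threading, the pivot choice, and the names are mine, and in CPython the GIL limits the real speed-up for pure-Python code.

```python
import threading

def quicksort_parallel(a, lo=0, hi=None, depth=2):
    """QuickSort in which, down to `depth` levels of recursion, the two
    partitions are sorted concurrently; below that it runs sequentially.
    The partitions touch disjoint index ranges, so no locking is needed."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    pivot = a[(lo + hi) // 2]          # simple pivot choice for brevity
    i, j = lo, hi
    while i <= j:
        while a[i] < pivot:
            i += 1
        while a[j] > pivot:
            j -= 1
        if i <= j:
            a[i], a[j] = a[j], a[i]
            i += 1
            j -= 1
    if depth > 0:
        # Left partition in a new thread, right one in the current thread.
        left = threading.Thread(
            target=quicksort_parallel, args=(a, lo, j, depth - 1))
        left.start()
        quicksort_parallel(a, i, hi, depth - 1)
        left.join()
    else:
        quicksort_parallel(a, lo, j, 0)
        quicksort_parallel(a, i, hi, 0)

v = [4, 5, 6, 1, 3, 2, 7, 8]
quicksort_parallel(v)
print(v)  # -> [1, 2, 3, 4, 5, 6, 7, 8]
```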
56.
Sorting algorithms
Comparison of algorithms (I)
• Data consists of only one key
n = 256              Sorted   Random   Inverse
Bubble                  540     1026      1492
Bubble with sent.         5     1104      1645
Bidirect. bubble          5      961      1619
Direct insertion         12      366       704
Binary insertion         56      373       662
Direct selection        489      509       695
ShellSort                58      127       157
HeapSort                116      110       104
QuickSort                31       60        37

n = 512              Sorted   Random   Inverse
Bubble                 2165     4054      5931
Bubble with sent.         8     4270      6542
Bidirect. bubble          9     3642      6520
Direct insertion         23     1444      2836
Binary insertion        125     1327      2490
Direct selection       1907     1956      2675
ShellSort               116      349       492
HeapSort                253      241       226
QuickSort                69      146        79
57.
Sorting algorithms
Comparison of algorithms (II)
• Data consists of only one key VS data with a size 7 times
the key size
n = 256 (key only)   Sorted   Random   Inverse
Bubble                  540     1026      1492
Bubble with sent.         5     1104      1645
Bidirect. bubble          5      961      1619
Direct insertion         12      366       704
Binary insertion         56      373       662
Direct selection        489      509       695
ShellSort                58      127       157
HeapSort                116      110       104
QuickSort                31       60        37

n = 256 (key + data) Sorted   Random   Inverse
Bubble                  610     3212      5599
Bubble with sent.         5     3237      5762
Bidirect. bubble          5     3071      5757
Direct insertion         46     1129      2150
Binary insertion         76     1105      2070
Direct selection        547      607      1430
ShellSort               186      373       435
HeapSort                264      246       227
QuickSort                55      137        75
58.
Radix
• Description
▫ It is actually an external sorting method
▫ It is NOT comparative
▫ It can be used to sort sequences of digits that support a
  lexicographic order (words, numbers, dates, …)
▫ It consists of defining k queues Q[0..k−1], k being the number of
  possible values that each digit of the sequence can take
  (e.g. Q0, Q1, …, Q9 for decimal digits)
▫ The elements are then distributed into the queues over several
  passes, ordered from the least significant digit to the most
  significant
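The queue-based passes can be sketched for non-negative base-10 integers (a minimal LSD radix sort; the names are mine):

```python
from collections import deque

def radix_sort(nums, base=10):
    """LSD radix sort: k passes (k = digits of the largest element),
    each distributing the values into `base` queues Q0..Q9 by the
    current digit and collecting them back in queue order."""
    if not nums:
        return []
    passes = len(str(max(nums)))       # digits of the largest element
    for p in range(passes):            # least significant digit first
        queues = [deque() for _ in range(base)]
        for x in nums:
            digit = (x // base ** p) % base
            queues[digit].append(x)
        # Collecting Q0..Q9 in order is stable; stability is what makes
        # the least-significant-first passes yield a sorted result.
        nums = [x for q in queues for x in q]
    return nums

print(radix_sort([805, 312, 64, 8, 3, 126]))  # -> [3, 8, 64, 126, 312, 805]
```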
60.
Radix
• It is very efficient
• Needs external data structures (the k queues)
• Not very good if each element has many digits
Best case: O(k · n) = O(n)
Average case: O(k · n) = O(n)
Worst case: O(k · n) = O(n)
**k is the number of digits of each element (treated as a constant)
61.
External algorithms
• They are based on the principle of DIVIDE AND CONQUER
• They use external memory
▫ The slowest of them is comparable to QuickSort
• They use the same basic principles as internal sorting algorithms
• Different algorithms:
▫ Direct merge
▫ Natural merge
▫ Balanced merge
▫ Polyphase merge
62.
Bibliography
JUAN RAMÓN PÉREZ PÉREZ (2008). Introducción al diseño y análisis de
algoritmos en Java. Issue 50. ISBN: 8469105957, 9788469105955 (Spanish)