Introduction to algorithms: computational complexity - Tech hangout #23 - 2013.04.24

This Wednesday, April 24, Vasyl Nakvasiuk started a series of short talks about algorithms, algorithmic complexity, data structures, kinds of algorithms (sorting, ...), and the like. In the first talk he gave a quick, example-driven introduction to the computational complexity of algorithms and "Big O" notation.

* Tech Hangout is an event organized by developers for developers to exchange knowledge and experience. These meetups are held weekly on Wednesdays from 12:00 to 13:00 and cover exclusively engineering topics. The format is a 30-minute talk on a previously agreed topic, followed by a round-table discussion of the same length.
If you have a tireless drive for new knowledge and professional growth, or you would like to share your own experience, welcome to the Hangout Club!

Join the discussion: https://www.facebook.com/groups/techhangout/
Read us at: http://hangout.innovecs.com/

    Presentation Transcript

    • INTRODUCTION TO ALGORITHMS: COMPUTATIONAL COMPLEXITY
      BY VASYL NAKVASIUK, 2013
    • WHAT IS AN ALGORITHM?
    • WHAT IS AN ALGORITHM?
      Important issues: correctness, elegance and efficiency.
      “An algorithm is a procedure that takes any of the possible input instances and transforms it to the desired output.”
    • EFFICIENCY
      Is this really necessary?
    • CRITERIA OF EFFICIENCY:
      Time complexity
      Space complexity
      Time complexity ≠ Space complexity ≠ Complexity of algorithm
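      The slide only names the two criteria; as a small illustrative sketch (not from the slides), the same membership problem can trade time for space: rescanning a list on every query needs no extra memory, while building a set first costs O(n) extra space but makes each lookup O(1) on average. The function names and data below are made up.

      def contains_scan(items, queries):
          # No extra space: every query rescans the whole list, O(n) time per query.
          return [q in items for q in queries]

      def contains_set(items, queries):
          # O(n) extra space for the set, but each lookup is O(1) on average.
          lookup = set(items)
          return [q in lookup for q in queries]

      data = list(range(100000))
      queries = [5, 99999, -1]
      assert contains_scan(data, queries) == contains_set(data, queries)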
    • HOW CAN WE MEASURE COMPLEXITY?
    • HOW CAN WE MEASURE COMPLEXITY?
      EMPIRICAL ANALYSIS (BENCHMARKS)
      THEORETICAL ANALYSIS (ASYMPTOTIC ANALYSIS)
    • BENCHMARKS
      EMPIRICAL ANALYSIS
    • BENCHMARKS
      VERSION #1
      WHAT DOES “FAST” MEAN?
    • BENCHMARKS
      VERSION #2

      import time

      start = time.time()  # Return the time in seconds since the epoch.
      my_algo(some_input)
      end = time.time()
      print(end - start)

      # Output: 0.048032498359680176
    • BENCHMARKS
      VERSION #3

      import timeit

      # Pass a callable so that timeit runs my_algo itself, rather than timing
      # whatever my_algo(some_input) already returned.
      timeit.timeit(lambda: my_algo(some_input), number=1000)

      # Sample output shown on the slide:
      # 1000 loops, best of 3: 50.3 ms per loop
    • BENCHMARKS
      VERSION #4

      import timeit

      inputs = [1000, 10000, 500000, 1000000]
      for n in inputs:
          timeit.timeit(lambda: my_algo(n), number=1000)

      # Sample output shown on the slide:
      # list of 1000 items:    1000 loops, best of 3: 50.3 ms per loop
      # list of 10000 items:   1000 loops, best of 3: 104.7 ms per loop
      # list of 500000 items:  1000 loops, best of 3: 459.1 ms per loop
      # list of 1000000 items: 1000 loops, best of 3: 3.12 s per loop
    • BENCHMARKS
      VERSION #5
      Same code and sample output as version #4, but with the environment pinned down:

      # Intel Core i7-3970X @ 3.50GHz, RAM 8 GB, Ubuntu 12.10 x64, Python 3.3.0
    • EXPERIMENTAL STUDIES HAVE SEVERAL LIMITATIONS:
      It is necessary to implement and test the algorithm in order to determine its running time.
      Experiments can be done only on a limited set of inputs, and may not be indicative of the running time on inputs not included in the experiment.
      In order to compare two algorithms, the same hardware and software environments should be used.
    • ASYMPTOTIC ANALYSIS
      THEORETICAL ANALYSIS
    • ASYMPTOTIC ANALYSIS
      EFFICIENCY AS A FUNCTION OF INPUT SIZE
      T(n) – running time as a function of n, where n is the size of the input.
      n → ∞
      Random-Access Machine (RAM) model
    • BEST, WORST, AND AVERAGE-CASE COMPLEXITY
      LINEAR SEARCH
      T(n) = n?  T(n) = 1/2 ⋅ n?  T(n) = 1?

      def linear_search(my_item, items):
          for position, item in enumerate(items):
              if my_item == item:
                  return position
    • BEST, WORST, AND AVERAGE-CASE COMPLEXITY
    • BEST, WORST, AND AVERAGE-CASE COMPLEXITY
      LINEAR SEARCH
      Worst case: T(n) = n
      Average case: T(n) = 1/2 ⋅ n
      Best case: T(n) = 1
      T(n) = O(n)

      def linear_search(my_item, items):
          for position, item in enumerate(items):
              if my_item == item:
                  return position
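      A short usage sketch (not part of the slides) showing where the best and worst cases come from; the sample list is made up.

      items = list(range(1000))

      linear_search(0, items)     # best case: the very first element matches, 1 comparison
      linear_search(999, items)   # worst case: only the last element matches, n comparisons
      linear_search(-1, items)    # also worst case: no match, n comparisons, returns None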
    • HOW CAN WE COMPARE TWO FUNCTIONS?
      WE CAN USE ASYMPTOTIC NOTATION
    • ASYMPTOTIC NOTATION
    • THE BIG OH NOTATION
      ASYMPTOTIC UPPER BOUND
      O(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c⋅g(n) for all n ≥ n0}
      T(n) ∈ O(g(n))  or  T(n) = O(g(n))
    • Ω-NOTATION
      ASYMPTOTIC LOWER BOUND
      Ω(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ c⋅g(n) ≤ f(n) for all n ≥ n0}
      T(n) ∈ Ω(g(n))  or  T(n) = Ω(g(n))
    • Θ-NOTATION
      ASYMPTOTIC TIGHT BOUND
      Θ(g(n)) = {f(n): there exist positive constants c1, c2 and n0 such that 0 ≤ c1⋅g(n) ≤ f(n) ≤ c2⋅g(n) for all n ≥ n0}
      T(n) ∈ Θ(g(n))  or  T(n) = Θ(g(n))
    • GRAPHIC EXAMPLES OF THE Θ, O AND Ω NOTATIONS
    • EXAMPLES
      3⋅n² - 100⋅n + 6 = O(n²), because we can choose c = 3 and 3⋅n² > 3⋅n² - 100⋅n + 6 for all n ≥ 1
      100⋅n² - 70⋅n - 1 = O(n²), because we can choose c = 100 and 100⋅n² > 100⋅n² - 70⋅n - 1
      Asymptotically, 3⋅n² - 100⋅n + 6 ≈ 100⋅n² - 70⋅n - 1 (both grow as n²)
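      A tiny numerical check (not from the slides) of the first claim, using c = 3 and n0 = 1; the range of tested n is arbitrary.

      def f(n):
          # f(n) = 3*n^2 - 100*n + 6, the first example from the slide
          return 3 * n ** 2 - 100 * n + 6

      c, n0 = 3, 1
      assert all(f(n) <= c * n ** 2 for n in range(n0, 10000))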
    • LINEAR SEARCH
      LINEAR SEARCH (VILLARRIBA VERSION): T(n) = O(n)
      LINEAR SEARCH (VILLABAJO VERSION): T(n) = O(3⋅n + 2) = O(n)
      Speed of "Villarriba version" ≈ Speed of "Villabajo version"

      def linear_search(my_item, items):
          for position, item in enumerate(items):
              print("position – {0}, item – {1}".format(position, item))
              print("Compare two items.")
              if my_item == item:
                  print("Yeah!!!")
                  print("The end!")
                  return position
    • TYPES OF ORDER
      However, all you really need to understand is that:
      n! ≫ 2ⁿ ≫ n³ ≫ n² ≫ n⋅log(n) ≫ n ≫ log(n) ≫ 1
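      A quick sketch (not from the slides) that evaluates each of these at one modest input size to show how sharply they separate; n = 20 is an arbitrary choice.

      import math

      n = 20  # arbitrary small input size
      growth = [
          ("n!", math.factorial(n)),
          ("2^n", 2 ** n),
          ("n^3", n ** 3),
          ("n^2", n ** 2),
          ("n*log(n)", n * math.log2(n)),
          ("n", n),
          ("log(n)", math.log2(n)),
          ("1", 1),
      ]
      for name, value in growth:
          print("{0:>9}: {1:,.0f}".format(name, value))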
    • THE BIG OH COMPLEXITY FOR DIFFERENT FUNCTIONS
    • GROWTH RATES OF COMMON FUNCTIONS MEASURED IN NANOSECONDS
      Each operation takes one nanosecond (10⁻⁹ seconds), i.e. CPU ≈ 1 GHz.
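      The table from this slide is not in the transcript; as a rough stand-in (an assumption, not slide content), here is a sketch that converts operation counts into wall-clock time at one nanosecond per operation for a couple of arbitrary input sizes.

      import math

      NS = 1e-9  # one operation per nanosecond, i.e. roughly a 1 GHz machine

      def seconds(ops):
          return ops * NS

      for n in (10 ** 3, 10 ** 6):
          print("n =", n)
          print("  log(n)   :", seconds(math.log2(n)), "s")
          print("  n        :", seconds(n), "s")
          print("  n*log(n) :", seconds(n * math.log2(n)), "s")
          print("  n^2      :", seconds(n ** 2), "s")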
    • BINARY SEARCH
      T(n) = O(log(n))

      def binary_search(seq, t):
          low, high = 0, len(seq) - 1      # current search window
          while True:
              if high < low:
                  return -1                # t is not in seq
              m = (low + high) // 2        # integer midpoint (// keeps the index an int)
              if seq[m] < t:
                  low = m + 1
              elif seq[m] > t:
                  high = m - 1
              else:
                  return m
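      A short usage sketch (not in the slides); the sample list is made up, and the input must already be sorted for binary search to work.

      data = [1, 3, 5, 7, 9, 11]

      print(binary_search(data, 7))   # -> 3 (index of 7)
      print(binary_search(data, 4))   # -> -1 (not present)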
    • PRACTICAL USAGE
      ADD DB “INDEX”
      Search with index vs search without index
      Binary search vs linear search
      O(log(n)) vs O(n)
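      An illustrative comparison (not from the slides) of the two access patterns: a linear scan over a list versus a binary search through the standard bisect module on the same sorted data. The list size, target and repetition count are arbitrary.

      import bisect
      import timeit

      data = list(range(10 ** 6))   # sorted list, standing in for an indexed column
      target = 999999               # worst case for the linear scan

      linear = timeit.timeit(lambda: target in data, number=100)
      binary = timeit.timeit(lambda: bisect.bisect_left(data, target), number=100)

      print("linear scan  :", linear, "s for 100 lookups")
      print("binary search:", binary, "s for 100 lookups")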
    • HOW CAN YOU QUICKLY FIND OUT COMPLEXITY?
      O(?)
    • D. E. Knuth, "Big Omicron and Big Omega and Big Theta", SIGACT News, 1976.
      “On the basis of the issues discussed here, I propose that members of SIGACT, and editors of computer science and mathematics journals, adopt the O, Ω and Θ notations as defined above, unless a better alternative can be found reasonably soon.”
    • BENCHMARKS OR ASYMPTOTIC ANALYSIS?
      USE BOTH APPROACHES!
    • SUMMARY
      1. We want to predict the running time of an algorithm.
      2. Summarize all possible inputs with a single “size” parameter n.
      3. There are many problems with the “empirical” approach (measure lots of test cases with various n and then extrapolate).
      4. Prefer the “analytical” approach.
      5. To select the best algorithm, compare their T(n) functions.
      6. To simplify this comparison, “round” the function using asymptotic (“big-O”) notation.
      7. Amazing fact: even though asymptotic complexity analysis makes many simplifying assumptions, it is remarkably useful in practice: if A is O(n³) and B is O(n²), then for large enough inputs B really will be faster than A, no matter how they are implemented.
    • LINKS
      BOOKS:
      “Introduction to Algorithms, Third Edition”, 2009, by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest and Clifford Stein
      “The Algorithm Design Manual, Second Edition”, 2008, by Steven S. Skiena
      OTHER:
      “Algorithms: Design and Analysis” by Tim Roughgarden - https://www.coursera.org/course/algo
      Big-O Algorithm Complexity Cheat Sheet - http://bigocheatsheet.com/
    • THE END
      THANK YOU FOR YOUR ATTENTION!
      Vasyl Nakvasiuk
      Email: vaxxxa@gmail.com
      Twitter: @vaxXxa
      Github: vaxXxa
      THIS PRESENTATION:
      Source: https://github.com/vaxXxa/talks
      Live: http://vaxXxa.github.io/talks