Mescon logarithms

An academic presentation about the base-agnostic approximation of logarithms.

Published in: Education
Transcript

  • 1. Base Agnostic Approximations of Logarithms
    Josh Woody
    University of Evansville
    Presented at MESCON 2011
  • 2. Overview
    Motivation
    Approximation Techniques
    Applications
    Conclusions
  • 3. Motivation
    Big "Oh" notation
    Compares growth of functions
    Common classes are $O(1)$, $O(n)$, $O(n \log n)$, $O(n^2)$, $O(2^n)$
    How does $O(n \log n)$ fit? Compared to $O(n^{1.5})$ or $O(n)$?
    Other authors
    Topic barely addressed in texts
  • 4. Approximation Technique 1
    Integration
    Integrate the log function:
    $F(x) = \int f(x)\,dx = \int \log x\,dx = x \log x - x + C$
    Note that $\log x$ is still present, so the approximation is recursive
    Did not pursue further
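
For reference, the antiderivative quoted on slide 4 follows from integration by parts, a step the slide omits (with $u = \log x$, $dv = dx$):

    $$\int \log x\,dx = x \log x - \int x \cdot \frac{1}{x}\,dx = x \log x - x + C.$$

Since $\log x$ survives in the result, evaluating the approximation would itself require a value of $\log x$, which is why the technique was dropped.
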
  • 5. Approximation Technique 2
    Differentiation
    Differentiate the log function:
    $f'(x) = \frac{1}{x} = x^{-1}$
    What if we twiddle the exponent by ±0.01 and integrate?
    $g(x) = 100x^{0.01} - 100$
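
Slide 5 states $g(x)$ without the intermediate step. A short reconstruction (mine, not the author's, but consistent with the stated result): perturb the exponent of $f'(x) = x^{-1}$ to $x^{-0.99}$ and integrate,

    $$\int x^{-0.99}\,dx = \frac{x^{0.01}}{0.01} + C = 100\,x^{0.01} + C,$$

then choose $C = -100$ so that $g(1) = 0$, matching $\ln 1 = 0$.
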
  • 6. Approximation 2 Results
    Error at x = 50 is ±4.2%
    Error grows with increasing x
    Can be reduced with more significant figures
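
A minimal Python sketch, not part of the original presentation, that checks the approximation numerically; the exact figures depend on how the relative error is measured, so they may not reproduce the slide's ±4.2% exactly:

    import math

    def g(x):
        """Exponent-twiddle approximation of the natural log: g(x) = 100*x**0.01 - 100."""
        return 100.0 * x**0.01 - 100.0

    for x in (10, 50, 1000, 10**6):
        exact = math.log(x)
        approx = g(x)
        rel_err = (approx - exact) / exact * 100.0
        print(f"x = {x:>8}: ln x = {exact:8.4f}, g(x) = {approx:8.4f}, error = {rel_err:+.2f}%")

Running this shows the relative error creeping upward as x grows, consistent with the bullet above.
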
  • 7. Approximation Technique 3
    Taylor Series
    Infinite series
    A reasonable approximation truncates the series
    Argument must be < 1 to converge
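
The slides do not reproduce the series itself. The standard Taylor (Mercator) expansion of the natural log, which appears to be the form intended, is

    $$\ln(1 + x) = \sum_{k=1}^{\infty} \frac{(-1)^{k+1}}{k}\,x^{k} = x - \frac{x^2}{2} + \frac{x^3}{3} - \cdots,$$

which converges only for $-1 < x \le 1$, matching the convergence restriction noted above.
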
  • 8. Approximation 3 Results
    Good approximation, even with only 3 terms
    But the approximation is only valid in a small region
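
A small Python sketch (my illustration, assuming the three-term truncation of the series above) showing that accuracy is good near 0 and degrades toward the edge of the interval of convergence:

    import math

    def log1p_taylor(x, terms=3):
        """Truncated Taylor (Mercator) series for ln(1 + x): x - x^2/2 + x^3/3 - ..."""
        return sum((-1) ** (k + 1) * x**k / k for k in range(1, terms + 1))

    for x in (0.1, 0.5, 0.9):
        exact = math.log1p(x)
        approx = log1p_taylor(x)
        print(f"x = {x}: ln(1+x) = {exact:.6f}, 3-term series = {approx:.6f}")
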
  • 9. Approximation Technique 4
    Chebyshev Polynomial
    Infinite series
    Approximates the "minimax" property
    Peak error is minimized over some interval
    Slightly better convergence than Taylor
  • 10. Approximation 4 Results
    Centered about 0
    Can be shifted
    Really bad approximation outside the region of convergence
    Good approximation inside
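
A Python sketch of the idea using NumPy. Note that this is a least-squares fit in the Chebyshev basis rather than a true minimax approximation, and the degree and interval are my choices for illustration, not the presentation's:

    import numpy as np
    from numpy.polynomial import Chebyshev

    # Fit a degree-5 Chebyshev series to ln(x) on the interval [0.5, 2].
    xs = np.linspace(0.5, 2.0, 200)
    cheb = Chebyshev.fit(xs, np.log(xs), deg=5)

    for x in (0.7, 1.5, 2.0, 10.0):  # 10.0 lies well outside the fitted interval
        print(f"x = {x:>4}: ln x = {np.log(x):8.4f}, Chebyshev fit = {cheb(x):8.4f}")

Inside [0.5, 2] the fit tracks ln x closely; far outside that interval the polynomial bears little relation to the logarithm, which is the behaviour slide 10 describes.
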
  • 11. Conclusions
    Infinite series are not well suited to the task
    Too much error over portions of the number line
    The differentiation attempt is best:
    $g(x) = 100x^{0.01} - 100$
  • 12. Applications
    Suppose two algorithms run in $O(n \log n)$ and $O(n^{1.5})$
    Which is faster?
    Since $\log n = o(n^{0.01})$, we have $n \log n = o(n^{1.01})$, so the $O(n \log n)$ algorithm is asymptotically faster.
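
A quick numerical check of that claim (my sketch, not from the slides): the ratio (n log n) / n^1.5 shrinks toward zero as n grows, so the O(n log n) algorithm wins asymptotically.

    import math

    # Compare the two growth rates at increasing input sizes.
    for n in (10**3, 10**6, 10**9, 10**12):
        ratio = (n * math.log(n)) / n**1.5
        print(f"n = {n:>16,}: (n log n) / n^1.5 = {ratio:.2e}")
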
  • 13. What base is that?
    The base in this presentation is always e
    Base conversion was an insignificant portion of the work
    The change-of-base formula is always sufficient
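
The change-of-base formula referred to is the standard identity

    $$\log_b x = \frac{\ln x}{\ln b}, \qquad \text{e.g.} \quad \log_2 x = \frac{\ln x}{\ln 2} \approx 1.4427\,\ln x,$$

so any approximation of $\ln x$ converts to another base by a constant factor.
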
  • 14. The End
    Slides will be posted on JoshWoody.com tonight
    Questions, Concerns, or Comments?
