Multithreading Presentation
Presentation of multithreading concepts.


Multithreading Presentation Transcript

  • Neeraj Kaushik
    • Why we use threads
    • Synchronization Concepts
    • Using Threads
    • Asynchronous Pattern
    • Parallel Programming
    • Q&A
    • A process is independent and runs within its own isolated boundary, in an address space assigned by the OS; a thread is a subset of a process and shares the process's memory and resources.
    • Context switching between threads in the same process is typically faster than context switching between processes.
    • Processes interact only through system-provided inter-process communication mechanisms (such as remoting, message queues, etc.).
    • Thread th1 = new Thread(new ThreadStart(Run));
    • th1.Start();
    • Thread th1 = new Thread(Run);
    • th1.Start();
    • new Thread(Run).Start();
    • Thread th1 = new Thread(new ParameterizedThreadStart(Run));
    • th1.Start("hello");
    • Thread th1 = new Thread(delegate() { Console.WriteLine("Running in anonymous method"); });
    • th1.Start();
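    • Put together, a minimal runnable sketch of these creation patterns might look like this (the Run and RunWithArg method bodies are placeholders chosen for illustration):

      using System;
      using System.Threading;

      class ThreadCreationDemo
      {
          // Target for ThreadStart (no parameters).
          static void Run()
          {
              Console.WriteLine("Running on thread " + Thread.CurrentThread.ManagedThreadId);
          }

          // Target for ParameterizedThreadStart (single object parameter).
          static void RunWithArg(object arg)
          {
              Console.WriteLine("Got argument: " + arg);
          }

          static void Main()
          {
              // Explicit ThreadStart delegate.
              Thread th1 = new Thread(new ThreadStart(Run));
              th1.Start();

              // Shorter form: the delegate type is inferred from the method group.
              new Thread(Run).Start();

              // Passing data via ParameterizedThreadStart.
              Thread th2 = new Thread(new ParameterizedThreadStart(RunWithArg));
              th2.Start("hello");

              // Anonymous method as the thread body.
              new Thread(delegate () { Console.WriteLine("Running in anonymous method"); }).Start();
          }
      }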
    • Basic
      • Sleep
      • Join
    • Advanced Synchronization
      • Lock
      • Monitor
      • Wait Handle
      • Mutex
      • Semaphore
      • Reader Writer Lock
    • Lock: ensures that only one thread at a time can execute a critical section of code. The lock keyword expects a reference-type object to synchronize on.
      public void Insert(int index, T item)
      {
          lock (locker)
              _list.Insert(index, item);
      }

      public void RemoveAt(int index)
      {
          lock (locker)
              _list.RemoveAt(index);
      }
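    • For context, here is a self-contained sketch of the class these methods could belong to; the ThreadSafeList<T> name and the List<T> backing field are assumptions based on the snippet above:

      using System.Collections.Generic;

      class ThreadSafeList<T>
      {
          private readonly List<T> _list = new List<T>();
          private readonly object locker = new object();   // private object dedicated to locking

          public void Insert(int index, T item)
          {
              lock (locker)
                  _list.Insert(index, item);
          }

          public void RemoveAt(int index)
          {
              lock (locker)
                  _list.RemoveAt(index);
          }
      }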
    • Monitor.Enter(object obj): acquires an exclusive lock on the specified object.
    • Monitor.Exit(object obj): releases the exclusive lock on the specified object.
    • Monitor.TryEnter(object obj): tries to acquire an exclusive lock; returns true if it succeeds and false if it fails. Overloads accept a timeout to wait for the lock.
    • Monitor.Wait(object obj): releases the lock and blocks the current thread until it reacquires the lock on the specified object.
    • Monitor.Pulse(object obj): notifies a thread in the waiting queue of a change in the locked object's state.
    • Monitor.PulseAll(object obj): notifies all threads in the waiting queue of a change in the locked object's state.
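    • A small sketch of Wait and Pulse in action; the shared queue and lock object here are chosen just for illustration:

      using System;
      using System.Collections.Generic;
      using System.Threading;

      class MonitorSignalDemo
      {
          static readonly object locker = new object();
          static readonly Queue<string> queue = new Queue<string>();

          static void Main()
          {
              new Thread(Consume).Start();

              lock (locker)
              {
                  queue.Enqueue("hello");
                  // Wake a thread waiting on this lock object.
                  Monitor.Pulse(locker);
              }
          }

          static void Consume()
          {
              lock (locker)
              {
                  // Release the lock and block until pulsed, then reacquire it.
                  while (queue.Count == 0)
                      Monitor.Wait(locker);

                  Console.WriteLine(queue.Dequeue());
              }
          }
      }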
    • A Mutex ensures that just one thread at a time can access a resource or section of code; unlike lock, it also works for inter-process synchronization.
    • Note: a common use of a Mutex is to allow only one instance of an application to run on a machine.
    Mutex mt = new Mutex(false, "test");
    if (!mt.WaitOne(TimeSpan.FromSeconds(10)))
    {
        Console.WriteLine("Another instance of this application is running");
        return;
    }
    try
    {
        // ... run the application ...
    }
    finally
    {
        mt.ReleaseMutex();   // Release only after a successful WaitOne.
    }
    • It limits the number of threads that can access a resource or pool of resources concurrently. Use the Semaphore class to control access to a pool of resources. Threads enter the semaphore by calling the WaitOne method, which is inherited from the WaitHandle class, and release the semaphore by calling the Release method.
    Semaphore sm = new Semaphore(1, 1);
    sm.WaitOne();
    counter++;
    Console.WriteLine("Counter {0} increased by Thread {1}", counter, Thread.CurrentThread.Name);
    Thread.Sleep(1000);
    sm.Release();
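    • A self-contained sketch of the pooling behaviour, limiting entry to three threads at a time (the counts and sleep duration are arbitrary choices for illustration):

      using System;
      using System.Threading;

      class SemaphoreDemo
      {
          // At most three threads may be inside the protected region at once
          // (initial count 3, maximum count 3).
          static readonly Semaphore pool = new Semaphore(3, 3);

          static void Main()
          {
              for (int i = 1; i <= 6; i++)
                  new Thread(Enter).Start(i);
          }

          static void Enter(object id)
          {
              Console.WriteLine("Thread {0} waiting to enter", id);
              pool.WaitOne();                 // Block until a slot is free.
              Console.WriteLine("Thread {0} is in", id);
              Thread.Sleep(1000);             // Simulate work while holding the slot.
              pool.Release();                 // Free the slot for another thread.
              Console.WriteLine("Thread {0} is leaving", id);
          }
      }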
    • Wait handles are a synchronization mechanism used for signaling. If one task depends on another task, use a wait handle: one thread waits to be signaled, and another thread signals the first thread to resume its work.
    • There are three classes derived from the WaitHandle class, i.e. Mutex, Semaphore and EventWaitHandle. I have already covered the Mutex and Semaphore classes; a small signaling sketch follows the comparison table below.
    A Comparison of Signaling Constructs
    • AutoResetEvent: allows a thread to unblock once when it receives a signal from another. Cross-process: yes. Overhead*: 1000 ns.
    • ManualResetEvent: allows a thread to unblock indefinitely when it receives a signal from another (until reset). Cross-process: yes. Overhead*: 1000 ns.
    • ManualResetEventSlim (4.0): allows a thread to unblock indefinitely when it receives a signal from another (until reset). Overhead*: 40 ns.
    • CountdownEvent: allows a thread to unblock when it receives a predetermined number of signals.
    • Barrier: implements a thread execution barrier. Overhead*: 80 ns.
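    • As a concrete example of signaling with a wait handle, here is a minimal sketch using AutoResetEvent (the setup delay is just for illustration):

      using System;
      using System.Threading;

      class SignalingDemo
      {
          // Starts unsignaled; Set() releases exactly one waiting thread.
          static readonly AutoResetEvent ready = new AutoResetEvent(false);

          static void Main()
          {
              new Thread(Worker).Start();

              Console.WriteLine("Main: doing some setup...");
              Thread.Sleep(1000);

              ready.Set();        // Signal the worker that it may proceed.
          }

          static void Worker()
          {
              Console.WriteLine("Worker: waiting for signal...");
              ready.WaitOne();    // Block until Main calls Set().
              Console.WriteLine("Worker: signaled, resuming work.");
          }
      }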
    • One thread pool per CLR.
    • ThreadPool is shared by all app domains.
    • Saves thread start-up time, since pooled threads are recycled rather than created from scratch
    • Reduces context switching
    • Maintains a cap on the number of active threads
    • There are a number of ways to enter the thread pool:
    • Via the Task Parallel Library (from Framework 4.0)
    • By calling ThreadPool.QueueUserWorkItem
    • Via asynchronous delegates
    • Via BackgroundWorker
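    • A short sketch showing the first two entry points (the Go method is a placeholder for the work to run):

      using System;
      using System.Threading;
      using System.Threading.Tasks;

      class ThreadPoolEntryDemo
      {
          static void Main()
          {
              // 1. Via the Task Parallel Library (Framework 4.0 and later).
              Task t = Task.Factory.StartNew(Go);

              // 2. By calling ThreadPool.QueueUserWorkItem.
              ThreadPool.QueueUserWorkItem(state => Go());

              t.Wait();
              Thread.Sleep(500);   // Crude pause so the queued work item can finish before exit.
          }

          static void Go()
          {
              Console.WriteLine("Running on a pooled thread: " + Thread.CurrentThread.IsThreadPoolThread);
          }
      }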
    • There are a few things to be wary of when using pooled threads:
    • You cannot set the Name of a pooled thread, making debugging more difficult (although you can attach a description when debugging in Visual Studio's Threads window).
    • Pooled threads are always background threads (this is usually not a problem).
    public static void Main()
    {
        try
        {
            new Thread(Go).Start();
        }
        catch (Exception ex)
        {
            // We'll never get here: the exception is thrown on the new thread,
            // not on the thread that called Start().
            Console.WriteLine("Exception!");
        }
    }

    static void Go() { throw null; }   // Throws a NullReferenceException on the worker thread
    • You need an exception handler on all thread entry methods in production applications; a minimal pattern is sketched after this list.
    • The “global” exception handling events for WPF and Windows Forms applications (Application.DispatcherUnhandledException and Application.ThreadException) fire only for exceptions thrown on the main UI thread. You still must handle exceptions on worker threads manually.
    • AppDomain.CurrentDomain.UnhandledException fires on any unhandled exception, but provides no means of preventing the application from shutting down afterward.
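    • A minimal sketch of that pattern: each thread entry method catches its own exceptions (Go is the same placeholder method as above):

      static void Go()
      {
          try
          {
              // ... the thread's work goes here ...
              throw null;    // Simulate a failure (NullReferenceException).
          }
          catch (Exception ex)
          {
              // The exception is caught on the worker thread itself,
              // so it no longer tears down the application.
              Console.WriteLine("Handled on worker thread: " + ex.GetType().Name);
          }
      }

      // Started the same way as before:
      //   new Thread(Go).Start();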
    • ThreadPool.QueueUserWorkItem doesn't provide an easy mechanism for getting return values back from a thread after it has finished executing. Asynchronous delegate invocations (asynchronous delegates for short) solve this.
    • Steps to initiate an asynchronous delegate:
    • 1. Instantiate a delegate targeting the method you want to run in parallel (typically one of the predefined Func delegates).
    • 2. Call BeginInvoke on the delegate, saving its IAsyncResult return value. BeginInvoke returns immediately to the caller. You can then perform other activities while the pooled thread is working.
    • 3. When you need the results, call EndInvoke on the delegate, passing in the saved IAsyncResult object.
    • Here’s how you start a worker task via an asynchronous delegate:
      static void Main()
      {
          Func<string, int> method = Work;
          IAsyncResult cookie = method.BeginInvoke("test", null, null);
          //
          // ... here's where we can do other work in parallel...
          //
          int result = method.EndInvoke(cookie);
          Console.WriteLine("String length is: " + result);
      }

      static int Work(string s) { return s.Length; }