This seminar was presented by Ashish Kumar
A randomized algorithm is an algorithm that uses random bits as an auxiliary input to guide its behavior. It achieves good performance in the "average case": formally, the algorithm's performance is a random variable determined by the random bits, with (hopefully) a good expected value. The "worst case" is typically so unlikely to occur that it can be ignored. The first and foremost reason for using randomized algorithms is simplicity.
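As an illustrative sketch (not drawn from the seminar itself), randomized quicksort is a classic example: choosing the pivot at random makes the expected running time O(n log n) on every input, while the O(n^2) worst case depends only on unlucky random bits rather than on any particular input order.

```python
import random

def randomized_quicksort(arr):
    """Sort a list with quicksort, using a randomly chosen pivot.

    The random bits (pivot choices) are the auxiliary input that
    guides the algorithm's behavior; the running time is a random
    variable with good expected value for every input.
    """
    if len(arr) <= 1:
        return arr
    pivot = random.choice(arr)  # random bits guide the partition
    less = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)

print(randomized_quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # → [1, 1, 2, 3, 4, 5, 6, 9]
```

Note that no adversary can pick a "bad" input in advance, since the worst case is triggered by the random choices, not by the input itself.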