The paper states two necessary conditions for an efficient and successful algorithm: (1) it must not converge while on the slope of the fitness function, and (2) it must be allowed to converge once in the valley. It then presents a simple Gaussian EDA with truncation selection that attempts to counteract premature convergence by enlarging the maximum-likelihood (ML) estimate of the standard deviation by a constant factor k. Finally, it is shown that no constant factor k can satisfy both stated requirements at once; different factors are needed on the slope and in the valley.
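For concreteness, the following is a minimal sketch of such an algorithm in a one-dimensional minimization setting. The function name gaussian_eda, the parameter values, and the two test functions (a linear slope and a quadratic valley) are illustrative assumptions, not taken from the paper; the paper derives the admissible ranges of k analytically rather than by simulation.

```python
import numpy as np

def gaussian_eda(fitness, mu0=0.0, sigma0=1.0, k=1.0,
                 pop_size=100, trunc_ratio=0.3, generations=50, seed=0):
    """Minimal 1-D Gaussian EDA with truncation selection (illustrative sketch).

    Each generation samples pop_size points from N(mu, sigma^2), keeps the
    best trunc_ratio fraction (truncation selection, minimization), refits
    mu and sigma by maximum likelihood, and enlarges sigma by the constant
    factor k, as in the scheme analyzed in the paper.
    """
    rng = np.random.default_rng(seed)
    mu, sigma = mu0, sigma0
    n_sel = max(2, int(trunc_ratio * pop_size))
    for _ in range(generations):
        pop = rng.normal(mu, sigma, pop_size)
        best = pop[np.argsort(fitness(pop))[:n_sel]]  # lowest fitness wins
        mu = best.mean()          # ML estimate of the mean
        sigma = k * best.std()    # ML estimate of sigma, scaled by k
    return mu, sigma

# The two situations corresponding to the paper's two conditions:
slope = lambda x: x          # pure slope: sigma should not shrink here
valley = lambda x: x ** 2    # valley: sigma must be allowed to shrink here

for k in (1.0, 2.0, 4.0):    # illustrative values of the scaling factor
    _, s_slope = gaussian_eda(slope, k=k)
    _, s_valley = gaussian_eda(valley, k=k)
    print(f"k={k}: final sigma on slope={s_slope:.3g}, in valley={s_valley:.3g}")
```

With k = 1 (the plain ML update), sigma shrinks geometrically even on the slope, which is exactly the premature convergence the enlargement is meant to fight; increasing k slows or reverses that shrinkage, but it also slows the contraction needed in the valley. This is the tension that the paper formalizes when it shows the two requirements demand different factors.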