using System;

public class Test
{
    static void Main()
    {
        decimal d = 1.00m;
        Console.WriteLine(d);
    }
}

When I first ran the above (or something similar) I expected it to output just 1 (which is what it would have been on .NET 1.0) - but in fact, the output was 1.00. The decimal type doesn't normalize itself - it remembers how many decimal digits it has (by maintaining the exponent where possible), and on formatting, zero may be counted as a significant digit. I don't know the exact rule for which exponent is chosen (where there's a choice) when two different decimals are multiplied, divided, added etc., but you may find it interesting to play with programs such as the following:

using System;

public class Test
{
    static void Main()
    {
        decimal d = 0.00000000000010000m;
        // The loop body here is a reconstruction - it was lost from the
        // original post. Repeated division shows how the exponent evolves
        // until the value rounds away to zero.
        while (d != 0m)
        {
            Console.WriteLine(d);
            d = d / 5m;
        }
    }
}
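The "remembered digits" live in decimal's scale field, and you can inspect it directly. As a minimal sketch (the class name `ScaleDemo` is mine, not from the original post), `decimal.GetBits` returns four ints: a 96-bit mantissa in the first three, plus a flags word whose bits 16-23 hold the scale, i.e. the power of ten the mantissa is divided by:

```csharp
using System;

public class ScaleDemo
{
    static void Main()
    {
        // 1m is stored as mantissa 1 with scale 0;
        // 1.00m is stored as mantissa 100 with scale 2.
        int[] one = decimal.GetBits(1m);
        int[] oneWithZeros = decimal.GetBits(1.00m);

        // Extract the scale from bits 16-23 of the flags word.
        Console.WriteLine((one[3] >> 16) & 0xFF);          // prints 0
        Console.WriteLine((oneWithZeros[3] >> 16) & 0xFF); // prints 2
    }
}
```

This is why the two values compare equal (`1m == 1.00m` is true) yet format differently: equality compares the numeric value, while `ToString` uses the stored scale.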