Why Do Researchers Make Approximations to the Normal Distribution?

Mathematical models are used as tools to describe reality. These models are supposed to characterize the important features of the analyzed phenomena and provide insight. The normal distribution is an example of a distribution that researchers widely use to model real data.
Researchers often model real observations using the normal distribution, but sometimes the real distribution differs somewhat from the perfect normal distribution. List some reasons why researchers might make approximations to the normal distribution in this way, and describe at least one situation in which researchers should not use this approximation.
When forming your answer to this question, you may give an example of a situation from your own field of interest for which a random variable can serve as a model.

According to Yakir (2011), mathematical models serve as tools to describe reality and to capture the important features of the phenomena being analyzed.
Sure, nothing in real life exactly matches the Normal, but it is surprising how many things come close. This is partly due to the Central Limit Theorem (CLT), which says that if you average enough independent quantities, the distribution of that average approaches the Normal.
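
To make the CLT claim concrete, here is a minimal simulation sketch (not part of the original essay; it assumes Python with NumPy, and the exponential source distribution and the sample sizes are illustrative choices): it averages batches of draws from a skewed distribution and checks that those averages behave like a Normal with the predicted mean and spread.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw from a clearly non-normal source distribution (the exponential is skewed).
n_samples = 10_000   # number of averages to compute
n_per_avg = 50       # how many independent draws go into each average

draws = rng.exponential(scale=1.0, size=(n_samples, n_per_avg))
averages = draws.mean(axis=1)

# The CLT predicts the averages are approximately Normal with
# mean = 1.0 (the exponential's mean) and std = 1.0 / sqrt(n_per_avg).
print("mean of averages:", averages.mean())
print("std of averages: ", averages.std())
print("CLT prediction:  ", 1.0, 1.0 / np.sqrt(n_per_avg))

# Roughly 68% of a Normal lies within one standard deviation of its mean.
within_one_sd = np.mean(np.abs(averages - averages.mean()) < averages.std())
print("fraction within 1 sd:", within_one_sd)
```

Even though a single exponential draw looks nothing like a Normal, the averages line up closely with the CLT prediction, which is the sense in which "many things come close" to the Normal.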
Like classical (Newtonian) mechanics in physics, the Normal distribution in statistics is a special world in which the math is straightforward and all the parts fit together in a way that is easy to understand and interpret. It may not exactly match the real world, but it is close enough that this one simplifying assumption allows you to predict lots of things, and the predictions are often pretty reasonable.
The Normal is also statistically convenient. It is fully described by two parameters that correspond to arguably the most basic statistics there are: the average and the variance (or standard deviation). The average is the most basic statistic there is, and the variance is arguably the second most basic: the average of what is left when you take away the average, raised to the power of two, that is, the average squared deviation from the mean.
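
In symbols (this standard formulation is an addition, not spelled out in the original essay), the two parameters and the density they determine can be written as:

```latex
% The two parameters of the Normal: the mean and the variance
% (the variance is the average squared deviation from the mean).
\mu = \mathbb{E}[X], \qquad
\sigma^{2} = \mathbb{E}\!\left[(X - \mu)^{2}\right]

% The Normal density is determined entirely by these two parameters:
f(x \mid \mu, \sigma^{2}) = \frac{1}{\sqrt{2\pi\sigma^{2}}}
  \exp\!\left( -\frac{(x - \mu)^{2}}{2\sigma^{2}} \right)
```

Knowing just these two numbers therefore pins down the whole distribution, which is a large part of why the Normal is so convenient to work with.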