If IQ scores are normally distributed with a mean of 100 and a standard deviation of 15, what is the mean IQ score after the scores have been standardized by converting them to z-scores?

Answer:

[tex]\mu = 0 , \sigma=1[/tex]

Step-by-step explanation:

Preliminary concepts

The normal distribution is a "probability distribution that is symmetric about the mean, showing that data near the mean are more frequent in occurrence than data far from the mean".

The Z-score is "a numerical measurement used in statistics of a value's relationship to the mean (average) of a group of values, measured in terms of standard deviations from the mean".  

Solution to the problem

Let X be the random variable that represents the IQ scores of the population. For this case we know the distribution of X is given by:

[tex]X \sim N(100,15)[/tex]  

Where [tex]\mu=100[/tex] and [tex]\sigma=15[/tex]

If we standardize the variable with the z-score given by:

[tex]Z = \frac{X - \mu}{\sigma}[/tex]

we obtain a standard normal distribution, [tex]Z \sim N(0,1)[/tex].
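
To see why, apply linearity of expectation, and the fact that dividing by a constant scales the standard deviation by that constant:

[tex]E(Z) = E\left(\frac{X-\mu}{\sigma}\right) = \frac{E(X)-\mu}{\sigma} = \frac{100-100}{15} = 0[/tex]

[tex]SD(Z) = \frac{SD(X)}{\sigma} = \frac{15}{15} = 1[/tex]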

So the standardized scores have parameters [tex]\mu = 0, \sigma = 1[/tex]; in particular, the mean IQ score after converting to z-scores is 0.
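
As an optional numerical check (a minimal sketch using NumPy, not part of the original answer), simulating IQ scores and standardizing them gives a sample mean close to 0 and a sample standard deviation close to 1:

import numpy as np

rng = np.random.default_rng(0)                      # fixed seed so the check is reproducible
iq = rng.normal(loc=100, scale=15, size=1_000_000)  # simulated IQ scores, X ~ N(100, 15)

z = (iq - 100) / 15                                 # standardize: Z = (X - mu) / sigma

print(round(z.mean(), 3))   # approximately 0
print(round(z.std(), 3))    # approximately 1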