Greetings
I am in dire need of help, as I am thoroughly FRUSTRATED with the NORMDIST function and its extended family. I am trying to plot a histogram (or a bell curve) for 1000 possible portfolio returns, so as to show visually that the probability of a 1% loss is no greater than 10%, and hence that the probability of losses greater than 1% is even smaller.
So, I generated a range of random monthly returns running from -0.04 to 0.04. The mean of the monthly returns is 0.009955333 and the SD is 0.01556967. But when I use NORMDIST, the values climb from -0.04 up to the mean of 0.009955333, BUT they then continue climbing beyond that until they reach approximately 99.9999...% at a return of 0.04. I know they SHOULD begin descending upon reaching the mean (which is the peak of the bell curve). If I use 1-NORMDIST instead, I get the correct probabilities for the upper half, but I do not wish to do this manually for 500-odd random returns.
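For reference, here is a minimal sketch (in Python rather than Excel, using my mean and SD; the function names `norm_cdf` and `norm_pdf` are just illustrative) contrasting the cumulative probability, which keeps climbing toward 1 the way my NORMDIST output does, with the density, which peaks at the mean and then descends like a bell curve:

```python
import math

MEAN = 0.009955333
SD = 0.01556967

def norm_cdf(x, mu, sigma):
    # Cumulative probability P(X <= x) -- this climbs monotonically
    # toward 1, which matches the behaviour described above.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def norm_pdf(x, mu, sigma):
    # Normal density -- this rises to a peak at the mean and then
    # descends, giving the bell-curve shape.
    return math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2)) / (
        sigma * math.sqrt(2.0 * math.pi)
    )

# Evaluate both over the range of returns, -0.04 to 0.04
xs = [-0.04 + i * 0.001 for i in range(81)]
cdf = [norm_cdf(x, MEAN, SD) for x in xs]
pdf = [norm_pdf(x, MEAN, SD) for x in xs]

# The cumulative values never turn around; the density does.
print("CDF at 0.04:", cdf[-1])
print("Density peaks near x =", xs[pdf.index(max(pdf))])
```

The cumulative list climbs all the way across the range, while the density list rises to its maximum at the grid point nearest the mean (about 0.01) and falls off on either side.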
Please let me know how to ensure that my probabilities climb and then descend, just as they would if I used values between 0 and 100, or 120 and 300, or any positive range.
Regards