Hey people,

I want to run a Monte Carlo simulation with the following aim: to see the impact of missing a certain number of random days in a market index over 20 years.
I have the daily returns over that period handy, and I would like to run the simulation 10,000 times for each case of missing 5, 10, 15, 20, 25... random days, with the money going into a fixed deposit instead on the missed days. Then I want to find out what my average return and risk would be compared to not missing those days. I've sketched what I mean by a single run a little further down.
I am new to Monte Carlo simulation in Excel and just did a tutorial on a dice simulation. That was easy enough, but it didn't help much with the problem above.
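To spell out what I mean by one run: pick a set number of random trading days, earn a fixed-deposit rate on those days instead of the index return, and compound everything from a base of 100. Roughly this logic (written in Python just to make the idea concrete; the daily fixed-deposit rate of 0.02% is a made-up placeholder):

```python
import numpy as np

def one_run(daily_returns, n_missed, fd_daily_rate=0.0002, rng=None):
    """One simulated path: on n_missed randomly chosen days, earn the
    fixed-deposit rate instead of the index return, then compound from 100."""
    rng = np.random.default_rng() if rng is None else rng
    returns = np.asarray(daily_returns, dtype=float).copy()
    # choose the missed days at random, without replacement,
    # so the same day can't be "missed" twice in one run
    missed = rng.choice(len(returns), size=n_missed, replace=False)
    returns[missed] = fd_daily_rate
    # start at 100 and apply each day's gain/loss
    return 100 * np.prod(1 + returns)
```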

So 20 years is 5,027 rows (weekends are excluded). See the table: the first column contains the dates (each trading day of the index), the second column is the closing price of the index, the third column is the day's gain or loss (based on today's and yesterday's closing values), and the fourth column simply starts at 100 and applies each day's gain/loss.
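In code terms, the table I described would be built roughly like this (the file name and column headers are placeholders; my actual sheet only has the date and closing price):

```python
import pandas as pd

# "index_daily.csv" and the column names are placeholders for my real sheet
df = pd.read_csv("index_daily.csv", parse_dates=["Date"])

# column 3: the day's gain/loss based on today's and yesterday's close
df["Return"] = df["Close"].pct_change().fillna(0)

# column 4: start at 100 and apply each day's gain/loss
df["Value"] = 100 * (1 + df["Return"]).cumprod()
```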
So I need that exact table (which runs to Aug 2017, i.e. 5,027 rows) with random days left out, and then to see what the final value in the fourth column is when a set number of days is randomly removed. That way I can compare the average ending value when 5 random days are removed, simulated 10,000 times, with the ending value when no days are removed.
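Putting it together, the whole comparison I'm after looks roughly like this (again only a sketch: 10,000 runs per miss count, missed days drawn without replacement so a given day can only be skipped once per run, and the fixed-deposit rate is a placeholder):

```python
import numpy as np

def run_experiment(daily_returns, miss_counts=(5, 10, 15, 20, 25),
                   n_sims=10_000, fd_daily_rate=0.0002, seed=1):
    """For each miss count, simulate n_sims paths and report the average
    ending value and its spread versus never missing a day."""
    rng = np.random.default_rng(seed)
    returns = np.asarray(daily_returns, dtype=float)
    baseline = 100 * np.prod(1 + returns)  # ending value with no days missed
    print(f"No days missed: {baseline:.2f}")
    for n_missed in miss_counts:
        endings = np.empty(n_sims)
        for i in range(n_sims):
            r = returns.copy()
            missed = rng.choice(len(r), size=n_missed, replace=False)
            r[missed] = fd_daily_rate  # sit in the fixed deposit on missed days
            endings[i] = 100 * np.prod(1 + r)
        print(f"Missing {n_missed:>2} days: mean {endings.mean():.2f}, "
              f"std dev {endings.std():.2f}")
```

With 5,027 daily returns and 10,000 runs per miss count, that's about 50 million row recalculations per case, which is partly why I'm unsure whether a plain worksheet approach will cope or whether this needs VBA.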

Here is the study I'm trying to duplicate, for context; pages 11 and 16 explain the results and a bit of the methodology respectively. I am using days instead of months (as in the study) because the index I'm looking at doesn't have that long a track record. The rest is identical to the study.

Any help would be amazing; also let me know if it's too much of a project to tackle.