Hi,
Slightly random query, and I'm posting it here rather than on one of the specific forums because it isn't a problem I can't solve; it's a question of how long something like this should typically take.
In short, I have two essentially recursive loops set up in a spreadsheet (two worksheets, with the loops effectively tying the sheets together), and I am using a find-and-replace to trip one of the loops and force it to run some iterations to solve a problem. I know the maths is good, as I've spent long enough on this and hit plenty of problems along the way with things not converging; the numbers now appear to be converging, moving in both directions rather than drifting in one direction as they had previously.
Basically I have ratings for 739 teams and just under 8,000 games to work the iterations through (to derive new ratings for those 739 teams), and I was wondering how long a calculation like this typically takes. I have iterations set to 10,000 and maximum change to 0.001, and I stopped it running after about 3 hours and only 400-450 iterations. I'm wondering if my laptop, a seven-year-old, last-of-the-PowerPC Macs, is too old and cranky to be doing this. Should it normally take over 3 hours for this level of calculation, and how long should something like this take?
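For a sense of scale, here's the kind of iteration I mean, sketched in Python. To be clear, this is only a guess at the style of update (simple margin-versus-opponent averaging) with synthetic data, not my actual maths, so treat it as a rough analogue of the sheet rather than the real thing:

[CODE]
import random

N_TEAMS = 739
N_GAMES = 8000

# Synthetic stand-in data: (home, away, points margin). All made up.
games = []
for _ in range(N_GAMES):
    home, away = random.sample(range(N_TEAMS), 2)
    games.append((home, away, random.gauss(0, 10)))

ratings = [0.0] * N_TEAMS

for iteration in range(10000):          # same iteration cap as the sheet
    sums = [0.0] * N_TEAMS
    counts = [0] * N_TEAMS
    for home, away, margin in games:
        # Each game implies: home's rating is roughly away's rating plus the margin.
        sums[home] += ratings[away] + margin
        counts[home] += 1
        sums[away] += ratings[home] - margin
        counts[away] += 1
    max_change = 0.0
    for t in range(N_TEAMS):
        if counts[t]:
            new = sums[t] / counts[t]
            max_change = max(max_change, abs(new - ratings[t]))
            ratings[t] = new
    # Ratings like these are only defined up to a constant, so re-centre on zero.
    mean = sum(ratings) / N_TEAMS
    ratings = [r - mean for r in ratings]
    if max_change < 0.001:              # same tolerance as the sheet
        print(f"Converged after {iteration + 1} iterations")
        break
[/CODE]

A loop like that gets through hundreds of iterations in seconds, even in plain interpreted Python, which is partly why the hours in the spreadsheet have me wondering whether the bottleneck is the machine itself or the overhead of the sheet recalculating every dependent cell on each pass.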
Any advice or suggestions (or "this will take more than 8 hours" type experiences from past exploits) would be greatly appreciated.