Please bear with me as this is a rather general question:
I have a somewhat large workbook which is duplicated many times with different data.
I have a "summary" workbook which has data linked in from all those others. It contains a list of all the data files.
I have an "update" process in the "summary" workbook which goes through the list of data files in the "summary" and runs them one at a time.
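For context, a minimal sketch of the kind of update loop described above (the names `UpdateAllDataFiles` and the named range "FileList" are my assumptions, not the actual code):

```vba
Sub UpdateAllDataFiles()
    ' Hypothetical sketch of the batch loop in the "summary" workbook.
    Dim cell As Range
    Dim wbData As Workbook

    ' "FileList" is an assumed named range holding the full path
    ' of each data file, one per cell.
    For Each cell In ThisWorkbook.Names("FileList").RefersToRange.Cells
        Set wbData = Workbooks.Open(cell.Value, UpdateLinks:=0)
        ' ... per-file update (recalculation, Solver runs, etc.) ...
        wbData.Close SaveChanges:=True
    Next cell
End Sub
```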
I believe it's fair to say that the code is "proven" to a large degree.
The process works ... almost:
There are over 200 "data" files, and with all options turned on, each one involves 2 or 3 runs of Solver. So it takes 2-3 minutes per file to execute.
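For reference, each Solver pass is run unattended along these lines (a sketch only; it assumes the Solver add-in is installed and a VBA reference to SOLVER.XLAM is set, and the cell addresses are made up for illustration):

```vba
Sub RunOneSolverPass()
    ' Clear any previous Solver settings in the active sheet.
    SolverReset
    ' Hypothetical model: minimise cell B1 (MaxMinVal:=2 = minimise)
    ' by changing cells A1:A10.
    SolverOk SetCell:="$B$1", MaxMinVal:=2, ByChange:="$A$1:$A$10"
    ' UserFinish:=True suppresses the results dialog so the batch
    ' isn't paused waiting for a click.
    SolverSolve UserFinish:=True
End Sub
```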
This worked overnight for nearly all the files until it stopped with "Code execution has been interrupted".
Now I can only continue the process by clicking "Continue" 5 or 6 times per file.
The code lines where it stops aren't always identical; the statements it halts on vary from run to run.
Over the development of this, I have done these things:
- Turned off all COM addins
- Selectively (but perhaps not as intelligently as you might) added DoEvents statements
Both of these seemed to help but the problem is obviously not entirely solved.
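To illustrate the second point, here is roughly how the DoEvents statements were interspersed (the procedure name and steps are hypothetical; the idea is simply to yield before and after each long-running step):

```vba
Sub ProcessDataFile(wb As Workbook)
    ' Yield to Windows so pending messages are processed rather
    ' than piling up during a long-running step.
    DoEvents
    wb.RefreshAll           ' example long-running step
    DoEvents
    ' ... Solver runs would go here ...
    DoEvents
End Sub
```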
So, I wonder if there are other things along these lines that I should consider in order to fix this annoying, intermittent problem?
The computer has 16GB of RAM. In the past (but not now) I've seen rather huge amounts of memory used up, as if there were a memory leak; that doesn't seem to be happening any more. I mention it for completeness.
Also, it appears that the process first "broke" when one of the files returned unexpected results. Why that should affect the processing of subsequent files is a mystery to me.