I have a macro that runs through a month's worth of reports that are exported from the company system into Excel. The macro opens the first report, gathers the necessary data, and enters it into a table in database format. It then closes the report, opens the next one, and repeats until all of the month's reports have been converted.
The problem is that the reports are fairly large (they are meant to be read by the human eye, so they have loads and loads of repeated strings throughout). By the end of the month's reports, the computer has become very sluggish. It seems that even though I close each report, some part of it is being held in memory. Is there a way to clear out that memory after the report is closed? The files are opened as read-only.
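The open/close cycle looks roughly like this (the folder path is a dummy, and I've left out the actual data gathering):

```vba
Dim wb As Workbook
Dim fileName As String

fileName = Dir("C:\Reports\*.xls")                 ' dummy folder
Do While fileName <> ""
    Set wb = Workbooks.Open(Filename:="C:\Reports\" & fileName, ReadOnly:=True)
    ' ... gather the data from wb into the database table ...
    wb.Close SaveChanges:=False                    ' never save the read-only report
    Set wb = Nothing                               ' release the reference
    fileName = Dir                                 ' next file in the folder
Loop
```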
I am using a For Each loop that cycles through all the cells in column A looking for certain criteria and, if a cell matches, copies the corresponding data.
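In simplified form, that loop looks something like this (the criteria, ranges, and copied columns are all dummied):

```vba
Sub ScanReport(wbReport As Workbook, wsData As Worksheet)
    Dim cell As Range
    Dim nextRow As Long

    ' first empty row in the database table
    nextRow = wsData.Cells(wsData.Rows.Count, 1).End(xlUp).Row + 1

    For Each cell In wbReport.Sheets(1).Range("A1:A50000").Cells
        If cell.Value Like "ACCOUNT*" Then         ' dummy criteria
            wsData.Cells(nextRow, 1).Value = cell.Value
            wsData.Cells(nextRow, 2).Value = cell.Offset(0, 1).Value
            nextRow = nextRow + 1
        End If
    Next cell
End Sub
```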
I would normally upload an example workbook, but I wouldn't even know where to begin to dummy it down.
Here is the code in question, reduced to its basic shape (the wrapper name is made up, but CIFExtract and DDATrialExtract are the real subs; their bodies are far too long to post):
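```vba
Sub MonthEndImport()
    Call CIFExtract        ' finishes in a reasonable time
    Call DDATrialExtract   ' this is where everything crawls
End Sub
```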
The problem is that by the time it gets to Call DDATrialExtract, the PC is so slow that it takes about an hour to run through DDATrialExtract. If I instead run CIFExtract, save, close out of Excel, reopen the workbook, and then run DDATrialExtract, it takes only a few minutes.
Is there any way to reduce the memory usage so that this isn't such a memory hog?