Hi there, I run data analysis on many datasets (date, data) of between 5,000 and 30,000 rows with multiple columns. Recently I've put this into VBA to compile summary tables - currently just a 90x90 grid of results (totals, averages, correlations etc.) drawn from 90 data tables, each in its own file.
The VBA iterates through each of the 90 files, analysing data within a certain time period and extracting the results in table and chart form into a separate summary workbook. The resulting summary file is around 100MB, with 90 tabs - one for each file analysed. The script closes each file it accesses during execution, so that no more than 3 files are ever open, and it ends with only 1 file open (the one containing the VBA script), which is only around 16MB.
The problem is that I can only run the VBA once successfully and save the 100MB result. If I run it again for a different set of analysis, Excel runs out of system resources partway through. The current script uses copy/paste and opens/closes all 90 files on each execution. If I save, exit Excel and start a new Excel session, I can run it again fine.
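For reference, this is roughly what the per-file loop in my script looks like - a simplified sketch with placeholder paths, sheet names and ranges rather than the real ones:

Sub BuildSummary()
    Dim summaryWB As Workbook
    Dim dataWB As Workbook
    Dim i As Long

    Set summaryWB = ThisWorkbook          ' workbook holding this script and the 90 summary tabs

    For i = 1 To 90
        ' Placeholder path/naming pattern - the real files are named differently
        Set dataWB = Workbooks.Open("C:\Data\Table" & i & ".xlsx", ReadOnly:=True)

        ' Copy the results for the chosen time period into the i-th summary tab (clipboard copy/paste)
        dataWB.Sheets(1).Range("A1:D100").Copy _
            Destination:=summaryWB.Sheets(i).Range("A1")

        dataWB.Close SaveChanges:=False   ' close each source file before opening the next
        Set dataWB = Nothing              ' release the object reference
    Next i
End Sub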
Why do I have resource issues? Task Manager says memory use never goes above 4GB, and the system pagefile is fixed at the same size as physical memory (16,384MB).
The system is a five-year-old dual Xeon workstation with 16GB RAM running Vista x64, with all Windows and Office updates current. I'm thinking I either have a hardware problem or some other software issue.
Could it be a memory allocation issue, or should I consider hardware faults? Or is it something simpler?
TIA, euc