
What can I do to reduce filesize on a MONSTER spreadsheet?

  1. #1
    Registered User
    Join Date
    08-30-2006
    Posts
    40

    What can I do to reduce filesize on a MONSTER spreadsheet?

    Hello, I hope someone out there can help me with this problem. I'm working on a fully automated, very detailed report for work. Using VBA, the report accesses our mainframe, compiles a list of several thousand records, sorts them across 12 different tabs in a spreadsheet, identifies their revenue, and ranks their importance based on 10 guidelines I provided. It then builds 2 pivot tables and 1 pie chart per tab, a summary tab that provides totals for all the other tabs, and a weekly trending tab that creates line graphs of each tab's history. In addition, there's a "control" tab that uses an ActiveX control to start the whole thing. The user clicks the button on the control tab, waits around 30 minutes for the report to pull everything off the mainframe and compile/sort/calculate it all, and ends up with a completed report.

    The problem is that this file is now 52 MB, and just having it open kills system performance on most machines, never mind actually RUNNING the thing. Does anyone have any ideas to help improve performance? Here's a list of what we're working with:

    - 15 tabs:
      - 1 control tab
      - 12 department tabs
      - 1 summary tab
      - 1 weekly trending tab
    - 24 pivot tables (2 on each department tab)
    - 12 pie charts (1 on each department tab)
    - 1 line graph (weekly trending tab)
    - 1 ActiveX button that loads the mainframe data into memory and triggers the VBA code that performs all the other functions
    - 2 modules containing the VBA routines I wrote to sort everything out
    - an average of about 3,000-4,000 rows of data spread across the 12 department tabs

    I can't get rid of any of the tabs, pivots, or charts... so about the only things I can think of that might help are moving all the VBA code into the control tab's worksheet object and deleting modules 1 & 2, or else deleting the control tab, putting all the code into module 1, and instructing the users to start the macro from the Tools menu instead of clicking a button. Will either of those ideas help much?
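
    Just to make sure I'm describing that second option clearly, here's roughly the shape it would take (the names below are made up for illustration, not my actual code): the whole routine lives in a standard module so users can run it from Tools > Macro > Macros, and the button, if it stays, does nothing but call it.

        ' In Module1: hypothetical entry point, runnable from Tools > Macro > Macros.
        ' "BuildReport" is a made-up name, not the actual procedure.
        Public Sub BuildReport()
            Application.ScreenUpdating = False
            Application.Calculation = xlCalculationManual

            ' ... pull the mainframe data, sort it into the 12 department tabs,
            ' build the pivots, pie charts and trending graph here ...

            Application.Calculation = xlCalculationAutomatic
            Application.ScreenUpdating = True
        End Sub

        ' In the control tab's worksheet module, if the button stays:
        Private Sub cmdRunReport_Click()    ' hypothetical button name
            BuildReport
        End Sub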

    Any other ideas to clean this up? 52 MB is just not feasible for this beast.

  2. #2
    Forum Contributor WinteE's Avatar
    Join Date
    04-07-2007
    Location
    Netherlands
    Posts
    544
    Bill-E-BoB,

    3,000-4,000 rows of data shouldn't give you a 52 MB file.

    Using lots of functions is often the reason for a very large file, but from what you wrote that doesn't seem to be the case here.

    Erik
    Just keep it simple !


    http://www.excelguide.eu
    In English as well as in Dutch

  3. #3
    Forum Expert dominicb's Avatar
    Join Date
    01-25-2005
    Location
    Lancashire, England
    MS-Off Ver
    MS Office 2000, 2003, 2007 & 2016 365
    Posts
    4,867


    Good morning Bill-E-BoB

    Create a copy of the file, and try this on the copy:

    Download Rob Bovey's VBA Code Cleaner add-in and use it to see whether it cleans up your macro code:
    http://www.apps-pro.com/Utilities/CodeCleaner.htm

    You could also try downloading my add-in via the link below. It has three optimisation routines; I would recommend trying Ultimate > Optimise > Optimise Worksheets to see how that affects the file. Don't delete the original until you are sure that all the components are still present.

    Is this a shared workbook BTW ... ?

    HTH

    DominicB
    Please familiarise yourself with the rules before posting. You can find them here.

  4. #4
    Registered User
    Join Date
    08-30-2006
    Posts
    40
    Yes, it's shared. If we can get the file size down below 10 MB, then a read-only copy will be distributed by email, but until then it sits on a network drive shared between all users with read-only privileges; write access is limited with password protection.

    I'm going to try the optimization you recommended in just a few minutes (tight deadline on another project right now) and I'll let you know how it goes.

  5. #5
    Registered User
    Join Date
    08-30-2006
    Posts
    40
    Quote Originally Posted by WinteE
    Bill-E-BoB,

    3,000-4,000 rows of data shouldn't give you a 52 MB file.

    Using lots of functions is often the reason for a very large file, but from what you wrote that doesn't seem to be the case here.

    Erik
    By functions, are you referring to formulas I may have used in cells, the pivots & charts Excel calculates, or the VBA code that I wrote? As far as formulas go, the stats summary page does have 12 COUNTIFs, 24 SUMIFs, and 3 SUMs. Other than those 39 formulas, everything in the spreadsheet is either plain text, a pivot table, or a chart. The VBA scripts are bigger, but still not huge: 262 lines in the control tab's worksheet module and 548 in module 1, which contains the main macro, and that's it. There are spots where the same 40 lines of code loop thousands of times, but the code itself is very short.
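
    Just to show the shape of them (the sheet and column references here are made up, not the real ones), each department's row on the summary has one COUNTIF and two SUMIFs, something like:

        =COUNTIF(Dept01!$D:$D,"Active")
        =SUMIF(Dept01!$D:$D,"Active",Dept01!$E:$E)
        =SUMIF(Dept01!$D:$D,"Active",Dept01!$F:$F)

    and the three SUMs just total those columns down the summary, e.g. =SUM(B2:B13).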

  6. #6
    Forum Contributor harrywaldron's Avatar
    Join Date
    05-24-2007
    Location
    Roanoke, VA
    MS-Off Ver
    Office Professional 2010 BETA
    Posts
    169
    Hi Bill - Yes, a 52 MB Excel document is far too large for day-to-day use. Some ideas to consider, which mostly amount to going back to the drawing board:

    1. If possible, simplify where you can; I suspect the generated graphic tabs are the culprit eating up precious space. You might try opening a blank Excel document and copying tab by tab to determine which are the "heavy hitters" (see the sketch after this list for a quicker way to spot them). A picture is worth a thousand words, so graphics are nice, but they are going to require a lot of space in the workbook (far more than the data or formulas). For example, copy the workbook to a new test file name, then go in and delete all the graphical tabs... I bet the test version will open much faster for you.

    2. Survey the users and eliminate any tabs you can from the existing workbook. For example, if the graphic tabs are only "nice to have", you might consider putting them in a completely separate workbook or porting them to PDF-style outputs.

    3. Even some of the pivot table outputs might be logically split out into separate workbooks.

    4. So that everything stays logically linked, you might zip the parts into a single archive so users still get everything bundled together. Look for ways to keep giving the users everything they need, just not in one bundle that overwhelms what can be done efficiently on a small PC workstation.
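
    As a quicker alternative to the copy-and-delete experiment in #1, a rough sketch like the following (generic code; run it against a COPY of the workbook) lists each sheet's used range so the "heavy hitters" stand out; a tab reporting tens of thousands of rows when it only holds a few hundred records is a red flag:

        ' List each worksheet's used range in the Immediate window (Ctrl+G).
        Sub ListUsedRanges()
            Dim ws As Worksheet
            For Each ws In ThisWorkbook.Worksheets
                Debug.Print ws.Name, ws.UsedRange.Address, _
                            ws.UsedRange.Rows.Count & " rows x " & _
                            ws.UsedRange.Columns.Count & " cols"
            Next ws
        End Sub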

    Good luck, and I hope some of these ideas help.
    Last edited by harrywaldron; 06-11-2007 at 03:15 PM.

  7. #7
    Registered User
    Join Date
    08-30-2006
    Posts
    40
    I fixed it. I'm not sure exactly where the problem lay, but I just deleted every blank row on each sheet so I knew for a fact there wasn't an invisible character like a space or hard return in there somewhere making Excel think there were an extra 65,000 rows. Once I did that, the file size went from 51 MB down to 3 MB. MUCH BETTER! Thanks for the advice guys!!
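
    For anyone who runs into the same thing later, the clean-up I did by hand amounts to something like this generic sketch (not my exact code; it just deletes everything below the last real entry on each sheet so Excel rebuilds its used range, then saves):

        Sub TrimExcessRows()
            Dim ws As Worksheet, lastCell As Range
            For Each ws In ThisWorkbook.Worksheets
                ' Find the last cell that actually contains something.
                Set lastCell = ws.Cells.Find(What:="*", LookIn:=xlFormulas, _
                                             SearchOrder:=xlByRows, _
                                             SearchDirection:=xlPrevious)
                If Not lastCell Is Nothing Then
                    If lastCell.Row < ws.Rows.Count Then
                        ' Delete every row below the last real entry.
                        ws.Rows(lastCell.Row + 1 & ":" & ws.Rows.Count).Delete
                    End If
                End If
            Next ws
            ThisWorkbook.Save
        End Sub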

  8. #8
    Forum Contributor harrywaldron's Avatar
    Join Date
    05-24-2007
    Location
    Roanoke, VA
    MS-Off Ver
    Office Professional 2010 BETA
    Posts
    169


    Excellent job... Yes, I've run into a similar thing, where folks at work have highlighted whole rows or cells and propagated colors or text well outside the normal range of the data. You may also want to test the mainframe creation process to ensure that rows beyond the normal range aren't being sent out.
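
    One quick sanity check after each run is to look at the last cell Excel is tracking on a department sheet; it should sit right at the end of the real data. Something like this (the sheet name is just a placeholder):

        ' Prints the bottom-right cell of the sheet's tracked range
        ' in the Immediate window (Ctrl+G).
        Sub CheckLastCell()
            Debug.Print Worksheets("Dept01").Cells _
                        .SpecialCells(xlCellTypeLastCell).Address
        End Sub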
