# Off Topic > The Water Cooler > Large Excel systems - your experiences

## Jacc

Hello

When reading through posts here it seems to be quite common to have multiple workbooks linked together, either with formulas or "joined" by some macro setup.

I have twice been asked to do "larger" (by my standards, anyway  :Smilie:  ) applications at work. This means multiple sheets, a few buttons here and there, generating PDFs etc. About 20 pages of VBA code (pasted into Word, no extra line spacing).
As long as it works, I am asked to add to and expand this application frequently. I have several times pointed out that I see where this is going and that we need to buy a professional system. Most people agree, more or less, but it is a costly investment, hence it is denied or postponed to an indefinite future. Sound familiar?

I want to know what experiences you have from "large" Excel systems. Everything from stories of great success to complete trainwrecks!

I'm guessing that many systems start out as something small, well suited to Excel. If this works well, it slowly grows over time, more features and functionality are added, and eventually you have a complete database or some part of a business system, spread out over multiple Excel workbooks, linked together or working together through VBA.

----------


## CK76

I have in the past created a database-like system using Excel and VBA. However, it required quite a bit of manual maintenance (creating a data entry workbook for each new employee, weekly validation of files, etc.).

I eventually did a presentation on a more robust system using MySQL running on a virtual machine and was able to convince upper management that it was a good investment (after spending a few months doing the legwork to get key players' buy-in).

You'll often need to draw up a full plan, including a succession plan, implementation cycle and cost analysis, along with a business continuity plan, to convince the higher-ups.

----------


## Mvaldesi

Mine wasn’t necessarily a _large_ Excel system, just an extremely convoluted one. I work in the supply chain of a major retailer, and several years ago I was tasked with reporting out an 8-week forecast of inbound volume. This forecast was developed nearly 10 YEARS before it fell in my lap, and by that time, everyone involved in its development had retired, been fired or just left the company. It’s one unassuming Excel workbook, but inside, one finds dozens upon dozens of tabs, each linked to one another by myriad VLOOKUPs, references, calculations, conditional formatting, etc., basically any function you can think of. I’m all about keeping everything simple and manageable, and this thing was (is) a house of cards; delete or insert one row or change one number, and the whole thing collapses into a mess of #REF! and #N/A errors. I transitioned it just last year to another unfortunate soul with the very stern message: “Don’t. Touch. ANYTHING.”

----------


## Jacc

These are among the stories I was hoping to hear. More, more!  :Smilie: 

CK76, was Access ever a contender for replacing your setup?

Mvaldesi, that sounds like a nightmare!  I assume it worked?

To be fair, old, freaky, orphaned, undocumented systems are not limited to Excel. As I understand it, a lot of code written in low-level languages during the '70s and '80s is still running today.

----------


## Mvaldesi

Sure, the forecast still works exactly as it has for over a decade and a half! The only change we made, about 5 years ago, was automating the data collection piece using 4-5 Access databases (we used to have to wait for emails and manual reporting from the field to assemble the inputs), but the final report is still this 15-year-old Excel 2003 workbook where you can spend 20 minutes chasing references across dozens of sheets until you find that someone typed text into a field that Excel uses in a calculation 14 worksheets upstream. A replacement is ideal, but will require a TON of development and no one has the time, so we’re stuck with this Rube Goldberg machine of a workbook…

----------


## scottiex

We have some fairly complex setups, for example our version of Mvaldesi's sheet, which is coincidentally one of the few major projects I didn't write (although I do largely understand it).

It hasn't really become much of an issue for us, and the company really likes being able to request a change for some small user-based reason and get a result very quickly, as opposed to having to go to an external provider or whatever. The downside of this sort of process is that it sometimes offends my sensibilities by ending up a bit overly complex, with all sorts of not entirely necessary user customization and cosmetic features.

To some extent there is an issue with large Excel files and demanding macros using up system resources, resulting in some user issues that might occur less often if we were using another program, but it is pretty manageable.

----------


## scottiex

We used to do a bit of stuff in Access because a guy who did part of my job before me was an "Access man". But that all got removed, as it sat in a bit of a no man's land between Excel and the main IT systems we use to run the company.

----------


## CK76

> CK76, was Access ever a contender for replacing your setup?



Not in my instance. One of the reasons for moving to MySQL was to deploy it across the network and allow remote users.
I've used Access as a front end for MySQL in the past, though, and it has worked well.

----------


## Jacc

> It hasn't really become much of an issue for us, and the company really likes being able to request a change for some small user-based reason and get a result very quickly, as opposed to having to go to an external provider or whatever.



Good point. My boss will just stick his head in the doorway to my office with some request. On a good day, I can have it done by the next morning. This is indeed a big advantage.






> We used to do a bit of stuff in Access because a guy who did part of my job before me was an "Access man". But that all got removed, as it sat in a bit of a no man's land between Excel and the main IT systems we use to run the company.



Interesting. I don't think that would be an issue in my position. We are a small group, so I could probably put an Access database on the server without anyone asking about it. Except I'd have to learn Access first.  :Smilie:

----------


## scottiex

> Good point. My boss will just stick his head in the doorway to my office with some request. On a good day, I can have it done by the next morning. This is indeed a big advantage.



Also, there are numerous people in my company who could in theory make a minor change if it was urgent, despite it not really being their job.





> Interesting. I don't think that would be an issue in my position. We are a small group, so I could probably put an Access database on the server without anyone asking about it. Except I'd have to learn Access first.



Yeah, for me there is great diversity in skill at the other end of the scale from my comment above...
1) not everyone would be comfortable using it (they might panic at the idea of opening a program that isn't Excel).
2) it doesn't give them the customizable user interface that would allow them to highlight a cell or something.

----------


## Trevor_S

I always use Excel where possible; you can do (almost!) anything with it.  At a previous employer I wrote a procurement system to work with our existing Oracle Financials.  Users could raise orders from catalogues in it, there was a workflow process for authorisations, and a file was generated for upload into Oracle GL to set up commitments.  A report from Oracle AP provided a file to feed into my process, update the records, and generate a file for the GL to reverse the commitments.  Reporting, etc. could all be built in.

The downside is succession planning!  The more complicated the process, the more difficult it will be for anyone to just pick up, which could be a problem when the original author leaves, whereas knowledge of an "off-the-shelf" system will be easier to find.  So if you're going to use Excel to that extent, you really need to have at least a couple of people with a good understanding of what's happening "behind the scenes", as well as detailed, well-maintained documentation.

----------


## Arkadi

In my first experience with VBA, I developed a production application that ran as a stand-alone spreadsheet, which did what it needed to, but only within the context of that single production run.  Later, when reports were needed to get info on the bigger picture, I had to get the data from all the existing saved workbooks in various folders... NEVER AGAIN!
I now never use sheets to store data permanently; they are only front-end applications, and all data is stored in and retrieved from Access databases.  In a few instances those are reaching their size limit, and I am seriously considering migrating to MySQL, which has both a larger size limit and is much more efficient in terms of how much space the stored data takes up.

----------


## Trevor_S

Following up on a point by Arkadi, I'd agree that it's worth separating the application/user interface file from the data file(s).  Use macros (rather than formula links) to get data from one to the other.  I did this in my example mainly because it allowed more people to use it simultaneously - the user interface spreadsheet was set as read-only, and it opened/closed data files only for the brief second or so that it needed them.  But it also gave a speed advantage, as you weren't continually working with a huge spreadsheet.  If you want to "lock" an individual record in a data file, simply create a marker field, which can be set when a user retrieves the record to work on and cleared when their updates are done.  If it contains the user name, then any subsequent user trying to access the record can be notified of who has it open.
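The marker-field idea in the last paragraph can be sketched in VBA roughly as follows. The column position, sheet layout and procedure names are all invented for illustration, and a real implementation would also need to save the data workbook between calls so other users actually see the marker:

```vba
' Sketch of record locking via a marker field in a data workbook.
' Assumes each record is one row, with a "LockedBy" column (here column J)
' that is empty when the record is free. All names are hypothetical.
Function TryLockRecord(ws As Worksheet, recordRow As Long) As Boolean
    Const LOCK_COL As Long = 10                   ' the marker field
    If ws.Cells(recordRow, LOCK_COL).Value = "" Then
        ws.Cells(recordRow, LOCK_COL).Value = Environ$("USERNAME")
        TryLockRecord = True                      ' we now hold the record
    Else
        MsgBox "Record in use by " & ws.Cells(recordRow, LOCK_COL).Value
        TryLockRecord = False
    End If
End Function

Sub ReleaseRecord(ws As Worksheet, recordRow As Long)
    ws.Cells(recordRow, 10).Value = ""            ' clear the marker
End Sub
```

The attraction of the marker-field approach over Excel's own file locking is exactly what Trevor_S describes: the file is only open for a second, so the lock lives in the data rather than in the file handle.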

I used a combination of Excel spreadsheets and text files (produced using VBA output/write functionality) - however, this was mainly because my employer didn't have Access, and I had no experience with it anyway.  If you do have it as an option, it's well worth exploring.

----------


## Kyle123

I despise the type of "large Excel systems" discussed here; I've seen far too many of them where Excel is simply the wrong tool. They're usually of a questionable standard, containing loads of VBA with no thought-out structure or architecture that simply gets bodged to add on the latest requirement. They inevitably end up as business-critical systems that are passed from person to person, each of whom adds more bodge without really understanding how most of it functions.

I understand how they come about, usually out of necessity by people with little knowledge who just "need something simple" that then spirals out of control. I've got a good deal of sympathy with this; I have a natural inclination to improve processes and get frustrated by inefficiency, but it doesn't mean that these solutions are good.

Access isn't much better, but at least it can be split and the backend migrated onto SQL Server. 

I think that most of the issues stem from only needing something "simple" and then building on it as scope increases, rather than stepping back and looking for a better solution when requirements grow. This can happen with other bespoke software too, but it's more chronic and debilitating in Excel due to the low barrier to entry. The tools don't help much either; source control is particularly troublesome without third-party add-ins (though in all honesty I can't say I've ever once encountered someone using source control on VBA in the wild - though I'm sure some people must).

----------


## Kyle123

In terms of train wrecks, I've had to find a solution for a "system" that consisted of over 175,000 (yes, that's not a typo) workbooks, where someone left and they wanted it to work multi-site.

----------


## Kyle123

In the context of the above, I'd always advocate SQL Server over Access for data storage; the Express version is free. I've even written a post on it somewhere on the forum.

----------


## scottiex

One of the things I have done is to keep a central collection of macros that allow various things to work, so that I don't need to roll out new versions and get all the end users to go and find them and update (although some changes are best addressed by releasing a new end version). 

I used to keep pretty detailed version notes, but I've never really had much use for them; I'm not sure what value they would add if they were a large number of changes ago. I do keep a lot of saves of old copies with dates on them, though, in case I need to go back a few versions (which is also pretty rare).

----------


## Kyle123

Hi Scottiex, I think you're getting mixed up between version control and source control

----------


## Doc.AElstein

Hi,

I have to say at the outset that I am no professional and only work on a very large, ever-increasing personal project, involving massive data files that keep growing in size. My ignorance obviously shows sometimes in the solutions I have. 
I started with Excel formulas, then moved on to Excel VBA. 

I am open-minded and spent a long time and a lot of effort investigating the whole SQL, ADO, database, text data file etc. alternatives.** I got working solutions, but in the end I was not impressed with the performance improvements (if any, sometimes) for me, and there were often very annoying problems that I can see would be responsible for keeping someone in full-time employment to keep on top of them in “_support_”.
I don’t have the luxury of being paid to spend a lot of time “_maintaining_” the stuff. I need something that just always works and is simple to troubleshoot on the odd occasion when something might go wrong. 

My final solution*** is two main files: 
_ one main macro file with most of the controlling macros in it, 
_ the other a main “working data file”, which has a combination of permanent data and very efficient search, data import and merge codes (in it and in the main macro file) which use closed workbook references (in VBA) to get at extra data in a 
_ number of further large Excel files which simply hold massive amounts of data (these large extra data files are usually kept closed). ***
(It is worth noting that in an Excel file using closed workbook references, the last complete set of data is actually held in that Excel file anyway, so you still have it there should the current data file not be available at any future time….)
It seems to work a lot better than any other way I experimented with. My one annoyance is that the _*VBA Evaluate("") function*_ does not work on closed workbook references. I cannot seem to get any answer as to why that is, other than that Microsoft wired it that way. Pessimistically, I would say they did that because if it did work, a lot of the solutions I use would be that little bit faster and probably make a lot of other Access database stuff unnecessary. (And it would be nice if Microsoft cured the memory leak bug I discovered in the Match function, which prevents it being used in sorting/merging codes working on large amounts of data. Again, I could pessimistically say they did that on purpose, as I usually found the versions of my sorting/merging codes using Match worked by far the quickest, until the bug kicked in. ****)
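As an aside, for single cells there is a well-known VBA workaround for the closed-workbook limitation Alan runs into: Evaluate() refuses a reference to a closed file, but the old XLM bridge ExecuteExcel4Macro does not. A minimal sketch, with an invented path and sheet name:

```vba
' Sketch: reading one cell from a CLOSED workbook via an XLM-style
' external reference. The path, file name and sheet name below are
' made up; the reference must be in R1C1 form.
Sub ReadFromClosedBook()
    Dim ref As String, result As Variant
    ' 'folder[file]sheet'!RrCc - here row 2, column 3 of Sheet1
    ref = "'C:\Data\[BigDataFile.xlsx]Sheet1'!R2C3"
    result = ExecuteExcel4Macro(ref)   ' returns an error value if the file is missing
    MsgBox "Value in closed workbook: " & result
End Sub
```

It only pulls one value per call, so it is no substitute for a bulk import, but it shows the closed-workbook reference mechanism working without the file ever being opened.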

I have to say, though, that as I am working privately I do not have the extra requirement of possible simultaneous multi-user access, although the stuff is shared with a few people in my family. 
Things get updated all the time, by any of us. For me the solution to that is easy. The “newest” files are held on a file sharing site. (Obviously, from time to time I make full backups on USB sticks etc.) The rule is simple: if you want to update anything, you download that copy from the file sharing site, update it, and re-upload it.
The likelihood that one of us might try to upload or download at the same time is so remote it is not worth considering. In any case, the file sharing site warns you if more than one upload/download of a file occurs within a short period of time. But I can understand that in some professional environments that might be a problem requiring the whole complicated business of SQL database cursor locks and goodness knows what else that I don’t really fully understand (and I am not totally convinced yet that anyone does fully)..

From time to time I save all the files, under their same names, in a folder with a date as part of its name (the folder name, I mean; the current month and year is good enough for me). That is near enough for me to any form of “version control”.

I know most professionals will find me annoyingly naive. But it does not hurt once in a while to step back and think of a more primitive but often adequate solution.  

Alan


*P.s. Jacc*, you might have considered posting this thread in the Water Cooler: that is where these discussional threads seem to *“Percolate”* the best.



** https://www.excelforum.com/excel-pro...liarities.html 
https://www.excelforum.com/excel-pro...-xl-files.html 
https://www.excelforum.com/excel-pro...screws-up.html 
https://www.excelforum.com/excel-pro...al-arrays.html 
*** https://www.excelforum.com/excel-pro...ml#post4673161 
https://www.excelforum.com/developme...ml#post4672793 
https://www.excelforum.com/developme...ml#post4674360 
* http://www.thefreedictionary.com/percolate *  http://www.urbandictionary.com/defin...term=Perculate 
**** https://www.excelforum.com/excel-pro...ge-object.html 
https://www.excelforum.com/excel-pro...de-please.html

----------


## Jacc

I considered the Watercooler and in hindsight that would have been the right choice. If a moderator is willing to do the switch, I'm all for it.

I appreciate all your posts here guys!  :Smilie:  This is really interesting and I'm reading it over and over again.

----------


## xladept

I cut my teeth maintaining other people's programs in Assembler, COBOL and Excel, as well as writing my own systems; that's why I say "You can't do one thing" :Smilie:

----------


## scottiex

> Hi Scottiex, I think you're getting mixed up between version control and source control



In all honesty, it was just a vague excuse for me to comment anyway. "I have this comment, I want to shoehorn it into the conversation"...  :EEK!:  :Confused: 

BTW yes All, this really does seem more water cooler stuff.

----------


## CAntosh

For many of the reasons already mentioned, I try to keep my "large" workbooks from getting too big.  Initially, my one "big" workbook (100+ sheets) was too heavy for the 32-bit version of Excel, so I had to upgrade to 64.  I used VBA to do a sweep of all workbooks in a folder, determine what type of report each workbook was, then take the relevant info and import it into the main workbook.  The VBA wasn't overly complicated, but some of the reports being swept were large, poorly designed, and unwieldy (and govt. issued, so beyond our control) and the summary calculations on the back end weren't helping any.  Fortunately, as I became more familiar with both the job and Excel (which I'd never used prior to 2014), my ignorance abated, I learned a few tricks, and now everything runs smoothly on any version of Excel.
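A sweep like the one CAntosh describes can be sketched in VBA along these lines. The folder path, the "report type" tag in A1, the cell addresses and the Summary sheet are all invented for illustration:

```vba
' Sketch of a folder sweep: open each workbook in a folder read-only,
' note what kind of report it is, pull a figure into a summary sheet,
' then close it without saving. Paths and cell addresses are hypothetical.
Sub SweepReportsFolder()
    Dim folder As String, fname As String
    Dim wb As Workbook, destRow As Long
    folder = "C:\Reports\"                ' hypothetical folder
    destRow = 2
    fname = Dir(folder & "*.xlsx")        ' first matching file
    Application.ScreenUpdating = False
    Do While fname <> ""
        Set wb = Workbooks.Open(folder & fname, ReadOnly:=True)
        With ThisWorkbook.Worksheets("Summary")   ' assumed summary sheet
            .Cells(destRow, 1).Value = fname
            .Cells(destRow, 2).Value = wb.Worksheets(1).Range("A1").Value ' report type tag
            .Cells(destRow, 3).Value = wb.Worksheets(1).Range("B2").Value ' the figure we want
        End With
        destRow = destRow + 1
        wb.Close SaveChanges:=False
        fname = Dir                       ' next file in the folder
    Loop
    Application.ScreenUpdating = True
End Sub
```

In practice the `Select`-style logic for deciding what each report is tends to be the fragile part, which is exactly where a changed government template would trip up a successor.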

One general strategy I have for big workbooks is to (as much as possible) restrict the workbook to merely storing/organizing the raw data sensibly.  The temptation to include a bunch of different summary sheets and reports in the same workbook is strong, but the result is cumbersome.  Linking data workbooks to output workbooks is also an option, but not my preference.  What worked best for me was keeping just the numerous-but-organized data sheets and one summary sheet, then creating a few template sheets that could be combined with a userform to generate customizable reports at a moment's notice.  The template/userform approach has been much less taxing for me to maintain than a bunch of different summary/report sheets that are constantly recalculating and updating.

I've tried to keep the setup simple enough that everything can be used in my absence, since I'm the only one in my office who knows VBA.  I weep for my successor, though.  They'll replace me with a non-VBA analyst and everything will work fine until the State modifies one of its report templates, then my error handler message will bludgeon the poor schlub.

----------


## MarvinP

Hey,

I've done a few very large projects using Excel and Access, both using VBA as the backbone.  They work fine until a year or two later when something upstream changes, and then I need to remember what I was thinking when I wrote the whole thing.  This is a nightmare, because some people believe you just need to change some small thing and the "computer will figure it out".  

For larger projects in modern Excel, Power Query (Get & Transform) seems to be the tool that will allow better solutions.  VBA seems to be too hard for people to use, while PQ (the M language) is easier to understand and use.  Appending many files in the same folder, or even combining files using an index between them, seems to be the Excel answer.  

I've done projects in Access, which is a great tool if you can spend the time to learn it.  I was always told that Access was too expensive for the whole group to buy, so just do the problem using Excel, which comes with Office.  

I'd start by looking at Power Query.

----------


## dflak

> One general strategy I have for big workbooks is to (as much as possible) restrict the workbook to merely storing/organizing the raw data sensibly.



We use the dashboard strategy here as well: limited summaries that can be filtered / sliced and drill down to details only on demand.

However, our customer isn't always so reasonable: we have one report that lists every type of defect for every model with a month's worth of detailed data displayed for the past 30 days and a two-year history. I have to ask, who looks at all that data? There are literally tens of thousands of data points per day! Is someone really going to ask, "What did model XYZ do on January 15th, 2015 for failure code 7A?"

----------


## CAntosh

> Is someone really going to ask, "What did model XYZ do on January 15th, 2015 for failure code 7A?"



Yes. Via court subpoena.  The week after you've deleted it all.

My organization stores physical file folders (10-40 pages thick) for _years_ just in case there's a court order.  We add roughly 200,000 new files to the collection each year.  Sure enough, though, we received a subpoena this spring asking for a few hundred files from a specific user during a specific time period in 2010.  They're a pain to find, but we have 'em.  It's inconceivable to me that we're not allowed to move to digital storage.

----------


## dflak

The information is archived, so I don't think it needs to be reported daily.

----------


## scottiex

> We use the dashboard strategy here as well: limited summaries that can be filtered / sliced and drill down to details only on demand.
> 
> However, our customer isn't always so reasonable: we have one report that lists every type of defect for every model with a month's worth of detailed data displayed for the past 30 days and a two-year history. I have to ask, who looks at all that data? There are literally tens of thousands of data points per day! Is someone really going to ask, "What did model XYZ do on January 15th, 2015 for failure code 7A?"



OMG pulling my hair out over this now.
"hey we need everything to be shown all at the same time."
"But that means no one will be able to answer any individual question quickly, it will take longer to maintain, more errors and we can't link your customized layout to any other system"
"I don't care - it works for me"
"  'Works'? "

----------


## Jacc

> OMG pulling my hair out over this now.
> "hey we need everything to be shown all at the same time."
> "But that means no one will be able to answer any individual question quickly, it will take longer to maintain, more errors and we can't link your customized layout to any other system"
> "I don't care - it works for me"
> "  'Works'? "




Is it possibly that simple that they want an excessive amount of "semi-raw" data so they can just play around with AutoFilter later? Or am I thinking too simply here?

I've converted at least one "professional" report back to a simple "list with AutoFilter" here on the forum. The OP was over the moon with it.

----------


## Jacc

> In terms of train wrecks, I've had to find a solution for a "system" that consisted of over 175,000 (yes, that's not a typo) workbooks, where someone left and they wanted it to work multi-site.



That is one h#ll of a number!  Crazy!

----------


## Jacc

> I despise the type of "large Excel systems" discussed here; I've seen far too many of them where Excel is simply the wrong tool. They're usually of a questionable standard, containing loads of VBA with no thought-out structure or architecture that simply gets bodged to add on the latest requirement. They inevitably end up as business-critical systems that are passed from person to person, each of whom adds more bodge without really understanding how most of it functions.
> 
> I understand how they come about, usually out of necessity by people with little knowledge who just "need something simple" that then spirals out of control. I've got a good deal of sympathy with this; I have a natural inclination to improve processes and get frustrated by inefficiency, but it doesn't mean that these solutions are good.
> 
> Access isn't much better, but at least it can be split and the backend migrated onto SQL Server. 
> 
> I think that most of the issues stem from only needing something "simple" and then building on it as scope increases, rather than stepping back and looking for a better solution when requirements grow. This can happen with other bespoke software too, but it's more chronic and debilitating in Excel due to the low barrier to entry. The tools don't help much either; source control is particularly troublesome without third-party add-ins (though in all honesty I can't say I've ever once encountered someone using source control on VBA in the wild - though I'm sure some people must).



Why isn't Access much better? I've considered stepping up to Access, not because I need to at this moment but because I imagine I could take on more database-like (duuh!) tasks in the future.
However, people seem to like the crude and ugly user interfaces I create in Excel (I'm lazy, so they are easy to use  :Smilie:  ), and in Access it seems like you are restricted to the ActiveX-forms type of user interface.  Also, for developing systems or tools up to a certain size, Excel is the fastest platform (this sentence feels pretty safe on this forum  :Smilie:   ).

----------


## AB33

The screwdriver-and-hammer analogy is as relevant as it was 20 years ago.  VBA is still Excel (okay, it can add a bit of automation) and not a database.
I was stunned when a poster asked a question on how to parse a string into Excel and was trying to work out how to get around Excel's maximum limit of 1M rows. I was speechless!

One of the main reasons for using Excel instead of a database is: "It is a free-for-all". You can dump any data anywhere in the pigeonhole without much concern as to who is going to pick up the mess.

----------


## Jacc

The $10,000 question is where to draw the line...
When does the Excel utility using VLOOKUP on 100 lines of data go from little and cute to a ba***rd monster that should be killed off?

----------


## CK76

There isn't any hard and fast rule. But as a general guideline, if you can't send the file via email, because of its size or because the file depends on other files only accessible from within the local network... then it's high time you killed the file and moved on to a better system.

----------


## tim201110

> There isn't any hard and fast rule. But as a general guideline, if you can't send the file via email, because of its size or because the file depends on other files only accessible from within the local network... then it's high time you killed the file and moved on to a better system.



Not quite, there are cloud discs

----------


## CK76

> there are cloud discs



True. But I personally prefer my report files and Excel applications to be less than 5 MB  :Wink: 

I've shifted most of my dashboards from formula-based to OLAP data-model-based in the last year or two, for integrity and performance.
And more recently to Power BI.

----------


## tim201110

Excel is quite vulnerable. My biggest concern is users' skills and their reasoning. To save volume I'd prefer to minimize formatting, data validation and forms.
It is a pity there is no physical punishment any more.  :Smilie: 
Taking the above into account, special software is preferable.

----------


## scottiex

> Is it possibly that simple that they want an excessive amount of "semi-raw" data so they can just play around with AutoFilter later? Or am I thinking too simply here?
> I've converted at least one "professional" report back to a simple "list with AutoFilter" here on the forum. The OP was over the moon with it.



The primary requirement is that it is printed out on paper, so the key aspects they are asking for can't be dynamic.
Should it be on a computer? Well, maybe - but they discuss it in a meeting, so they need everyone on the same "page", so to speak, without anyone (many of whom are not 'computer people') changing anything.

----------


## Doc.AElstein

My main File is about 2MB. It contains a lot of basis/Main data and quite a few macros, or rather a few very long event type codes to do a lot of stuff when I enter quantities of stuff into the main File..
One of my most important extra data files which it can call on if and when necessary is about 11MB. That is simple data , a few extra headings and extra columns I do not use yet, and some very simple formatting
The 2MB File is frequently passed, more or less daily, by Email without problem. ( I keep meaning to get around to writing a simple code to just take the entered data and pass that instead, but as the 2MB File is no problem to send , I am still doing and recording daily that ) . 
 The bigger data files are more difficult to send. But anyway all Files  are  kept in a free file sharing facility which has proved very reliable. The most important extra data file was produced from scrapping all the info available from a Web site. It is about 35,000 rows, @ 150 columns . The web site is increasing in rows constantly as things are added to it. So I update the data file made from it from time to time. Interestingly, scraping a single row from the web site is comparable in time, or sometime faster, than getting that info from the large data file containing all the Webs current info. 
When importing a row into the Main 2MB file, you can choose to do it from the large data file if you have it, or get the info from scrapping the web site if you are on line. That is , I guess an unusual “Belt and braces” approach to be able to get at your data.
An interesting story about another large data file I use is the following: it contains the entire USA government data records for info on food products used in the USA. It is about 4MB and again has some extra headings, columns and some simple formatting. I made that file with help from some _real_ forum experts, from what I think is some sort of Access file (a .accdb of about 147MB). After I made that simple single-worksheet file, _they_ (the USA government people) asked me for a copy. They found it a neat, novel concept: a two-dimensional table with all the products listed down column A and a column for each of the things to do with those products along the top, so that each row gave all the info about a particular product.
They did not quite understand my concept of updating, whereby I replaced my file from time to time with one containing all the newest current data.
(I did a few other things they found neat as well, like unifying everything to the same units of measure.)
They were all experts in things like relational databases, and the data they had was organised to allow them to do all the query interaction stuff they did whilst working with it daily. I know that is an oversimplification, but to some extent I think a form of mass hysteria or brainwashing sets in when a group of people get together and learn about and use databases. It can lead to a sort of tunnel vision, and a loss of grip on common-sense rationality.

Alan

----------


## dflak

A lot depends on what kind of a hammer people hand you. Our requirement is to provide dashboard-like reports to our customers. Our data is kept in an Oracle database.

My SQL skills are better than beginner, but not all that great. Also I don't have the magic decoder ring, AKA the data definition of the database, because there is none; you just "sort of have to know" what's in what table and what the table relationships are. I do know a few key tables and I do well with them.

I am the company's Excel expert. Two desks over is the company's SQL expert. Prior to my arrival, he was developing reports using SQL. I've seen some of the machinations he has to go through to develop a cross-tab query and whenever the user changes the requirement, it's back to the drawing board to modify the SQL. Also the reports are static: the user really can't do a lot with them.

On the other hand, I am fond of concatenating tables and doing VLOOKUPs.

We now have our acts coordinated. The database does a lot of things better than Excel. UNION queries are better than concatenating Excel tables and JOINs are usually faster than VLOOKUPs. Also selecting from subqueries is something Excel doesn't do very well.
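The UNION-vs-concatenation and JOIN-vs-VLOOKUP points can be made concrete with a small sketch. This uses Python's built-in sqlite3 rather than Oracle, and all table and column names are invented for illustration: a UNION ALL replaces manually stacking two Excel tables, and a JOIN replaces a VLOOKUP from a key column to a lookup table.

```python
import sqlite3

# Two "quarterly order tables" (the kind you'd concatenate in Excel)
# plus a product lookup table (the kind you'd VLOOKUP against).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE products (sku TEXT PRIMARY KEY, name TEXT);
CREATE TABLE orders_q1 (sku TEXT, qty INTEGER);
CREATE TABLE orders_q2 (sku TEXT, qty INTEGER);
INSERT INTO products VALUES ('A1','Widget'),('B2','Gadget');
INSERT INTO orders_q1 VALUES ('A1', 10), ('B2', 5);
INSERT INTO orders_q2 VALUES ('A1', 7);
""")

# UNION ALL stacks the tables; the JOIN does the VLOOKUP's job.
rows = con.execute("""
    SELECT p.name, SUM(o.qty) AS total
    FROM (SELECT * FROM orders_q1
          UNION ALL
          SELECT * FROM orders_q2) AS o
    JOIN products p ON p.sku = o.sku
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
# rows -> [('Gadget', 5), ('Widget', 17)]
```

The outer SELECT over the union is also an example of the "selecting from subqueries" step that Excel has no natural equivalent for.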

However, Excel does some things better than databases: pivot tables in particular and charting. Also it's easier for Excel to "look back" at previous records or "look ahead" to a total than it is to do the corresponding tasks in SQL.
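For comparison, the SQL side of the "look back at the previous record" task mentioned above is a window function, where in Excel it is just a formula pointing one row up. A minimal sketch using sqlite3 (which supports window functions from SQLite 3.25 on); the table and columns are invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE daily (day INTEGER, volume INTEGER)")
con.executemany("INSERT INTO daily VALUES (?,?)", [(1, 10), (2, 15), (3, 12)])

# LAG pulls in the previous row's value so we can compute a day-on-day change.
rows = con.execute("""
    SELECT day,
           volume,
           volume - LAG(volume) OVER (ORDER BY day) AS change
    FROM daily
    ORDER BY day
""").fetchall()
# rows -> [(1, 10, None), (2, 15, 5), (3, 12, -3)]
```

It works, but it is noticeably more ceremony than `=B3-B2`, which is the post's point.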

So I ask for my SQL to put out the data in "denormalized" format. That is, if the main table has 5 things linked to it by a foreign key, then I get 5 records in the output. This is the format Excel does well with for pivot tables, charts and formulas. It's also the format that people who put Excel reports together are least likely to use.
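A small sketch of what "denormalized" means here, again using sqlite3 with invented table names: one parent row with three children linked by a foreign key comes out as three flat records, the parent's columns repeated on each, which is exactly the layout pivot tables and charts want.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE shipments (id INTEGER PRIMARY KEY, origin TEXT);
CREATE TABLE items (shipment_id INTEGER, product TEXT, qty INTEGER);
INSERT INTO shipments VALUES (1, 'Dallas');
INSERT INTO items VALUES (1, 'Widget', 10), (1, 'Gadget', 4), (1, 'Sprocket', 2);
""")

# One output record per child row; the shipment's columns repeat on each.
flat = con.execute("""
    SELECT s.id, s.origin, i.product, i.qty
    FROM shipments s JOIN items i ON i.shipment_id = s.id
    ORDER BY i.product
""").fetchall()
# flat -> [(1, 'Dallas', 'Gadget', 4), (1, 'Dallas', 'Sprocket', 2),
#          (1, 'Dallas', 'Widget', 10)]
```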

MS-Query works very well to extract data from a database to Excel as long as you copy and paste the SQL in. The user interface for designing queries in MS-Query is ugly (and I'm a fan of the application). However, once you use MS-Query to establish the connection, you can use an Excel table and a simple bit of VB code to rewrite the connection string (SQL) at run time. This makes the combination of a database query and Excel very powerful.
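The run-time-rewrite idea can be sketched in Python (the post does it with VBA, an Excel parameter table, and an MS-Query connection): read the user's parameters, build the finished statement, and hand it to the already-established connection. The table and column names below are invented, and sqlite3 stands in for the real Oracle connection.

```python
import sqlite3

# Stand-in database; in the real setup this is the established MS-Query link.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE inbound (week INTEGER, region TEXT, volume INTEGER)")
con.executemany("INSERT INTO inbound VALUES (?,?,?)",
                [(1, 'East', 100), (2, 'East', 120), (9, 'East', 999)])

def weekly_report(con, start_week, end_week):
    """Build the statement from the user's parameters, then execute it."""
    sql = ("SELECT region, SUM(volume) FROM inbound "
           "WHERE week BETWEEN ? AND ? GROUP BY region")
    return con.execute(sql, (start_week, end_week)).fetchall()

rows = weekly_report(con, 1, 8)   # -> [('East', 220)]
```

When the user changes the date range in the parameter table, only the bound values change; the query definition itself never has to be redesigned.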

So let the database manage the data and use Excel to report it.

----------


## fornight

I am not a developer but I have been involved in many of our "from Excel into system" projects in our company for the past 16 years.

We used to do a lot in Excel for forecasting (both long and short term).  Simple modelling and not much VBA, but the links were driving us nuts.  In addition, every region started to create its own model and there were no shared assumptions etc.  Short term forecasting is done every month, so our management decided that we needed a "system" to do it for us.

It was a relatively intensive project but not exactly a proper system.  We used the "engines" and "database" of our financial system (not meant for forecasting at all) together with Excel VBA and SQL scripts.  The users use an Excel template to upload preliminary data into the "database".  We have a script written in the "engines" to process the data and calculate the results.  The results are then stored in the "database", from which a reporting tool produces Excel reports.

It's all customized using existing tools, and surprisingly it works!  It is easy to understand and easy to maintain.  When there is a change in the assumptions, every region gets the change at the same time.  If a specific region needs an isolated change, we are able to make an isolated development that does not affect the other regions.  We can even partition changes at group level rather than regional level.  Best of all, it uses the same metadata as the financial system, which means we can compare the forecast data in this hybrid with the actual data produced by the financial system in the same database and reporting tool.  Everything works seamlessly.

Now we have "upgraded" to Hyperion... it is a pain. Maintenance is expensive because everything is so linked together that a single change in a script might ruin everything else.  The whole system or model is so interlinked that every change is a BIG development.

I wish we could go back to the old tool that we came up with.

----------


## Jacc

That was an interesting story, thanks for sharing!  :Smilie:

----------


## Jacc

I just reread this entire thread, the amount and quality of the answers far exceeded my expectations! Again, massive thanks to all of you who contributed!  :Smilie:

----------


## scottiex

I suppose a forum is like an Excel project, in that often it is about asking a good question to bring out good answers.

----------

