Hello guys,

I'm the new guy in my department at a big company, and from what I can see the workflow here could be improved a lot, because people are losing so much time with large Excel files.
At the moment they use Excel to prepare the CSV files that feed lots of reports.
For example, they have Excel files with 100,000 rows and 30 columns (about 15-20 MB each). Twenty of those 30 columns contain formulas (mostly VLOOKUPs) that point to a separate support file full of reference data: 30 sheets, some with 30-40 columns. The procedure goes like this:
1. They extract an Excel file with 10 columns and 100,000 rows from a webapp.
2. They copy and paste that data into the file full of formulas.
3. They wait 2-3 hours for the calculations to finish.
4. They upload the result back to the webapp.
They have to produce 20-30 of those files every month.
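For context on why this takes hours: each VLOOKUP scans the support sheet independently, so 20 formula columns over 100,000 rows means millions of scans. A scripted join does the same work in one pass. Here is a minimal sketch in Python with pandas, using toy data and invented file/column names (item_id, price, etc.) since I don't know the real layout:

```python
import pandas as pd

# In the real workflow the inputs would come from the webapp extract and
# the support file, e.g. (hypothetical file names):
#   data = pd.read_excel("extract.xlsx")                       # 100,000 rows x 10 cols
#   lookup = pd.read_excel("support.xlsx", sheet_name="Prices")
# Toy data here so the example is self-contained:
data = pd.DataFrame({
    "item_id": [101, 102, 103, 104],
    "qty": [5, 2, 7, 1],
})
lookup = pd.DataFrame({
    "item_id": [101, 102, 103],
    "price": [9.99, 4.50, 2.25],
})

# One merge replaces an entire column of VLOOKUPs. It is a hash join,
# so 100,000 rows finish in well under a second instead of hours.
result = data.merge(lookup, on="item_id", how="left")

# Rows with no match get NaN, the equivalent of VLOOKUP's #N/A.
print(result)

# The finished table can go straight to CSV for the webapp upload:
#   result.to_csv("report.csv", index=False)
```

One merge per lookup table would replace the whole formula file; each of the 20 formula columns becomes a column brought in by a join.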
The computers are constantly tied up with Excel recalculating, and Excel frequently stops responding or crashes.
I want to fix this situation, but I need some advice.
My first idea was to split the large file into 10-20 smaller files, run the formulas on each, and then combine the results back into one large file, all driven by VBA macros. But halfway through I realized it was pointless: with all those VLOOKUPs it would take almost the same time, maybe more...
My second idea was to design a database, but I don't know where to start with that... My SQL skills are limited to simple queries from my last job, where I used PL/SQL for basic SELECTs with WHERE, LIKE, BETWEEN, etc.
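On the database idea: you don't necessarily need a full database server to start experimenting. SQLite ships with Python, and basic SELECT/WHERE skills are enough, since the lookup logic is just a LEFT JOIN. A tiny sketch, with all table and column names invented for illustration:

```python
import sqlite3

# In-memory database for the sketch; a real setup would use a file,
# e.g. sqlite3.connect("reports.db").
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE extract (item_id INTEGER, qty INTEGER)")
con.execute("CREATE TABLE support (item_id INTEGER, price REAL)")
con.executemany("INSERT INTO extract VALUES (?, ?)",
                [(101, 5), (102, 2), (104, 1)])
con.executemany("INSERT INTO support VALUES (?, ?)",
                [(101, 9.99), (102, 4.50)])

# A LEFT JOIN does what a VLOOKUP column does, in one pass over the data.
# Unmatched rows come back with NULL (None), like #N/A in Excel.
rows = con.execute("""
    SELECT e.item_id, e.qty, s.price
    FROM extract e
    LEFT JOIN support s ON s.item_id = e.item_id
""").fetchall()

for row in rows:
    print(row)
```

Each sheet of the support file would become a table, the monthly extract gets loaded as another table, and one query with a few joins produces the finished CSV.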
I would appreciate some help because I want to save the day here and be the hero for once.
Thank you very much.