I have a very large Excel program (.xlsm) that uses a COM interface (xlQ) to bring in trading data. I have found that the data feed stops updating while my code is running. Since the interface uses worksheet functions to insert market data into the workbook, I guess it stands to reason that those CPU-dependent cells cannot recalculate while code is executing. I have had to provide some quiet time, via an on-time macro, to let the data flow, and that causes a problematic increase in the cycle time between getting data and doing something with it. I need an asynchronous approach that lets the data keep processing efficiently while my code is running.
In my VBA program I have experimented with DoEvents and Power Pivot, but I lack the experience to use either of them to achieve anything like a parallel processing environment.
My goal is simple: generate BUY/SELL signals during a 10-second CPU-intensive cycle, then start the cycle again (loop) and grab fresh data from the COM interface without any interruption. In other words, the Do Loop should run continuously because the COM data is seamlessly available.
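To make the structure concrete, the cycle I am aiming for looks roughly like this. The sheet name, the range, and GenerateSignals are placeholders for my actual routines, and the DoEvents call is where I have been experimenting to give the xlQ feed a chance to update:

Private mStop As Boolean   ' set True elsewhere to end the loop

Sub TradingCycle()
    Dim quotes As Variant
    Do
        ' Grab whatever the xlQ COM feed has most recently written to the sheet
        quotes = ThisWorkbook.Worksheets("MarketData").Range("A1:D100").Value

        ' Roughly 10 seconds of CPU-intensive BUY/SELL signal work
        GenerateSignals quotes

        ' Experimental: yield so Excel can service the COM feed and recalculate
        DoEvents
    Loop Until mStop
End Sub

Sub GenerateSignals(quotes As Variant)
    ' Placeholder for the signal calculation
End Sub

This is the kind of structure I am trying to make work without any enforced pause.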
Because of the difficulties described above, I was forced to exit the loop via an on-time macro and re-enter it after 5 seconds of quiet time so the COM interface could process. Those 5 seconds have rendered my software useless, since the trades are being implemented 5 seconds too late.
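For reference, the quiet-time workaround is roughly this shape; the procedure name and range are placeholders, and the 5-second pause is the part I need to eliminate:

Sub RunCycle()
    ' The ~10 second CPU-intensive calculation on the latest feed values
    GenerateSignals ThisWorkbook.Worksheets("MarketData").Range("A1:D100").Value

    ' Leave VBA idle for 5 seconds so the xlQ COM feed can write new data,
    ' then re-enter the cycle
    Application.OnTime Now + TimeSerial(0, 0, 5), "RunCycle"
End Sub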
To process the billions of bits involved in each 10-second interval, I am using a 12-processor tower with 25 GB of memory and a very fast clock speed. Even so, I am getting system warnings that CPU usage is too high.
What advice do you have?