Memory increase in 32-bit Excel

Tarjei Kirkesaether A+ 58

For anyone who wasn't aware, my IT guys just told me that in December 2019 the maximum accessible memory for 32-bit Excel running on 64-bit Windows was increased from 2GB to 4GB. My ongoing battle to get 64-bit Excel playing nicely can take a break until next year, when my model has doubled in size again :-)

Source: https://docs.microsoft.com/en-us/office/troubleshoot/excel/laa-capabilit...

Jun Yan A+ 124

Hi Tarjei.

Handy to note! Out of curiosity, how big are your files getting?

Is it worth looking at revisiting the model structure at this point?

We've had instances where we've been able to significantly reduce file sizes by restructuring flows in a model (sometimes more than halving model size). This is especially true with the additions of global lists as well as modules as categories.

Jun

Tarjei Kirkesaether A+ 58

Hi Jun,

The model is 41MB and uses 1.3GB of RAM. It's a 12-year monthly corporate planning model, and I would be extremely interested in understanding how you managed to restructure your models to reduce their size so significantly. Would you mind sharing the basic concepts, specifically elaborating on what you mean by using modules as categories?

Thanks

T

Jun Yan A+ 124

Hi Tarjei,

The concept behind modules as categories can get pretty complicated and I'm sure Mike won't be too pleased if I give a poor explanation of this, so I'll try my best! The model we scaled down was c. 20MB and we got it under 7MB. It was a 10-year monthly corporate planning model that allowed for long-range forecasting and historical tracking at a GL level.

One big problem, especially when you push out to, say, 10 years, is repeating category blocks, which used to be a necessity when dealing with categories of categories. One fix released a little while back was to allow modules as categories, where you centralise a category block - say, divisions - and then on category insertion you get whole new sets of modules. This allows you to repeat centralised blocks across different modules via link-ins. Further to this, you can then do funky stuff with layering the modules as categories, but that's way too far down the rabbit hole...

An example of this can be found in the multi-business unit content, and we actually use this a lot in multi-entity consolidations.

Another way was to strip down some of the content in the dynamic templates as well as to utilise mapping / allocations in lieu of repetitions.

Sorry, it's a bit high level, but I'd need some context to help further.

Jun

Michael Hutchens A+ 189

Hi Guys,

In my experience, the real challenge with larger models is not file size on disk - Excel supports opening 1GB+ files without formulas - it's the number of calculation iterations required by the model each time it calculates.

Most Excel users don't really think about how the Excel calculation actually works, and the fact that it's one big iterative loop. As a result, they commonly include a huge number of SUMIF, INDEX and MATCH functions in their models, often when doing so can/should be avoided, and thereby make Excel sweat and become unstable in the process.

I've almost never come across a model that can't be made significantly faster via a combination of restructuring to remove repeated calculations - as Jun discusses above - and removing avoidable lookup-based functions. The last model I optimized went from a 9-second calculation time down to 0.6 seconds.

The big thing to think about is whether or not each large calculation block really needs to calculate each time Excel calculates. For example, I regularly see financial models that use tens/hundreds of thousands of SUMIF functions to restructure historical financial data into a format consistent with historical financial statements. This approach is disastrous, because the historical data is usually only updated once a month, but the SUMIF functions end up completely killing the performance of the model all the time.

A few simple workarounds to this issue are:

  1. Put the filtering of the historical data into a separate workbook, which is only opened when historical data is updated; or
  2. Use a macro to insert and remove the SUMIF formulas before and after historical data is updated (see the sketch after this list); or
  3. Use a macro/API to bring the historical data into the model at the click of a button rather than via a huge bunch of formulas.
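
To illustrate the second workaround, here's a minimal VBA sketch of the insert-calculate-paste-values pattern. The sheet names, ranges and GL layout below are placeholders only, not a prescribed structure - adapt them to your own model.

    Sub RefreshHistoricalSumifs()
        ' Hypothetical layout: "Historicals" holds the mapped statements
        ' (GL codes in column A, mapped values in column B) and "GLData"
        ' holds the raw extract (codes in column A, amounts in column B).
        Dim ws As Worksheet
        Dim mapRange As Range

        Set ws = ThisWorkbook.Worksheets("Historicals")
        Set mapRange = ws.Range("B2:B500")

        Application.ScreenUpdating = False
        Application.Calculation = xlCalculationManual

        ' Temporarily insert the SUMIF formulas...
        mapRange.Formula = "=SUMIF(GLData!$A:$A,$A2,GLData!$B:$B)"

        ' ...calculate once to pull through the updated historical data...
        ws.Calculate

        ' ...then hard-code the results so the model no longer carries
        ' thousands of SUMIFs in every calculation cycle.
        mapRange.Value = mapRange.Value

        Application.Calculation = xlCalculationAutomatic
        Application.ScreenUpdating = True
    End Sub

Run it whenever the GL extract changes; the rest of the time the model only ever sees static values.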

I use these approaches to aim for workbooks <= 20MB file size with a calculation time of < 1 second, but I must admit it can still be hard given the scoping requirements of some of our users...
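
If you want to benchmark your own model against these numbers, a rough way to time a full calculation from VBA is sketched below (illustrative only - Timer has fairly coarse resolution, so run it a few times and compare):

    Sub TimeFullCalculation()
        Dim startTime As Double
        startTime = Timer

        ' Rebuild the dependency tree and recalculate every formula
        ' in all open workbooks.
        Application.CalculateFullRebuild

        MsgBox "Full calculation took " & Format(Timer - startTime, "0.00") & " seconds"
    End Sub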

M.

Jun Yan A+ 124

To add to Mike's comments about using a separate workbook for historical data: this is actually something we utilise quite a bit.

We often use Power Query to wrangle historical data, especially in cases where we need to tap directly into a database. It's also particularly useful when we build models that carry operational data: we can prepare all of this data with one refresh button, and it can also pull from multiple sources.

Once we've done this, we can summarise everything into a time series for one import assumptions tool process.
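
For what it's worth, that import step can itself be a one-button macro in the model. Below is a minimal VBA sketch; the workbook name, sheet and named range ("Historical Data.xlsx", "Summary", "TimeSeriesSummary", "Historicals") are hypothetical placeholders, and it assumes background refresh is switched off on the queries so the refresh completes before the copy.

    Sub ImportHistoricalTimeSeries()
        Dim dataWb As Workbook
        Dim src As Range
        Dim dest As Range

        ' Open the separate historical-data workbook and refresh its
        ' Power Query connections.
        Set dataWb = Workbooks.Open(ThisWorkbook.Path & "\Historical Data.xlsx")
        dataWb.RefreshAll

        ' Copy the summarised time series into the model as static values.
        Set src = dataWb.Names("TimeSeriesSummary").RefersToRange
        Set dest = ThisWorkbook.Worksheets("Historicals").Range("A2")
        dest.Resize(src.Rows.Count, src.Columns.Count).Value = src.Value

        ' Close the data workbook without saving.
        dataWb.Close SaveChanges:=False
    End Sub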

The added benefit here is that all the historical data can be easily filtered / pivoted / requeried / sliced etc. to run BI or delve into historical drivers that can be used to populate forecast assumptions in the modular models. Further, these files do not face the same calculation or size impediments.

Hope this all helps...

Jun

Tarjei Kirkesaether A+ 58

Thank you both for taking the time to respond - all very useful advice to keep in mind!

We use the same approach for historical data and have avoided array formulas whenever possible, so there isn't too much to be gained there. Having said that, the calculation speed is 1.4 seconds, which is acceptable for my use.

My main concern with regard to RAM usage has always been the instability once I've opened one or more other workbooks and then run out of memory when trying to save, but that has now been mitigated by the RAM increase in 32-bit Excel :-)

Michael Hutchens A+ 189

Yes, the really painful thing about memory limits is that the most likely crash time is during save, which is also a worst-case scenario.

Ultimately, the only really safe way to mitigate your issues is to run extremely powerful hardware and avoid 32-bit, but at least the Large Address Aware capability change for Excel has created some respite for users not easily able to move from 32-bit to 64-bit operating systems.