How to Improve Excel Performance - Common Culprits & Solutions

Video Statistics and Information

Captions
Do you find yourself frustrated by sluggish Excel files that take forever to load or recalculate? If so, you're not alone. Fortunately, there are many tips and tricks you can use to improve performance, and in this video we'll identify the most common causes of slow Excel files and how to speed them up so you can improve productivity. There's also a PDF file for this video, with links to further tutorials on the solutions I cover, that you can download from a link in the video description.

One of the most obvious causes of slow Excel files is simply too much data, although this is rarely the sole contributor. The solution is to be diligent and only store data in the file that's required for the task. You can use Power Query to filter out the unnecessary data before loading it to the Excel file.

When data is used in a PivotTable, it's typically stored in the file twice: once in the worksheet and again in the pivot cache. The solution is to use Power Query to get the data and load it directly to the pivot cache by selecting PivotTable Report in the Import Data dialog box.

Single-cell array formulas, those we enter with Ctrl+Shift+Enter, can be processed several times depending on the number of cells referenced in the formula. There are a few solutions you can try: avoid mixing row and column references or overlapping array references; try separating the calculation into multiple cells rather than an array formula; or move the calculation to Power Query. Power Query only calculates when you load or refresh the data, whereas formulas calculate every time something they reference changes, and in the case of volatile functions, every time anything changes. Just note that modern dynamic array functions are more efficient than their older Ctrl+Shift+Enter counterparts that I'm referring to here.

Nested formulas in Excel, like nested IFs, can cause performance issues because they involve multiple levels of functions or calculations that require more processing power and memory. They can be especially problematic when they involve large data sets, such as when performing lookups or other operations across multiple worksheets or workbooks. You can try breaking down complex formulas into smaller, more manageable parts, which can help improve performance and make the worksheet easier to read and debug. You can also move the calculations to Power Query, where they're calculated once on loading the data rather than potentially multiple times during the use of the file, or simply copy and paste the formulas as values if they're not expected to change.

Volatile functions can cause performance issues because they recalculate every time anything changes in the workbook, even if the change has no direct impact on the function's output. The solutions are to use volatile functions sparingly and only when necessary, and if possible to use non-volatile alternatives, like INDEX instead of OFFSET for dynamic named ranges. You can also switch to manual calculation mode; this way you only recalculate the workbook when necessary, using the F9 key. Just be sure not to forget to recalculate.
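To illustrate the OFFSET versus INDEX point, here is a minimal sketch of a dynamic named range written both ways; the sheet name and data layout are only assumptions for the example, not from the video:

    Volatile:      =OFFSET(Sheet1!$A$1,0,0,COUNTA(Sheet1!$A:$A),1)
    Non-volatile:  =Sheet1!$A$1:INDEX(Sheet1!$A:$A,COUNTA(Sheet1!$A:$A))

Both definitions return the range from A1 down to the last populated cell in column A, but the INDEX version only recalculates when the cells it refers to change, whereas the OFFSET version recalculates on every calculation pass.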
Lookup formulas can cause performance issues because they often involve searching through large data sets, which can be time consuming and resource intensive. The solutions are to use Power Query to perform lookups; you can do exact match and approximate match with Power Query just like Excel, and there are some links to tutorials for this in the file download for this video. Now, if you must use a lookup formula, store the lookup table and the formula on the same sheet. Also avoid using exact match lookups if you can; instead use approximate match with a sorted list, and be sure to only reference cells containing the data being looked up, because including empty rows in the formulas is just going to reduce efficiency. I should point out that 365 users may not experience performance issues to the same extent as those using earlier versions of Excel, due to the new internal cached index Excel creates for lookup functions.

Conditional functions in Excel can cause performance issues because they involve testing each cell in a given range against a set of criteria, and that can be time consuming and resource intensive, especially for large data sets. Now, as with the lookup functions, in 365 these functions create an internal cached index for the range being searched, and this cached index is reused in any subsequent aggregations that pull from that same range. So the solutions are to avoid selecting more cells than necessary, or to use a PivotTable instead; PivotTables can perform these calculations and you don't even need to know how to write the formula.

Defined names are recalculated each time a formula that refers to the name is recalculated, even if the value of the cell doesn't change when calculated. Now, defined names are one of the most valuable Excel features, so don't avoid them because they might cause performance problems. Instead, check whether any of the other causes could be contributing and resolve them first; then, if you still have problems, try replacing names with direct references to cells.

The old IF ISNA lookup technique requires Excel to do double the work, that is, two lookups. Instead, use the IFERROR function or IFNA function to handle errors returned by lookup functions, as in the sketch below.
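Here is a minimal sketch of that double-lookup pattern and its single-lookup replacement; the cell references and lookup table are hypothetical, used only to show the shape of the formulas:

    Old, two lookups:   =IF(ISNA(VLOOKUP(A2,$D$2:$E$100,2,FALSE)),"Not found",VLOOKUP(A2,$D$2:$E$100,2,FALSE))
    Better, one lookup: =IFERROR(VLOOKUP(A2,$D$2:$E$100,2,FALSE),"Not found")

IFNA works the same way as IFERROR in this pattern but only traps #N/A errors, which is usually what you want for a failed lookup.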
As a general rule, external links should be avoided. Links to external Excel files aren't only slow to calculate, they're also easily broken, plus many functions can't evaluate on a closed workbook. Internal links can also slow down calculation. So the solution is to use Power Query to bring the data into the current file, and if you must use external links, open the file being linked to before opening the file doing the linking.

Excessive cell formatting, including font styles, colors, borders and other visual attributes applied to cells, can consume a significant amount of processing power and memory, especially when applied to large data sets. Excessive formatting can also make the file size larger, and that slows down the loading and saving times for the workbook. Instead of applying formatting like cell fill color to a large range of cells, apply it to the whole column or row; it's easier for Excel to know that a whole row or column is formatted in a particular way than it is to keep track of thousands of separate cells.

Redundant formatting can sometimes linger in cells unbeknownst to you; cells can appear empty, but Excel is still storing information about those cells in memory. If you press Ctrl+End, you'll be taken to the last cell in the sheet that Excel is storing information for. If this isn't the end of your table, then you know you have redundant formatting. In my example, the last used cell is S337, but there's no data in the cells to the left or above it. One way to fix it is to delete rows and columns that are empty, but I've found this often doesn't resolve the problem. By the way, make sure you back up the file before you delete anything. Thankfully, there's a new tool, available in Excel Online and to 365 users, that can check for performance issues like this: Check Performance. It's available on the Review tab of the ribbon, and clicking the Check Performance button brings up the Workbook Performance pane. This summarizes the sheets that contain any issues; from there you can see a list of the individual cells or ranges and the issues, and clicking Optimize All or Optimize Sheet will remove them for you.

Storing large amounts of data in Excel tables in Excel 2013 and earlier can sometimes yield worse performance than not formatting it as a table. There's an unofficial cut-off: more than 500,000 rows by 10 columns is sometimes too much to store in Excel 2013 tables, although that can depend on whether you have a lot of formulas or not. If the file is slow, consider storing the data in Power Pivot rather than the worksheet. Power Pivot has an advanced compression algorithm that enables it to store data more efficiently than Excel itself, and it can also store tens of millions of rows of data and overcome the row limitations of Excel.

Files with password-protected workbook structures will be slower to open and close than ones without a password. Now, given that Excel passwords can be removed easily, you might want to consider whether the password is worth the performance hit.

UDFs are typically less efficient than the built-in Excel functions, so consider using the built-in functions, even if you need to break them into separate calculations, or you could write custom functions with the new LAMBDA function; there's a sketch of that after the captions.

Many improvements to Excel's calc engine were released for 365 users: for SUMIFS, AVERAGEIFS, COUNTIFS, MAXIFS, MINIFS and their singular counterparts, as well as VLOOKUP, HLOOKUP and MATCH, the improvement is dramatic. For example, calculating 1,200 SUMIFS, AVERAGEIFS and COUNTIFS formulas aggregating data from 1 million cells on a four-core 2 GHz CPU took 20 seconds to calculate using Excel 2010 but now only takes 8 seconds with 365. So where possible, update to 365 to take advantage of the new improved functions and calc efficiencies. Also, use the 64-bit version of Excel; the 32-bit version, commonly used for compatibility with old add-ins, only has two gigabytes of virtual memory.

Well, I hope you found this tutorial useful. Don't forget to download the PDF ebook for this lesson from the link here, and if you like this video please give it a thumbs up and subscribe to our channel for more. And why not share it with your friends who might also find it useful? Thanks for watching.
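To illustrate the LAMBDA idea mentioned above, here is a minimal sketch of a custom function defined in Name Manager instead of as a VBA UDF; the name and calculation are hypothetical, not from the video, and LAMBDA requires Excel 365:

    Name:       FtoC
    Refers to:  =LAMBDA(f,(f-32)*5/9)
    Usage:      =FtoC(A2)

Once defined, FtoC can be used in any cell like a built-in function, and it runs in Excel's native calc engine rather than calling out to VBA.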
Info
Channel: MyOnlineTrainingHub
Views: 21,788
Keywords: excel slow, excel performance, speed up excel
Id: e4no3HpW1NY
Length: 9min 50sec (590 seconds)
Published: Tue May 02 2023