Any thoughts - VBA Slow Down with Range.Clear Command
Hi,
I use some Excel VBA code repeatedly to extract data from text files and paste it into new Excel workbooks created from a template. The code has worked successfully in the past, and for my current application it takes 45 seconds to run.

To make the code more robust, I added one line that clears the contents of the template worksheet before I paste in the new data. That way I don't need to worry about whether the previous "dummy" data in the template has been overwritten. The line is:

Worksheets("Loads").Range("A1:H65536").Clear

Nothing too fancy, you might think! However, this one extra line means the code now takes 5 minutes to run instead of 45 seconds. Any thoughts, anybody?

(I have noticed this kind of slowdown in the past: if, for example, I write data to the template and then copy and paste it into the new workbook, there is no slowdown. If I create the workbook and then write data directly to it, it takes very much longer.)

Thanks in advance for any thoughts,
Paul.

P.S. It is nothing obvious like nesting the Clear command within the extraction loops; it is standalone, before any loops, and right after I have created the new workbook from the template.
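P.P.S. For context, here is a simplified sketch of how the routine is structured. The template path is a hypothetical placeholder; the "Loads" sheet and the Clear line are as described above, and the extraction loops are omitted:

' Simplified sketch of the routine described above.
' The template path is a placeholder, not the real file.
Sub ExtractAndLoad()
    Dim wb As Workbook
    ' Create the new workbook from the template
    Set wb = Workbooks.Add(Template:="C:\Templates\Loads.xlt")
    ' The single added line that causes the slowdown
    wb.Worksheets("Loads").Range("A1:H65536").Clear
    ' ... extraction loops then write the text-file data into wb ...
End Sub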
Any thoughts - VBA Slow Down with Range.Clear Command
Paul
First thought: check the UsedRange of the worksheet, as it may be "corrupted". Press Ctrl+End: if you land on an unexpected cell, the used range is dirty; if you land unexpectedly on the last row, it is probably corrupted. I have particularly known this to happen when row 65536 is formatted or hidden and rows are then deleted. Once you are sure the used range of that sheet is as it should be, your code *should* work fine.

Another thought: are there (m)any event handlers running on that sheet? (M)any (named) formulas pointing to that range, with calculation set to automatic?

--
keepITcool | www.XLsupport.com | keepITcool chello nl | amsterdam
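P.S. A quick way to check this in code is sketched below; the sheet name is taken from your Clear line, the rest you would adjust to suit:

' Sketch: report the cell Ctrl+End would jump to, and the address Excel
' currently treats as the used range (reading UsedRange can also prompt
' Excel to re-evaluate it in some versions).
Sub CheckUsedRange()
    Dim ws As Worksheet
    Set ws = Worksheets("Loads")
    MsgBox "Last cell: " & ws.Cells.SpecialCells(xlCellTypeLastCell).Address
    Debug.Print "UsedRange: " & ws.UsedRange.Address
End Sub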
Any thoughts - VBA Slow Down with Range.Clear Command
Thanks for the fast response, keepITcool.
OK, the range is not corrupted, and there are no event handlers. However, there are lookup functions on another worksheet that reference the dummy data.

I have been having all sorts of fun this morning. In order to get the job moving, and not have to check each time that the dummy data has been completely overwritten, I tried a few things:

1. I deleted the dummy data from the templates by selecting all the rows and deleting. The lookup functions on the other worksheet then updated: instead of "...Loads!$A$4:$A$65535" they showed "...Loads!$A:$A". The code still took 5 minutes to run.

2. So I deleted the dummy data while making sure the lookup functions did not change, i.e. they still showed "...Loads!$A$4:$A$65535". The code still took 5 minutes to run.

Further to 2 above, part of the lookup function tests for a match in the lookup table and, if one isn't found, puts "xxx" into the other worksheet. So my guess was that when the lookup table in the new workbook was populated, the lookup functions had to recalculate their cell values. I don't find that logical, though, because whenever new data is pasted in the lookup functions have to recalculate anyway.

I have tested this hypothesis by modifying my code to make sure the old dummy data is overwritten (I hadn't wanted to do it this way, because I have 30 different templates and therefore 30 separate code routines to write). Done that way, the code runs in 45 seconds again.

I (almost) give up!

Thanks keepITcool,
Paul.
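P.S. For reference, a sketch of the kind of per-template overwrite routine I mean (simplified; "data" is assumed to be a 1-based 2-D Variant array built by the extraction code, and A4 is where the dummy data starts):

' Sketch of the per-template workaround: write the new values straight
' over the old dummy rows instead of clearing first.
Sub OverwriteLoads(data As Variant)
    With Worksheets("Loads")
        .Range("A4").Resize(UBound(data, 1), UBound(data, 2)).Value = data
    End With
End Sub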
Any thoughts - VBA Slow Down with Range.Clear Command
Another thought
I'd strongly suggest moving your code to an add-in. That way you have one place to edit and maintain your code.

--
keepITcool | www.XLsupport.com | keepITcool chello nl | amsterdam
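For example, something along these lines (a sketch only; the routine name, signature and sheet layout are placeholders, not your actual code):

' One shared routine living in the add-in: every workbook created from
' any of the 30 templates is handled by the same code.
Public Sub LoadTemplateData(targetBook As Workbook, data As Variant)
    Application.Calculation = xlCalculationManual
    With targetBook.Worksheets("Loads")
        .UsedRange.ClearContents          ' clear only what is actually used
        .Range("A4").Resize(UBound(data, 1), UBound(data, 2)).Value = data
    End With
    Application.Calculation = xlCalculationAutomatic
End Sub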
Any thoughts - VBA Slow Down with Range.Clear Command
Does anything like this help?
Sub Demo()
    With Application
        .Calculation = xlManual
        .EnableEvents = False
        Worksheets("Loads").Range("A:H").Clear
        .EnableEvents = True
        .Calculation = xlAutomatic
    End With
End Sub

--
HTH. :)
Dana DeLouis
Windows XP, Office 2003
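If the Clear could error out part-way, a variant like this (just a sketch, with ScreenUpdating added as an extra assumption) restores the settings either way:

Sub DemoSafe()
    ' Sketch only: same idea as Demo, but the settings are restored
    ' even if the Clear raises a run-time error.
    On Error GoTo CleanUp
    With Application
        .Calculation = xlCalculationManual
        .EnableEvents = False
        .ScreenUpdating = False
    End With
    Worksheets("Loads").Range("A:H").Clear
CleanUp:
    With Application
        .Calculation = xlCalculationAutomatic
        .EnableEvents = True
        .ScreenUpdating = True
    End With
End Sub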
Any thoughts - VBA Slow Down with Range.Clear Command
Dana,
No, it didn't work :-(

Thanks for the tip though,
Paul.