Download multiple HTML files and save as txt
Hello,
I am in the middle of an analysis project that requires a sophisticated text-parsing software suite. The only problem is that it cannot access text content over the internet; files must be local or on the network. I am a VBA novice, but I have managed to build a script that has generated several thousand web page URLs in a worksheet. My hope is to loop over these URLs and save a local text copy of each HTML page.

I've seen many discussions about displaying web pages from Excel, but I think my problem is slightly different: rendering each of these pages would probably be inadvisable from a performance perspective (though I don't know that for sure), and viewing them in a browser is not my end goal anyway. Ideally, my 2000 URLs ('URLSheet'!A2:A2001) would be used to fill a directory with 2000 corresponding txt files. Then I can run the text software I need and complete my analysis.

I feel a bit out of my depth. Any help would be greatly appreciated.
ExcelBanter.com