Parsing Huge Files of Text
Group,

As a test engineer I must compile and crunch large amounts of data (up to a million rows) prior to a new product launch. In the past we (mostly I) manually separated, summed, and analyzed the numbers. With my newfound tool, VBA, I'd like to do this number crunching in Excel. Now, I know Excel has a row limit. Once each group of data is parsed into columns, my net result would be only several hundred thousand rows, with the rest of the data being discarded. My question: is this possible in Excel? And if it isn't, what can I use as an intermediate step before crunching in Excel?

Tony

--
ajocius
------------------------------------------------------------------------
ajocius's Profile: http://www.excelforum.com/member.php...o&userid=17695
View this thread: http://www.excelforum.com/showthread...hreadid=482412
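[One answer to the "intermediate step" question, as a sketch rather than anything from the thread: pre-filter the raw text outside Excel so only the rows you intend to keep ever reach a worksheet. The file names, the `keep()` rule, and the chunk size below are illustrative assumptions; the 65,536 figure is the per-sheet row limit in Excel 97-2003.]

```python
# Sketch: stream a huge text file, keep only the rows of interest,
# and split the survivors into chunks small enough for one Excel sheet.
# The input name, keep() rule, and chunk size are illustrative assumptions.

EXCEL_ROW_LIMIT = 65536  # per-sheet row limit in Excel 97-2003

def keep(line: str) -> bool:
    # Hypothetical filter: keep non-empty lines that are not comments.
    line = line.strip()
    return bool(line) and not line.startswith("#")

def prefilter(src: str, dst_pattern: str, chunk: int = EXCEL_ROW_LIMIT - 1) -> int:
    """Write kept rows into (dst_pattern % n) files of at most `chunk`
    rows each; return the total number of rows kept."""
    kept, part, out = 0, 0, None
    with open(src, "r", encoding="utf-8") as f:
        for line in f:
            if not keep(line):
                continue
            if kept % chunk == 0:        # start a new chunk file
                if out:
                    out.close()
                part += 1
                out = open(dst_pattern % part, "w", encoding="utf-8")
            out.write(line)
            kept += 1
    if out:
        out.close()
    return kept
```

Because the file is read one line at a time, memory use stays flat no matter how many million rows the source has; each output file can then be opened in Excel on its own.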
Parsing Huge Files of Text
Excel supports about 65K rows (65,536) per sheet, and it can become a little unstable well before that with heavy data. Why not try Access?

--
JohnDK
------------------------------------------------------------------------
JohnDK's Profile: http://www.excelforum.com/member.php...fo&userid=7184
Parsing Huge Files of Text
The main reason for not using Access is my inability to work in it. I'm a C programmer for the most part and recently found a new love in VBA. With gobs of data to parse, I wanted something that's easy to pick up and easy to use. All my data appears in a single column. Can Access rearrange it and convert to Excel?

Tony

--
ajocius
Parsing Huge Files of Text
You can export to Excel from Access, but there is still a 65K row limit in Excel.
Possibly you can group your data in Access and keep the export under 65K?

--
David
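[David's suggestion amounts to aggregating in the database so the export stays under the limit. A minimal sketch of the same idea using SQLite in place of Access; the table and column names are made up for illustration:]

```python
import sqlite3

# Sketch: aggregate raw measurements by group before exporting, so a
# million raw rows collapse to one summary row per group. Table and
# column names are illustrative assumptions, not from the thread.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE readings (part TEXT, value REAL)")
con.executemany("INSERT INTO readings VALUES (?, ?)",
                [("A", 1.0), ("A", 3.0), ("B", 2.0)])

summary = con.execute(
    "SELECT part, COUNT(*), SUM(value), AVG(value) "
    "FROM readings GROUP BY part ORDER BY part").fetchall()
# Each tuple is (part, row_count, total, mean) -- one row per group.
```

The `GROUP BY` is what shrinks the row count: the export to Excel then carries one summary row per part instead of every raw reading.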
Parsing Huge Files of Text
It has been pointed out that Excel has a 65K limit to rows.
Depending on the requirements you could cut down the intervals, use multiple columns, or use Access. Access is quite easy to use in the simple sense, and an unlinked table behaves much like a spreadsheet. There is another text tool out there called PARSE which allows traversal of text files, and you can do quite powerful things with it. Do a search to find it.
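[The "multiple columns" idea above is simply wrapping one long column into several sheet-height columns. A rough sketch; the 65,536-row height is the Excel 97-2003 sheet limit, everything else is illustrative:]

```python
# Sketch: fold a single long column of values into multiple columns,
# each no taller than one Excel sheet (65,536 rows in Excel 97-2003).
SHEET_ROWS = 65536

def fold_into_columns(values, height=SHEET_ROWS):
    """Return a list of columns, each holding at most `height` values."""
    return [values[i:i + height] for i in range(0, len(values), height)]
```

With 256 columns per sheet in that era, one sheet could in principle hold about 16.7 million values this way, at the cost of formulas having to span columns.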
Parsing Huge Files of Text
Not sure what you are trying to do with the data, but if, as I suspect, you
are trying to carry out some statistics, I would avoid Excel and Access. Have a look at SPSS: it can handle a virtually unlimited number of rows and provides all the stats and graphics you are likely to need. I am a user of all three products and each has its place, although the edges are sometimes blurred!

--
Cheers
Nigel
Parsing Huge Files of Text
Hi,

What a dilemma! Write a macro to do the following. Using the File Open method you can implement a "start at row" option: import chunks of data, say 20,000 rows at a time, analyse each chunk, save it as "Rows 1 to 20000.txt", and repeat as many times as it takes. The next save-as file name would be "Rows 20001 to 40000.txt". You may end up with 50 or more text files. Depending on the length of the cell values, you could then use another macro to open the first file for append, open the next file to read its data, and append that data to the first file. I will not give you the code here because you probably know it already. If you need help, let me know.

See my stuff at:
http://www.geocities.com/excelmarksway
http://au.geocities.com/windsofmark

-Mark
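[The two macros described above can be sketched outside VBA as well: one pass that splits the source into 20,000-row files, and one that appends every later file onto the first. The "Rows A to B.txt" names and the 20,000-row chunk follow the post; the code itself is an assumed illustration, not Mark's macro:]

```python
# Sketch of the two-step approach described above:
# 1) split a big text file into 20,000-row pieces named "Rows A to B.txt";
# 2) append every later piece onto the first one.
CHUNK = 20000

def split_file(src, chunk=CHUNK):
    """Split src into chunk-sized text files; return the file names."""
    names, buf, start = [], [], 1
    with open(src, encoding="utf-8") as f:
        for line in f:
            buf.append(line)
            if len(buf) == chunk:
                name = "Rows %d to %d.txt" % (start, start + chunk - 1)
                with open(name, "w", encoding="utf-8") as out:
                    out.writelines(buf)
                names.append(name)
                start += chunk
                buf = []
    if buf:  # final partial chunk
        name = "Rows %d to %d.txt" % (start, start + len(buf) - 1)
        with open(name, "w", encoding="utf-8") as out:
            out.writelines(buf)
        names.append(name)
    return names

def append_files(names):
    """Append the contents of every later file onto the first one."""
    with open(names[0], "a", encoding="utf-8") as first:
        for name in names[1:]:
            with open(name, encoding="utf-8") as f:
                first.write(f.read())
```

Each 20,000-row piece fits comfortably under the 65,536-row sheet limit, so every intermediate file can be opened and analysed in Excel before being merged back.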
Powered by vBulletin® Copyright ©2000 - 2025, Jelsoft Enterprises Ltd.
ExcelBanter.com