Like you, I used QueryTables extensively a few years ago. Since then the
web sites have become more complicated: requiring a user to log in, using
frames so that the actual URL needed is not displayed in the address line,
and also using Java. Virtually none of the modules I wrote then continue to
work. Even the Microsoft example given in
http://support.microsoft.com/default...b;en-us;213730 fails. For
what it is worth, I found the following sub, which detects an error 1004 and
retries the query.
Public Sub Get_Query_Data(a_strSheet As String, a_strConnection As String)
    Dim intErrorCount As Integer, intResponse As Integer
    On Error GoTo GetQueryData_Error
    intErrorCount = 0
    Worksheets(a_strSheet).Activate
GetQueryData_Retry:
    With ActiveSheet.QueryTables.Add(Connection:=a_strConnection, _
            Destination:=Range("A1"))
        .RefreshStyle = xlOverwriteCells
        .Refresh BackgroundQuery:=False
    End With
    Exit Sub
GetQueryData_Error:
    If Err.Number = 1004 Then 'This error occurs if the Web site cannot be found
        intErrorCount = intErrorCount + 1
        If intErrorCount > 10 Then
            intResponse = MsgBox("Error Count Exceeded", vbRetryCancel)
            If intResponse = vbCancel Then
                Exit Sub
            End If
        End If
        Resume GetQueryData_Retry
    End If
End Sub
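If it helps, a call might look like the sketch below. The sheet name and the connection string are placeholders only; substitute whatever sheet and URL you are actually querying.

    Sub Test_Get_Query_Data()
        'Example values only - use your own sheet name and URL
        Get_Query_Data "Sheet1", _
            "URL;http://example.com/somepage"
    End Sub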
Hope this helps.
--
Thanks in anticipation
"Hfly" wrote:
The issue is not the loop around the web query, it's the query itself. When I
run the macro, it loops through about 80 different values of "id", and all of
these "id" values, when combined with the URL in my code, form an existing
webpage.
When I run the macro, it goes on fine until the run-time error occurs. Then
I hit Debug, then continue running it, and keep doing that until it loads the
query okay. For the most part, it eventually finds the data, but some don't
load at all. Between April and October of last year, I ran this macro a few
times each week and never had a run-time error, so something is different
that's causing problems.
Robin responded to my theory with something that unfortunately makes sense.
So now the issue is error response. Instead of me having to manually hit
Debug and Continue with each run-time error, I'd like to have code that
recognizes the error and refreshes the QueryTable again. Any assistance with
that would be appreciated.
Thanks,
Hfly
"Don Guillett" wrote:
After a problem or two I got this to work for 7307 & 7308. It could be
improved.
Sub doloop()
    x = 3
    For Each c In [mylist]
        [a1].Select
        With Selection.QueryTable
            .Connection = _
                "URL;http://sports.espn.go.com/mlb/players/profile?statsId=" & c
            .WebSelectionType = xlSpecifiedTables
            .WebFormatting = xlWebFormattingNone
            .WebTables = "6"
            .WebPreFormattedTextToColumns = True
            .Refresh BackgroundQuery:=False
        End With
        Sheets("sheet1").Range("b3:r3").Copy Sheets("sheet2").Range("b" & x)
        x = x + 1
    Next c
End Sub
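For what it's worth, one way the loop could be improved is to trap error 1004 and pause briefly before retrying the same id, in case the site is just slow to respond. This is an untested sketch; the retry limit of 3 and the 5-second wait are arbitrary choices, not tested values.

    Sub doloop_retry()
        Dim x As Long, tries As Long
        x = 3
        For Each c In [mylist]
            tries = 0
    TryAgain:
            On Error GoTo QueryFailed
            [a1].Select
            With Selection.QueryTable
                .Connection = _
                    "URL;http://sports.espn.go.com/mlb/players/profile?statsId=" & c
                .WebSelectionType = xlSpecifiedTables
                .WebFormatting = xlWebFormattingNone
                .WebTables = "6"
                .WebPreFormattedTextToColumns = True
                .Refresh BackgroundQuery:=False
            End With
            On Error GoTo 0
            Sheets("sheet1").Range("b3:r3").Copy Sheets("sheet2").Range("b" & x)
            x = x + 1
        Next c
        Exit Sub
    QueryFailed:
        If Err.Number = 1004 And tries < 3 Then
            tries = tries + 1
            Application.Wait Now + TimeValue("0:00:05") 'pause 5 seconds, then retry
            Resume TryAgain
        End If
        MsgBox "Query failed for id " & c
    End Sub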
--
Don Guillett
SalesAid Software
"Don Guillett" wrote in message
...
As usual, post your macro for comments
--
Don Guillett
SalesAid Software
"Hfly" wrote in message
...
I'm using Excel 2002 SP3...
I wrote a macro over a year ago that performs several web queries and
copies/pastes data between each new query. Last year, after writing the
macro, the program would nicely zip through all of the queries and
everything would be great. The last time I ran the macro successfully was
in October 2004.
In April 2005, I ran the macro again. The URLs in question are still
exactly the same. The problem now is the run-time error '1004', "unable to
open" the URL. If I hit "Debug", then run it again to continue, it might
successfully refresh that web query and move on. It's hit or miss.
My theory is that the web query refresh method is not being patient enough
for data. I can immediately open up an IE window and load the same pages
easily. I'm wondering if the refresh method is not waiting long enough for
the data to load. Is there a way to set a parameter to tell it to wait
longer for the URL to open?
Alternately, I've attempted to write an error-handling mechanism, but the
best I can do is to get it to refresh one extra time. After that, it just
crashes. I know I'm flirting with an infinite loop, but it would be nice to
get the query to keep trying until it finds the data. I know it's out
there!
Any feedback would be greatly appreciated.
Thanks,
Hfly