Jun 07 2020 10:03 AM
Hi there,
I need to import data from more than 2,000 webpages into Excel (or any other database). I have the complete list of webpages. I found a few tutorials where Power Query was used to solve this, but when I use the Power Query method, many errors pop up even though the URLs are correct.
Can anyone help with this? Any method (Power Query or not) will suffice.
Thank you.
P.S. https://www.youtube.com/watch?v=dAjw9Vu8wYg
The tutorials I saw all followed the same approach as the video above.
Jun 07 2020 11:09 AM
That's too abstract. The URLs could be correct, but other steps that transform the data could be incorrect. Without a sample it's hard to say what could be done.
Jun 07 2020 07:46 PM
Thank you for the reply, sir.
By "errors" I meant that only some rows showed an error while others were fine (and sometimes, on a refresh, different rows show errors even though they had no error previously, and vice versa). So I presumed the problem cannot be with the URLs themselves. I also tried to replicate this in Google Sheets with the IMPORTHTML formula: if I run the formula in bulk, it shows errors again, but if the same formula is used one cell at a time, no error pops up.
Could it be that the website does not accept so many requests at the same time? If so, how can we increase the time gap between requests?
If any other method can be used, kindly elaborate on that too.
Thank you.
Jun 08 2020 12:35 AM
Solution: Yes, that could be the case. You may delay the execution of each call with Function.InvokeAfter(). More details at Using Function.InvokeAfter() In Power Query and other posts like Building delays into Power BI API queries — Function.InvokeAfter() and Google Maps API.
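A minimal sketch of what this could look like in Power Query M. It assumes the pages each contain an HTML table; the `Urls` table, the two-second delay, and the `{0}[Data]` parsing step are illustrative assumptions you would adapt to your own list and page structure:

```powerquery-m
let
    // Hypothetical single-column table of page addresses;
    // in practice this would come from your list of >2000 URLs.
    Urls = #table({"Url"}, {
        {"https://example.com/page1"},
        {"https://example.com/page2"}
    }),
    // Wrap the web call in Function.InvokeAfter so each request
    // waits 2 seconds before firing, spacing out the load on the site.
    FetchWithDelay = (url as text) =>
        Function.InvokeAfter(
            () => Web.Page(Web.Contents(url)){0}[Data],
            #duration(0, 0, 0, 2)
        ),
    // Add a column holding the table scraped from each page.
    Results = Table.AddColumn(Urls, "PageData", each FetchWithDelay([Url]))
in
    Results
```

From there you would expand the `PageData` column to combine all pages into one table. Increase the `#duration` value if the site still refuses bursts of requests.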