Hi, I'm relatively new to RSelenium. I'm scraping content from dynamic websites by using this package to set up a remote driver and access the desired content through Firefox. The issue is that the quality of the content I get seems to depend on my computer's power to generate the website's HTML and process it on each iteration of my loop. I was wondering whether using
rsDriver(browser="phantomjs")
instead of
rsDriver(browser="firefox")
might improve the performance of my requests.
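For context, here is a minimal sketch of how I currently start the driver, plus a headless-Firefox variant I've been considering as another option (the port numbers and the `moz:firefoxOptions` capability are just my reading of the docs, so please correct me if this isn't the right way to pass them):

```r
library(RSelenium)

# Current setup: full (headed) Firefox via a local Selenium server
driver <- rsDriver(browser = "firefox", port = 4545L)

# Variant I'm considering: headless Firefox, which should avoid the cost
# of rendering the browser UI. Passed through extraCapabilities -- this
# is my guess at the correct capability structure.
driver_headless <- rsDriver(
  browser = "firefox",
  port = 4546L,
  extraCapabilities = list(
    "moz:firefoxOptions" = list(args = list("--headless"))
  )
)

# Typical use inside my loop: navigate, grab the rendered HTML, parse it
remDr <- driver_headless$client
remDr$navigate("https://example.com")
page_html <- remDr$getPageSource()[[1]]

# Clean up when done
remDr$close()
driver_headless$server$stop()
```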
Any help with this, or any advice or comments about scraping dynamic websites with RStudio, would be appreciated.
Thanks