Login session web scraping in R

Dear all

I'm working on a scraping tool for a couple of sites. However, I have to log in to one site before I can scrape its data.

I tried the approach from another discussion: https://stackoverflow.com/questions/38844041/web-scrape-password-protected-website-but-there-are-errors

However, it does not work for me:


jungoinlog <- html_session("https://app.jungo.nl/login")
#> Error in html_session("https://app.jungo.nl/login"): could not find function "html_session"

login <- jungoinlog %>%
  html_node("input") %>%
  html_form() %>%
  set_values(login = "username", password = "password**")
#> Error in jungoinlog %>% html_node("input") %>% html_form() %>% set_values(login = "username", : could not find function "%>%"

jungo1 <- jungoinlog %>%
  submit_form(login) %>%
  read_html()
#> Error in jungoinlog %>% submit_form(login) %>% read_html(): could not find function "%>%"

For some reason, I cannot read the HTML.

Thanks in advance

All the errors you are showing imply that you are not loading the libraries you want to use before trying to use them. You are missing the library() calls at the top of your script.


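For the rvest-based code shown above, the missing lines are likely something like this (assuming only rvest is needed; rvest also re-exports the magrittr pipe, so loading it makes `%>%` available too):

```r
library(rvest)     # provides html_session(), html_form(), set_values(), submit_form()
# rvest re-exports the magrittr pipe, so %>% works once rvest is loaded;
# if you prefer, load it explicitly as well:
library(magrittr)
```

Note that library() calls are per R session: if each file is run in a fresh session, each file needs its own library() calls at the top.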
Thanks for the response. I've already included those libraries in the first file I scrape (I have a separate file for each site).

I tried this:

url <- "https://app.jungo.nl/login"
my_session <- html_session(url) # create a persistent session
unfilled_forms <- html_form(my_session) # find all forms in the web page
login_form <- unfilled_forms[[1]] # select the form you need to fill
filled_form <- set_values(login_form, username = "USERNAME", password = "PASSWORD") # fill the form
login_session <- submit_form(my_session, filled_form) # now you're logged in

But it also does not work.

Kind regards
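One likely culprit: in rvest 1.0 and later, these functions were renamed, so the old names may warn or fail even with library(rvest) loaded. A minimal sketch of the same login flow with the current names (the URL is taken from the posts above; the form field names `username`/`password` and the credentials are placeholders you would need to check against the actual login form):

```r
library(rvest)  # rvest >= 1.0; also re-exports the magrittr pipe %>%

url <- "https://app.jungo.nl/login"
my_session <- session(url)                      # was html_session()
login_form <- html_form(my_session)[[1]]        # first <form> on the page
filled_form <- html_form_set(login_form,        # was set_values()
                             username = "USERNAME",
                             password = "PASSWORD")
login_session <- session_submit(my_session, filled_form)  # was submit_form()

# From here, scrape pages within the authenticated session, e.g.:
# page <- session_jump_to(login_session, "https://app.jungo.nl/somepage") %>%
#   read_html()
```

Inspecting `html_form(my_session)` first tells you the real field names to pass to `html_form_set()`; they are not always `username` and `password`.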

This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.