Hello all, a first-timer here. I am iterating through a list of names and locations, then calling the Google Maps API to convert each location to a country. Most of the time, this works very well. There are a few weird corner cases, and I would like your advice on reducing my dependency on hardcoded exceptions.
Reprex 1:
Asking Google for the country of "Philadelphia, Pennsylvania, United States" isn't too bad:
test_philly <- geocode("Philadelphia, Pennsylvania, United States", output = 'all')
gives me:
$results[[1]]$address_components[[4]]
$results[[1]]$address_components[[4]]$long_name
[1] "United States"
and most of the time, $results[[1]]$address_components[[4]]$long_name is exactly what I want.
Reprex 2:
Naturally, the world isn't that simple. For some reason, 'Ireland' returns multiple results:
> test_ireland <- geocode("Ireland", output = 'all')
which returns a $results[[1]] that does not contain a country. Sigh. The country is in $results[[2]]:
$results[[2]]$address_components[[1]]$types[[1]]
[1] "country"
Now, if I could find "country" in any of the results, I could take its index (I didn't know the word for the element pointer) and simply substitute ...$types[[1]] with $long_name and get the value of the country. My main problem is that I can't predict how many results I get per invocation of the API, or where they will put my answer.
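The closest I've come is a plain loop that scans every result and every address component, keying off the "types" field instead of a fixed index. This is just a sketch; find_country is a name I made up, and it assumes the list shape that geocode(..., output = 'all') returns:

```r
# Sketch of a type-based search over the nested geocode result.
# `res` is assumed to be the list returned by geocode(..., output = 'all').
find_country <- function(res) {
  for (result in res$results) {
    for (comp in result$address_components) {
      # A component's $types is a list of strings; match on "country"
      # instead of assuming a fixed position like [[4]].
      if ("country" %in% unlist(comp$types)) {
        return(comp$long_name)
      }
    }
  }
  NA_character_  # no country component found anywhere
}
```

so find_country(test_ireland) should give "Ireland" regardless of which result the country lands in, but I'm not sure whether looping like this is idiomatic.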
I thought to use:
purrr::map(geocode('location', output = 'all'), "long_name")
just to see if I get anything back, but it comes back NULL.
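(I suspect the problem is that map only reaches one level deep, and the top-level list has no long_name element, so every extraction misses.) A purrr version that digs down to the components first might look like this; extract_countries is my own made-up name, and the result shape is assumed from the output = 'all' structure above:

```r
library(purrr)

# Sketch: flatten all address_components across all results,
# keep only components typed "country", then pull their long_name.
extract_countries <- function(res) {
  comps <- flatten(map(res$results, "address_components"))
  hits  <- keep(comps, ~ "country" %in% unlist(.x$types))
  map_chr(hits, "long_name")
}
```

The idea is that "long_name" (without the $) is purrr's extraction shorthand, but it has to be applied at the component level, not at the top of the returned list.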
Suggestions? Am I making this too hard?