Adding "Breaks" in "htmlEscape" - HTML

I am following this tutorial here (https://rstudio.github.io/leaflet/popups.html):
library(htmltools)
library(leaflet)
df <- read.csv(textConnection(
"Name,Lat,Long
Samurai Noodle,47.597131,-122.327298
Kukai Ramen,47.6154,-122.327157
Tsukushinbo,47.59987,-122.326726"
))
leaflet(df) %>% addTiles() %>%
  addMarkers(~Long, ~Lat, popup = ~htmlEscape(Name))
Now, I want the popups to display information about the name, the longitude, and the latitude (i.e. title + value). I would like it to say:
Name = Insert Restaurant Name Here
(new line)
Longitude = Insert Longitude Here
(new line)
Latitude = Insert Latitude Here
I thought that this could be done as follows:
leaflet(df) %>% addTiles() %>%
  addMarkers(~Long, ~Lat, popup = ~htmlEscape(df$Name, df$Lat, df$Long))
But this is giving me the following error:
Error in htmlEscape(df$Name, df$Lat, df$Long) : unused argument (df$Long)
I tried to read about this function (https://www.rdocumentation.org/packages/htmltools/versions/0.5.2/topics/htmlEscape), but there does not seem to be too much information on how to use it. I thought that maybe this might require "combining" all the arguments together:
leaflet(df) %>% addTiles() %>%
  addMarkers(~Long, ~Lat, popup = ~htmlEscape(c(df$Name, df$Lat, df$Long)))
But now this only displays the final argument (and that too, without the title).
Is htmlEscape() able to handle multiple arguments?
Thank you!
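For reference, htmlEscape() escapes a single character vector at a time, so it cannot combine several fields by itself. A minimal sketch of one common approach, reusing the df from the question: escape each field separately and join the pieces with <br/> tags, which leaflet popups render as line breaks.

```r
library(htmltools)
library(leaflet)

# htmlEscape() takes one vector at a time, so escape each field
# separately and join the pieces with <br/> tags; the popup renders
# the resulting HTML, turning each <br/> into a new line.
labels <- paste0(
  "Name = ", htmlEscape(df$Name), "<br/>",
  "Longitude = ", htmlEscape(df$Long), "<br/>",
  "Latitude = ", htmlEscape(df$Lat)
)

leaflet(df) %>%
  addTiles() %>%
  addMarkers(~Long, ~Lat, popup = labels)
```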

Lag in HTML Outputs within R

I found this post on stackoverflow to make an interactive map in R (second answer): Search button for Leaflet R map?
library(leaflet)
library(inlmisc)

df <- read.csv(textConnection(
'Name, Lat, Long
<b>location 1</b>,42.3401, -71.0589
<b>location 2</b>,42.3501, -71.0689'))

map <- leaflet(df) %>%
  addTiles() %>%
  setView(lng = -71.0589, lat = 42.3301, zoom = 12) %>%
  addMarkers(~Long, ~Lat, popup = ~Name, group = "marker")

map <- inlmisc::AddSearchButton(map, group = "marker", zoom = 15,
                                textPlaceholder = "Search here")
This code works and produces a map:
But for some reason, when I try to search for one of the locations (e.g. "location 1"), the map takes a long time to load and lags heavily. I have waited several minutes and was unable to actually search for either of these locations.
Can someone please help me figure out what I am doing wrong?
Thank you!
The problem occurs when there is no label given for the locations.
library(leaflet)
library(inlmisc)

df <- read.csv(textConnection(
'Name, Lat, Long, Label
<b>location 1</b>,42.3401, -71.0589, Loc 1
<b>location 2</b>,42.3501, -71.0689, Loc 2'))

map <- leaflet(df) %>%
  addTiles() %>%
  setView(lng = -71.0589, lat = 42.3301, zoom = 12) %>%
  addMarkers(~Long, ~Lat, popup = ~Name, group = "marker", label = ~Label) %>%
  inlmisc::AddSearchButton(group = "marker", zoom = 15,
                           textPlaceholder = "Search here")

How to pass a list of urls contained in a dataframe column into a leaflet map?

I want to make a map using leaflet so that the points in the map have popup notes. Each popup will have a clickable link to redirect to an Internet page. The URLs that will be inserted in such popups are in a column of my data frame, which has thousands of rows. Some toy data:
place <- c("a", "b", "c", "d", "e", "f")
thing <- c("potato","melon","black pepper", "bigfoot","black panther", "orchidaceae")
lat <- c(-17.456, 31.4009, 24.293, -8.956, 8.697, -25.257)
long <- c(-63.658,-111.144,-106.759,-81.029,-83.2052,-52.026)
urls <- c("https://en.wikipedia.org/wiki/Potato",
          "https://en.wikipedia.org/wiki/Melon",
          "https://en.wikipedia.org/wiki/Black_pepper",
          "https://en.wikipedia.org/wiki/Bigfoot",
          "https://en.wikipedia.org/wiki/Black_panther",
          "https://en.wikipedia.org/wiki/Orchidaceae")
d <- data.frame(place, thing, lat, long, urls)
And this is the code I've been trying to use to plot the map:
library(leaflet)
library(tidyverse)
content <- paste("The", thing,
                 "occurs near. You can find some information",
                 "<b><a href=d$urls>here</a></b>")

mymap <- d %>%
  leaflet() %>%
  addProviderTiles(providers$Esri.WorldImagery, group = "World Imagery") %>%
  addProviderTiles(providers$Stamen.TonerLite, group = "Toner Lite") %>%
  addLayersControl(baseGroups = c("Toner Lite", "World Imagery")) %>%
  addMarkers(label = thing,
             popup = content,
             icon = ~icons(iconUrl = "marker_red.png",
                           iconWidth = 28, iconHeight = 24)) %>%
  addMiniMap(
    toggleDisplay = TRUE,
    tiles = providers$Stamen.TonerLite
  ) %>%
  print()
The problem is that the word "here" in the popup is sort of clickable, but does not redirect me to any internet page. I don't know what to do in this situation where the URLs are contained in a column of my data frame. Besides, I have no experience working with HTML objects. Could anyone help me figure out a way to pass those URLs into the popup notes?
Thanks in advance!
The problem is with href=d$urls in content: the literal text d$urls is used as the URL, so the actual URL is never referenced. It can be resolved using the paste0 function.
The content should be
content <- paste("The", thing,
                 "occurs near. You can find some information",
                 paste0("<b><a href='", d$urls, "'>here</a></b>"))

mymap <- d %>%
  leaflet() %>%
  addProviderTiles(providers$Esri.WorldImagery, group = "World Imagery") %>%
  addProviderTiles(providers$Stamen.TonerLite, group = "Toner Lite") %>%
  addLayersControl(baseGroups = c("Toner Lite", "World Imagery")) %>%
  addMarkers(label = thing,
             popup = content,
             icon = ~icons(iconUrl = "marker_red.png",
                           iconWidth = 28, iconHeight = 24)) %>%
  addMiniMap(
    toggleDisplay = TRUE,
    tiles = providers$Stamen.TonerLite
  ) %>%
  print()
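As a usage note, the same per-row popup string can also be built with sprintf(), which keeps the quoting around the href attribute explicit (a sketch using the toy data d from the question):

```r
# Build one popup string per row; sprintf() vectorizes over d$thing
# and d$urls, and the single quotes delimit the href attribute value.
content <- sprintf(
  "The %s occurs near. You can find some information <b><a href='%s'>here</a></b>",
  d$thing, d$urls
)
```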

RVEST - Extracting text from table - Problems accessing the right table

I would like to extract the values in the table on the top right side of this Webpage:
https://www.timeanddate.de/wetter/deutschland/karlsruhe/klima
(Wärmster Monat : VALUE, Kältester Monat: VALUE, Jahresniederschlag: VALUE)
Unfortunately, if I use html_nodes() with SelectorGadget's result for the specific value, I receive the values from the table at the top of this link instead:
https://www.timeanddate.de/stadt/info/deutschland/karlsruhe
(The webpages are similar: if you click "Uhrzeit/Übersicht" on the top bar, you access the second page and table; if you click "Wetter" --> "Klima", you access the first page/table - the one I want to extract values from!)
library(rvest)

num_link <- "https://www.timeanddate.de/wetter/deutschland/Karlsruhe/klima"
num_page <- read_html(num_link)

rain_year <- num_page %>% html_nodes("#climateTable > div.climate-month.climate-month--allyear > div:nth-child(3) > p:nth-child(1)") %>% html_text()
temp_warm <- num_page %>% html_nodes("#climateTable > div.climate-month.climate-month--allyear > div:nth-child(2) > p:nth-child(1)") %>% html_text()
temp_cold <- num_page %>% html_nodes("#climateTable > div.climate-month.climate-month--allyear > div:nth-child(2) > p:nth-child(1)") %>% html_text()
I get character(0) for each variable. :(
THANK YOU IN ADVANCE!
You can use the html_table() function in rvest, which is pretty good by now. It makes extraction a bit easier, though I do recommend learning to identify the right CSS selectors as well, since html_table() does not always work. html_table() always returns a list of all tables on the webpage, so in this case the steps are:
get the html
get the tables
index the right table (here there is only one)
reformat a little to extract the values
library(rvest)
library(tidyverse)
result <- read_html("https://www.timeanddate.de/wetter/deutschland/karlsruhe/klima") %>%
  html_table() %>%
  .[[1]] %>%
  rename('measurement' = 1,
         'original' = 2) %>%
  mutate(value_num = str_extract_all(original, "[[:digit:]]+\\.*[[:digit:]]*") %>% unlist())
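If only the three headline values are needed, one way (a sketch, assuming the German row labels on the page are unchanged and appear verbatim in the first column of the parsed table) is to filter the tidied result:

```r
library(dplyr)

# Keep only the three rows of interest; the label strings are
# assumptions based on the page as described in the question.
result %>%
  filter(measurement %in% c("Wärmster Monat", "Kältester Monat", "Jahresniederschlag")) %>%
  select(measurement, value_num)
```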

Not getting expected output in web scraping with R

I have written a small program that scrapes the Google search results page, and I want all the URLs on that page. But I'm getting character(0) in the output. Please help me.
CODE -
library("rvest")
r_h = read_html("https://www.google.com/search?q=google&oq=google&aqs=chrome.0.69i59j0l2j69i60l2j69i65.1101j0j7&sourceid=chrome&ie=UTF-8")
d = r_h %>% html_nodes(".iUh30") %>% html_text() %>% as.character()
That class is not present in the returned html. You need a different selector strategy and then extract the href attribute.
library(rvest)
library(stringr)
r_h <- read_html("https://www.google.com/search?q=google&oq=google&aqs=chrome.0.69i59j0l2j69i60l2j69i65.1101j0j7&sourceid=chrome&ie=UTF-8")
d <- r_h %>% html_nodes(".jfp3ef > a") %>% html_attr("href")

for (i in d) {
  res <- str_match_all(i, '(http.*?)&')
  print(res[[1]][, 2])
}
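To see what the loop's regex does in isolation, here is a standalone sketch with a made-up href of the general shape Google's redirect links take:

```r
library(stringr)

# Google wraps each result in a redirect URL of roughly this shape;
# the lazy capture group (http.*?) grabs the target URL up to the
# first "&" that follows it.
href <- "/url?q=https://www.example.com/page&sa=U&ved=abc"
str_match_all(href, "(http.*?)&")[[1]][, 2]
# → "https://www.example.com/page"
```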

How to troubleshoot missing OSM tiles in leaflet html widget?

I need to create a web page that includes an interactive map where users can see popup information about data collected at many locations. I am using RStudio and leaflet on Windows and want to use OSM base map tiles.
My leaflet map works fine in Rstudio viewer. However, when the 'knitted' page is viewed in Firefox, no OSM map tiles appear although other components of the map are okay. Similarly OSM tiles missing in saved html widget.
I made a simple example to demonstrate.
```{r}
library(leaflet)
library(htmlwidgets)
rand_lng <- function(n = 10) rnorm(n, 145.7, .01)
rand_lat <- function(n = 10) rnorm(n, -17, .01)

m <- leaflet() %>%
  addTiles(group = "OSM (default)") %>%
  addProviderTiles("Esri.WorldImagery", group = "Esri.WorldImagery") %>%
  addCircleMarkers(rand_lng(5), rand_lat(5), group = "Points") %>%
  addLayersControl(
    baseGroups = c("OSM (default)", "Esri.WorldImagery"),
    overlayGroups = c("Points"),
    options = layersControlOptions(collapsed = FALSE)) %>%
  setView(lng = 145.7, lat = -17, zoom = 12)
m
saveWidget(m, "leaflet_OSMplusEsri.html")
```
This is the output I get in the RStudio viewer, with OSM tiles selected and displayed correctly. When selected, the Esri tiles are correct as well.
This is the html file shown in Firefox, where the OSM tiles do not display despite being selected.
I've been searching all day without discovering how to troubleshoot this. As a newbie, perhaps I'm missing something obvious?
I would be very grateful for advice: how can I troubleshoot this problem in simple steps?
Took a long time, but I eventually resolved this. In case it helps anyone else, here is the revised version that works properly.
```{r}
library(leaflet)
library(htmlwidgets)
rand_lng <- function(n = 10) rnorm(n, 145.7, .01)
rand_lat <- function(n = 10) rnorm(n, -17, .01)

m <- leaflet() %>%
  addProviderTiles(providers$OpenStreetMap, group = "OSM") %>%
  addProviderTiles(providers$Esri.WorldImagery, group = "Esri") %>%
  addCircleMarkers(rand_lng(5), rand_lat(5), group = "Points") %>%
  addMiniMap() %>%
  addLayersControl(
    baseGroups = c("OSM", "Esri"),
    overlayGroups = c("Points"),
    options = layersControlOptions(collapsed = FALSE)) %>%
  setView(lng = 145.7, lat = -17, zoom = 12)
m
saveWidget(m, "leaflet_OSMplusEsri.html")
```
The cause of the problem was addTiles() with its default values. I'm not sure why it did not work; I'm hoping someone might be able to explain.
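One way to test whether the default tile URL was the culprit (a guess, since the exact cause was never pinned down here) is to pass addTiles() the OSM tile template explicitly over https, together with attribution:

```r
library(leaflet)

# addTiles() with an explicit https OSM tile template and attribution;
# if this renders in the saved widget while the default does not,
# the default URL/scheme was the issue.
leaflet() %>%
  addTiles(
    urlTemplate = "https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png",
    attribution = "&copy; OpenStreetMap contributors"
  ) %>%
  setView(lng = 145.7, lat = -17, zoom = 12)
```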