Travel times and trip distances are at the core of urban economics. Many models of competition, housing markets, and the like rely on travel times or distances to explain the variance in economic outcomes. Determining travel times, however, especially non-free-flow travel times (i.e., times that account for congestion), is no trivial task.
Google Maps offers a unique opportunity to compute travel times for an origin and destination pair by different modes, i.e., automobile, transit, and walking. The technology is still in beta, but it offers realistic travel-time estimates for intra-urban trips in many North American cities.
In a recent issue of the Stata Journal (Volume 11, No. 1), Adam Ozimek and Daniel Miles present their code (now available in Stata) that can not only geocode addresses (determine longitude and latitude) but also determine travel times by different modes using Google Maps.
I thought R must already have some utility for this available through CRAN, but I could not find one. R does offer several interesting spatial analytical capabilities under the CRAN Task View: Analysis of Spatial Data. However, not much is available for harnessing Google's analytics to determine distances or travel times. I hope I am wrong and have simply missed the package that offers these capabilities in R.
Also worthy of mention is the TravelR project, which is in pre-alpha stage but, once completed, will allow R users to develop travel demand models capable of forecasting congested travel times on street networks, in addition to other capabilities. Further details about TravelR are available from Jeremy Raw.
The TravelR project looks interesting. Getting travel distance and time from Google Maps is easy using the rjson library:
library(rjson)
# Query the (undocumented) Google Maps JSON navigation endpoint for driving directions
json_file <- "http://maps.google.com/maps/nav?output=js&q=from:%20Montreal%20to:%20Toronto"
json_data <- fromJSON(paste(readLines(json_file), collapse = ""))
json_data$Directions$Duration$html               # total travel time
json_data$Directions$Routes[[1]]$Distance$meters # total distance in metres
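The same rjson approach works with Google's documented Directions API, which also exposes the travel modes mentioned in the post (driving, walking, bicycling, transit). A minimal sketch; the endpoint and the `mode` parameter come from the Directions API, while the `directions_url` helper is just an illustrative name:

```r
# Build a Directions API request URL for a given travel mode.
# Valid modes for the Directions API include "driving", "walking",
# "bicycling", and "transit".
directions_url <- function(origin, destination, mode = "driving") {
  paste("http://maps.google.com/maps/api/directions/json?origin=",
        URLencode(origin), "&destination=", URLencode(destination),
        "&mode=", mode, "&sensor=false", sep = "")
}

u <- directions_url("Montreal", "Toronto", mode = "transit")
# Then fetch and parse as above:
# json_data <- fromJSON(paste(readLines(u), collapse = ""))
# json_data$routes[[1]]$legs[[1]]$duration$text
```

Note that the documented API returns a different JSON layout than the `maps/nav` endpoint: the duration sits under `routes[[1]]$legs[[1]]$duration` rather than `Directions$Duration`.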
I'm working on just this kind of travel-distance calculation, and I use the same method (rjson).
Here is my code, which specifies coordinates rather than city names:
library(RCurl) # to fetch the contents of a URL
library(rjson) # to parse JSON responses
# Example origin and destination coordinates
xlat <- 57.372801
xlong <- 2.016214
ylat <- 57.459688
ylong <- 2.790558
# Build the corresponding Directions API URL
z <- paste("http://maps.google.com/maps/api/directions/json?origin=", xlat, ",", xlong,
           "&destination=", ylat, ",", ylong, "&sensor=false", sep = "")
# Fetch and parse the JSON response
x <- fromJSON(getURL(url = z))
# Catch Google's rate limit on requests (it often triggers); the request
# must be retried inside the loop, otherwise the status never changes
while (x$status == "OVER_QUERY_LIMIT") {
  print("waiting 10 mins before retrying")
  Sys.sleep(10 * 60)
  x <- fromJSON(getURL(url = z))
}
# Total travel time of the first leg of the first route (named access is
# more robust than positional indexing into the parsed JSON)
TRAVEL_TIME <- x$routes[[1]]$legs[[1]]$duration$text
print(TRAVEL_TIME)
But I often get the OVER_QUERY_LIMIT message. Does anybody know how to avoid this annoyance gracefully?
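One common way to cope with OVER_QUERY_LIMIT is to pace requests and retry with an increasing pause. A sketch of the idea; the helper is generic, so `fetch` can be any function returning a parsed response with a `$status` field (e.g., a wrapper around `fromJSON(getURL(z))` from the code above), and the pause lengths are guesses rather than documented limits:

```r
# Retry a request while the service reports OVER_QUERY_LIMIT, doubling
# the pause after each attempt (exponential backoff).
query_with_backoff <- function(fetch, pause = 1, max_tries = 5) {
  res <- NULL
  for (i in seq_len(max_tries)) {
    res <- fetch()
    if (is.null(res$status) || res$status != "OVER_QUERY_LIMIT") {
      return(res)  # success, or a different error worth inspecting
    }
    Sys.sleep(pause * 2^(i - 1))  # wait longer before each retry
  }
  res  # still rate-limited after max_tries attempts
}
```

In practice, inserting a short `Sys.sleep()` between consecutive origin-destination queries also helps keep a batch under the limit in the first place.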