I am exploring the idea of purchasing a property in Brisbane and am thinking through the parameters that influence pricing. In my opinion, travel time to the CBD should be one input. To explore that idea, I first needed to map travel times. This post walks through the process of using geo-computational tools in R and Google Maps to do exactly that.
I don’t want to map the entire state, only a region within 20 km of the CBD. To reflect this, I filter down to just the CBD, generate a buffer to use as a mask, and transform everything into EPSG:4326, as this is what I will ultimately send to Google’s Distance Matrix API.
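Here is a minimal sketch of that masking step using sf. The CBD coordinates and the projected CRS (GDA94 / MGA zone 56) are my assumptions; the key point is to buffer in a metric CRS before transforming back to EPSG:4326.

```r
library(sf)

# Approximate CBD point (Queen Street Mall) in lat/lon
cbd <- st_sfc(st_point(c(153.0260, -27.4679)), crs = 4326)

# Buffer in a projected CRS so the 20 km distance is in metres,
# then transform back to EPSG:4326 for the API
cbd_proj   <- st_transform(cbd, 28356)        # GDA94 / MGA zone 56
mask       <- st_buffer(cbd_proj, dist = 20000)
mask_wgs84 <- st_transform(mask, 4326)
```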
The Distance Matrix API (https://developers.google.com/maps/documentation/distance-matrix/start) takes multiple origin and destination points and returns a JSON response containing the key data I want.
For simplicity’s sake, I used a single origin and destination for each API call. The destination was obvious: the enduring spiritual epicentre of Brisvegas, the Queen Street Mall Hungry Jacks. I still needed a grid of starting points to complete the API calls.
For my gridded search, I compromised on a spatial resolution of 400 m so as not to exhaust my API allocation while still having enough fidelity to identify trends.
I used the awesome purrr::cross_df function to combine a list of latitudes and longitudes into every combination of points in a data frame, and by intersecting with the BNE object, I dropped points that are not on land.
I deliberately did not use sf::st_make_grid; it was simply faster and cleaner to use cross_df in this particular case.
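A rough sketch of how that gridding step can look. The bounding coordinates and step sizes are assumptions (roughly 400 m at Brisbane’s latitude), and BNE stands in for the land polygon mentioned above.

```r
library(purrr)
library(sf)

# Every lat/lon combination at ~400 m spacing
grid <- cross_df(list(
  lon = seq(152.85, 153.25, by = 0.0040),  # ~400 m east-west at ~27.5°S
  lat = seq(-27.65, -27.30, by = 0.0036)   # ~400 m north-south
))

# Convert to sf points and keep only those that fall on land
pts <- st_as_sf(grid, coords = c("lon", "lat"), crs = 4326, remove = FALSE)
pts_on_land <- st_intersection(pts, BNE)
```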
Building the API Call
I already have a developer account with Google from past projects, so I only needed to create new credentials and restrict them to the Distance Matrix API. I won’t go over that here; there is plenty of great documentation online.
There are a few things to be aware of when using this API (see the sketch after this list):
No spaces between latitudes and longitudes.
Coordinates are specified as latitude, then longitude.
Default units are metric, but I prefer to be explicit.
Departure times need to be in the future (they were when I wrote the code).
Append your API key last.
Use ‘&’ to concatenate parameters.
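Putting those rules together, a minimal URL builder might look like the following. The `api_key` and `departure` values (the latter a future UNIX timestamp) are placeholders you would supply yourself.

```r
library(dplyr)
library(glue)

base_url <- "https://maps.googleapis.com/maps/api/distancematrix/json"

urls <- pts_on_land %>%
  mutate(url = glue(
    "{base_url}?units=metric",
    "&origins={lat},{lon}",               # latitude first, no spaces
    "&destinations=-27.4679,153.0260",    # Queen Street Mall (approx.)
    "&departure_time={departure}",
    "&key={api_key}"
  ))
```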
This yields a data frame with one fully formed request URL per origin point.
Calling the API
With the data frame of points, I mapped over the URL column and stored each GET response as a list-column. I wrote a small helper to delay the calls, mainly so I could verify that I wasn’t running up a terrifying bill, then pushed the results out to an .RDS file in case I needed them later.
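A sketch of that throttled caller; the one-second delay and the file name are assumptions.

```r
library(httr)
library(purrr)
library(dplyr)

# Wrap GET with a pause so calls (and costs) stay easy to monitor
slow_get <- function(url, delay = 1) {
  Sys.sleep(delay)
  GET(url)
}

results <- urls %>%
  mutate(response = map(url, slow_get))

# Cache the raw responses so the API never has to be hit twice
saveRDS(results, "distance_matrix_responses.rds")
```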
I wrote another helper function to parse the JSON tree structure.
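The response JSON nests the values I care about under rows -> elements, so a small parser per response is enough. A minimal sketch, assuming the standard Distance Matrix response shape:

```r
library(httr)
library(purrr)
library(dplyr)
library(tidyr)

# Pull duration and distance out of a single parsed response
parse_result <- function(resp) {
  element <- content(resp, as = "parsed")$rows[[1]]$elements[[1]]
  tibble::tibble(
    duration_s = element$duration$value,  # travel time in seconds
    distance_m = element$distance$value   # route distance in metres
  )
}

results <- results %>%
  mutate(parsed = map(response, parse_result)) %>%
  unnest(parsed)
```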
I really like this design pattern: simplify the problem into discrete steps, write a function that solves an individual case, then map that solution over all inputs and repeat. I find the tidyverse packages really support this paradigm.
Plotting the results
From here, it is straightforward to plot the results as points in ggplot. For my model, however, I took the CSV into QGIS and used a TIN interpolation to fit a surface, filling in the gaps between observations, then brought the resulting TIF back into R as a raster.
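For reference, a quick version of both steps in R; the column and file names are assumptions.

```r
library(ggplot2)
library(raster)

# Point plot of travel time in minutes
ggplot(results, aes(lon, lat, colour = duration_s / 60)) +
  geom_point(size = 0.8) +
  scale_colour_viridis_c(name = "Minutes to CBD") +
  coord_quickmap()

# After the TIN interpolation in QGIS, read the surface back in
travel_surface <- raster("travel_time_tin.tif")
plot(travel_surface)
```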
A few neat observations:
There is a clear corridor along the M3 in the South West corner.
Travel time is generally proportional to distance (as expected), but there are some isolated pockets in the North West and West that are above the expected duration.
It is interesting to see the asymmetry depending on the side of the river (Bulimba / Teneriffe).