Optimizing Complex Webmaps Using Vector Tiles, Caching, and PostGIS
11-04, 15:30–16:00 (America/New_York), Lake Fairfax

Our leaflet.js-based webmap had been providing users with a dependable, but frustratingly slow, way to create spatial queries. We share our experience transitioning to Vector Tiles and other FOSS libraries, which have greatly sped up our application.


In 2018, UMD's Center for Advanced Transportation Technology (CATTLab) released its first iteration of the Trip Analytics project. This web application was designed for use by transportation planners and operations managers to help them analyze both recent and historical traffic trends, using licensed vehicle trip data from providers such as Inrix and Geotab.

The application guides users through the creation of a geospatial query using a webmap, which is then submitted to the back-end for processing. The team has made continual improvements to the front-end user experience by incorporating open source libraries that have become more widely adopted since the project’s inception. In this talk, we will present our changes and lessons learned. The sections below briefly describe a few of the changes we intend to discuss.

  1. Transition to Vector Tiles:
    When this project was first developed, leaflet.js was the recommended webmapping library. As such, our development focused on sending GeoJSON back and forth between the PostGIS database and the front-end. We primarily work with political boundaries at various resolution levels (country, state, county), which can get quite complex. Moving over to Vector Tiles allowed us to display amounts of data that were previously unfeasible: where we initially had to limit users to viewing boundaries only within a smaller selected region of interest, we can now display the entire country’s worth of data (a rendering sketch follows this list).

  2. pg_tileserv:
    In the past year, we have started incorporating pg_tileserv into our project in order to reduce our own development costs. Before this, we worked with a custom tile server developed in-house. Using pg_tileserv allows our team’s developers to iterate more quickly, as we can access the geospatial columns we need and implement custom MVT functions with minimal friction (see the function-layer sketch after this list).

  3. Moving geospatial processing from turf.js to PostGIS:
    Our original development efforts focused on producing a rapid prototype. As such, much of our geospatial processing was baked into the front-end using turf.js. This worked initially, but pushing all of this work onto the browser was not scalable. One example is the calculation of time zones for a user’s selected region. When working only within the United States, it was trivial to intersect a user’s selection with the five possible time zones. Once we began working with global data sets, we realized we had to move this to the back-end in PostGIS. Now, the user’s selection ID is sent to the back-end, where a PostGIS function determines the intersection of the selection’s geometry with all global time zones (sketched after this list). To further optimize the calculation, we have precalculated and stored the time zones for many popular regions of interest.

  4. Redis Caching and TanStack Query:
    Pushing processing to PostGIS didn't eliminate all processing speed concerns. There’s no way around the fact that operations such as bounding boxes and intersections take time to compute. Proper use of these two libraries allowed us to avoid repeating that work: Redis caches expensive results on the server, while TanStack Query caches and deduplicates the same requests in the browser (a cache-aside sketch follows this list).

  5. The value of proper DB design and learning PostgreSQL:
    During the development process, we went through multiple methods for storing users’ query data. We’ll describe how we settled on our current method for storing user-defined areas in a custom table, how this table facilitates some of our development processes, and some pitfalls where we did not make use of the table initially (and really wished we had later on). We’ll also talk about the benefits we found as our front-end developers became more familiar with PostGIS functions.
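
The sketches referenced above are illustrative rather than excerpts from our codebase; endpoint, table, and column names are assumptions, and the exact renderer and drivers may differ. First, the Vector Tiles transition from item 1: a minimal sketch assuming MapLibre GL JS as the renderer (Leaflet can also consume vector tiles via plugins) and a pg_tileserv-style {z}/{x}/{y} endpoint for a hypothetical county boundary table.

```ts
import maplibregl from 'maplibre-gl';

// Hypothetical tile endpoint; pg_tileserv publishes tables at /{schema}.{table}/{z}/{x}/{y}.pbf.
const COUNTY_TILES = 'https://tiles.example.com/public.counties/{z}/{x}/{y}.pbf';

const map = new maplibregl.Map({
  container: 'map',
  style: 'https://demotiles.maplibre.org/style.json',
  center: [-98.5, 39.8],
  zoom: 4,
});

map.on('load', () => {
  // The whole country's worth of boundaries can be registered as a single vector source;
  // the renderer only fetches the tiles needed for the current viewport.
  map.addSource('counties', {
    type: 'vector',
    tiles: [COUNTY_TILES],
    minzoom: 0,
    maxzoom: 14,
  });
  map.addLayer({
    id: 'county-outlines',
    type: 'line',
    source: 'counties',
    'source-layer': 'public.counties', // layer name matches the published table
    paint: { 'line-color': '#3388ff', 'line-width': 1 },
  });
});
```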
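
For item 2, pg_tileserv can publish not only tables but also functions that take a (z, x, y) signature and return a tile as bytea. The sketch below registers such a function from a Node setup script using node-postgres; the boundaries table, its columns, and the function name are assumptions, not our actual schema.

```ts
import { Client } from 'pg';

// A minimal pg_tileserv "function layer". pg_tileserv serves functions with a
// (z, x, y) integer signature returning bytea at /{schema}.{function}/{z}/{x}/{y}.pbf.
// The boundaries table (name, geom in SRID 4326) is a hypothetical example.
const CREATE_MVT_FUNCTION = `
CREATE OR REPLACE FUNCTION public.boundaries_mvt(z integer, x integer, y integer)
RETURNS bytea AS $$
  WITH bounds AS (
    SELECT ST_TileEnvelope(z, x, y) AS geom
  ),
  mvtgeom AS (
    SELECT ST_AsMVTGeom(ST_Transform(b.geom, 3857), bounds.geom) AS geom,
           b.name
      FROM boundaries b, bounds
     WHERE ST_Intersects(b.geom, ST_Transform(bounds.geom, 4326))
  )
  SELECT ST_AsMVT(mvtgeom, 'boundaries_mvt') FROM mvtgeom;
$$ LANGUAGE sql STABLE PARALLEL SAFE;
`;

async function main(): Promise<void> {
  const client = new Client({ connectionString: process.env.DATABASE_URL });
  await client.connect();
  await client.query(CREATE_MVT_FUNCTION);
  await client.end();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```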
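
For item 3, the back-end side of the time zone lookup might look like the sketch below, using node-postgres and assumed table names (user_areas for stored selections, tz_world for global time zone polygons).

```ts
import { Pool } from 'pg';

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Given a stored selection's ID, let PostGIS find every global time zone the
// selection's geometry intersects. Table and column names are assumptions:
// user_areas(id, geom) and tz_world(tzid, geom), stored in the same SRID.
export async function timeZonesForSelection(selectionId: number): Promise<string[]> {
  const { rows } = await pool.query<{ tzid: string }>(
    `SELECT DISTINCT tz.tzid
       FROM tz_world AS tz
       JOIN user_areas AS ua
         ON ST_Intersects(ua.geom, tz.geom)
      WHERE ua.id = $1`,
    [selectionId],
  );
  return rows.map((r) => r.tzid);
}
```

As noted above, results for frequently requested regions of interest can be precalculated and stored, so a query like this only has to run for novel selections.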
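
Finally, for item 4, Redis sits in front of the expensive PostGIS calls on the server. Below is a cache-aside sketch using ioredis; the key scheme, one-hour TTL, and table name are illustrative.

```ts
import Redis from 'ioredis';
import { Pool } from 'pg';

const redis = new Redis(process.env.REDIS_URL ?? 'redis://localhost:6379');
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Cache-aside: compute a selection's bounding box in PostGIS once, then serve
// repeats from Redis. user_areas(id, geom), the key scheme, and the one-hour
// TTL are all assumptions for illustration.
export async function boundingBoxForSelection(selectionId: number): Promise<string> {
  const key = `bbox:${selectionId}`;

  const cached = await redis.get(key);
  if (cached !== null) return cached;

  const { rows } = await pool.query<{ bbox: string }>(
    `SELECT ST_AsGeoJSON(ST_Envelope(geom)) AS bbox FROM user_areas WHERE id = $1`,
    [selectionId],
  );
  const bbox = rows[0].bbox;

  await redis.set(key, bbox, 'EX', 3600); // expire after an hour
  return bbox;
}
```

On the client, wrapping the corresponding fetch in a TanStack Query useQuery hook keyed on the selection ID gives the analogous effect in the browser: repeated renders and repeat visits to the same selection reuse the cached result instead of re-requesting it.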