Some of the Shiny apps we’ve developed for a financial trading firm have pretty consistent requirements:
- Show a lot of data, and highlight the important stuff.
- Update that data frequently.
The DataTables library (via the DT package) makes a lot of sense for those requirements. It’s easy to use – just pop in a data.frame and by default you get a sortable, searchable, pageable table with a nice default style.
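As a minimal sketch of that ease of use (the output id `trades` and the `mtcars` data are just placeholders here):

```r
library(shiny)
library(DT)

ui <- fluidPage(
  DT::dataTableOutput("trades")
)

server <- function(input, output, session) {
  # Any data.frame works; DT adds sorting, searching,
  # and paging on the client side by default.
  output$trades <- DT::renderDataTable({
    mtcars
  })
}

# shinyApp(ui, server)
```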
But then we noticed a client-side (browser) memory leak! Eventually the browser tab would crash, and the app had to be restarted.
After investigating, we noticed that the server-side rendering function `DT::renderDataTable` actually creates a NEW instance of the DataTable every time it updates, rather than just updating the data in the existing table.
For most apps, this isn’t a big deal – a table might update 2, 3, even 10 times based on user requests, but never enough to crash a tab.
But for these apps, updating 12 DataTables every 3 seconds means 240 new DataTable instances per minute! These apps are intended to stay open and running throughout an 8-hour trading day, so that was unacceptable.
It was not clear to us that generating new table instances was the default behavior, but thankfully RStudio’s documentation of the DT package offers a clear solution. See section 2.3 of this page, or their example Shiny app here.
We just needed to create the DataTable once on initialization with `DT::renderDataTable`, then create a `dataTableProxy` to continue interacting with that same table instance. After that, we put an `observe` in place to listen for reactive data updates on the server side, and call `replaceData` to insert the new data. Simple! And easy on the memory!
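The pattern looks roughly like this sketch. The `get_latest_trades()` helper and the 3-second `reactivePoll` are assumptions standing in for whatever reactive data source the app actually uses:

```r
library(shiny)
library(DT)

ui <- fluidPage(
  DT::dataTableOutput("trades")
)

server <- function(input, output, session) {
  # Hypothetical reactive data source, refreshed every 3 seconds
  live_data <- reactivePoll(3000, session,
    checkFunc = function() Sys.time(),
    valueFunc = function() get_latest_trades()  # assumed helper
  )

  # Render the DataTable ONCE, with the initial data only;
  # isolate() keeps this render from re-firing on every update
  output$trades <- DT::renderDataTable(isolate(live_data()))

  # The proxy points at the existing client-side table instance
  proxy <- DT::dataTableProxy("trades")

  # On each data update, swap the data into the same table
  # instead of building a new DataTable
  observe({
    DT::replaceData(proxy, live_data(), resetPaging = FALSE)
  })
}

# shinyApp(ui, server)
```

Keeping `resetPaging = FALSE` means a refresh doesn’t bounce the user back to page one of the table, which matters when updates arrive every few seconds.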
There is one catch: this did not work well with data.frames that require the promise pipe (`%...>%`) to split processing into another R session. That’s because the initial table gets rendered with `NULL` data, and then subsequent updates (actual data.frames) have different columns, of course. So it may be possible, but it wasn’t immediately obvious how to handle that.