A client working on process improvement with a focus on ecological impact contacted us with a request, saying he was:
…interested in getting a demo app built in the next 30 days for a client and your skills could cut my development time (thinking of working up a Shiny app).
Well, great! We spoke on the phone later that week and learned a bit more about the requirements. Our client’s client wanted to:
…assess impact of changes to operations and equipment on energy use, costs and CO2 emissions.
We learned that the energy use data (electric and gas) could easily be exported from the utility's website, and that they had already built a model of expected energy consumption for the equipment as it was before the changes.
With that in mind, we sketched out a basic user interface that allowed the following:
- User could select a location from a list of facilities.
- User could upload the utility data in spreadsheet form, and the app would process it and append to any previously existing data.
- User could select a date range up to the full extent of available energy consumption data.
- User could then view a table of daily energy usage predictions. The model accounted for local daily temperature, weekdays versus weekends, and holidays.
- User could also view graphics that highlighted the forecast values, the model prediction error, and the change in usage over time.
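The prediction model in the list above combined a temperature effect with weekday/weekend and holiday adjustments. The app itself was built in R, but the shape of such a model can be sketched in Python; every coefficient, the holiday set, and the function names here are illustrative, not the fitted values from the real model.

```python
from datetime import date

# Illustrative coefficients -- the real model was fit to historical data.
BASE_LOAD_KWH = 120.0     # baseline daily usage
HDD_COEF = 4.5            # extra kWh per heating degree day
WEEKEND_FACTOR = 0.7      # usage drops on weekends and holidays
HOLIDAYS = {date(2015, 12, 25), date(2016, 1, 1)}  # hypothetical holiday set

def heating_degree_days(mean_temp_f, base_temp_f=65.0):
    """Degrees below the base temperature, floored at zero."""
    return max(0.0, base_temp_f - mean_temp_f)

def predict_daily_usage(day, mean_temp_f):
    """Predicted kWh for one day from temperature, weekday, and holidays."""
    usage = BASE_LOAD_KWH + HDD_COEF * heating_degree_days(mean_temp_f)
    if day.weekday() >= 5 or day in HOLIDAYS:  # Saturday/Sunday or holiday
        usage *= WEEKEND_FACTOR
    return usage
```

Comparing these daily predictions against the actual meter readings is what made the error and change-over-time graphics possible.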
Early development versions ran only in a local environment with R, the required packages, and the underlying energy usage data installed. That's no way to treat an end user!
So we moved the application onto Amazon Web Services to make it accessible to anyone with an internet connection.
Given the relatively low traffic, we could host this inexpensively on a single virtual server and regularly back up the data to AWS S3 (Simple Storage Service).
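A scheduled backup like this can be as small as one crontab entry on the server; the bucket name and paths below are hypothetical stand-ins for the real ones.

```shell
# Example crontab entry (bucket and paths are hypothetical).
# Nightly at 02:30, sync the app's data directory to S3.
30 2 * * * aws s3 sync /srv/energy-app/data s3://example-energy-backups/data --quiet
```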
As we iterated forward, we decided there were some key improvements that would make this more helpful to the users:
- Automating data retrieval: The utility company had developed a REST API, and after some good honest wrestling with OAuth tokens, we wrote a background Python script to gather data several times daily and integrate it into the main application.
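The ingestion side of that background script boils down to two steps: fetch recent readings, then fold them into the existing data without duplicating days. A minimal sketch follows; the endpoint URL, token handling, and record field names are assumptions for illustration, not the utility's actual API.

```python
import json
import urllib.request

API_URL = "https://api.example-utility.com/v1/usage"  # hypothetical endpoint

def fetch_readings(token):
    """Pull recent daily readings from the utility's REST API (sketch)."""
    req = urllib.request.Request(API_URL,
                                 headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # e.g. [{"date": "2016-01-02", "kwh": 147.0}, ...]

def merge_readings(existing, new):
    """Integrate new readings, letting fresh values overwrite duplicates."""
    by_date = {r["date"]: r for r in existing}
    by_date.update({r["date"]: r for r in new})
    return sorted(by_date.values(), key=lambda r: r["date"])
```

Keying on the date makes the merge idempotent, so running the script several times a day never produces duplicate rows.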
- Annotation: Visualizing changes in graphs is helpful, but what if you see several changes, and can’t remember what happened before each of them? We allowed users to attach notes to dates (for example, noting when a solar panel system was installed). Now they could quickly tell if changes in energy consumption happened when they expected!
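The annotation feature amounts to notes keyed by date, queried over the user's selected range. A sketch of that idea, with an in-memory store standing in for whatever the app actually persisted:

```python
from datetime import date

# Notes keyed by date (the in-memory storage here is illustrative).
annotations = {}

def add_note(day, text):
    """Attach a note to a date, e.g. 'solar panel system installed'."""
    annotations.setdefault(day, []).append(text)

def notes_between(start, end):
    """Return (date, note) pairs within an inclusive date range."""
    return [(d, n) for d in sorted(annotations) if start <= d <= end
            for n in annotations[d]]
```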
- Data update alerts: We noted a few issues in the utility API, one being an occasional freeze that kept us from accessing current data. With a few basic inputs and a cron job, we were able to alert anyone who wanted to know when the age of the most recent data reached a user-defined threshold, so they could call it to the utility company’s attention.
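The staleness check behind those alerts is a one-line comparison that the cron job can run on each pass, notifying subscribers when it returns true; the function name and signature are our own for this sketch.

```python
from datetime import datetime, timedelta

def data_is_stale(latest_timestamp, now, threshold_hours):
    """True when the newest reading is older than the user's threshold."""
    return now - latest_timestamp > timedelta(hours=threshold_hours)
```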
This was several years ago, when Shiny was just becoming ready for production applications. That might have seemed like a risk to our client, but instead they focused on several advantages of Shiny and R:
- Rapid prototyping: Shiny enabled them to have a prototype ready to demonstrate in a matter of weeks.
- Quick, easy modeling and graphing: the client needed straightforward modeling and graphing of structured, tabular data. R handled both well, which made Shiny an asset in turn.
In addition, we learned that Amazon Web Services becomes a more powerful tool when you combine multiple services. We went beyond a simple server instance by backing up data to S3 and using SES (Simple Email Service) to send data notifications. It was a great experience to build a system whose pieces worked together so naturally.