This Gist outlines a basic extension of the Raspi-Sump project. It takes the data/output from the project's logfile and feeds it to the Telegraf agent (running locally on the Pi), which in turn feeds it to an InfluxDB time-series database, which is then graphed using Grafana. All of these tools are freely available; setting them up is beyond the scope of this Gist.
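For reference, the Telegraf side of a pipeline like this could be wired up roughly as below. This is only a sketch of one way to do it; the script path, database name, and interval are my assumptions rather than values taken from the actual Gist files.

```toml
# Hypothetical Telegraf config sketch (paths and names are assumptions)

[agent]
  interval = "60s"                       # poll once a minute

[[inputs.exec]]
  commands = ["/home/pi/raspisump-telegraf.sh"]   # hypothetical collection script (see below)
  timeout = "10s"
  data_format = "influx"                 # the script prints InfluxDB line protocol

[[outputs.influxdb]]
  urls = ["http://localhost:8086"]       # local InfluxDB instance
  database = "raspisump"                 # hypothetical database name
```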
I wanted to quickly capture what I did for future reference, and to benefit others who may have been equally inspired by Al's project.
I've enhanced the bash script to compare the previous and current water levels in an effort to detect run cycles. If the water level has dropped by at least 3 inches since the last run, we write an additional entry to Telegraf, which gives us the ability to put some fancier data into Grafana (a sketch of the idea follows below). Yes, I'm making some presumptions here, but collecting data every minute makes it a fairly safe bet: we don't know definitively that the pump ran; we're just presuming it did because the water level changed significantly.
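A minimal sketch of that compare-and-emit logic is below. The log path, CSV format, measurement names, and state file are assumptions for illustration, not copies of the actual script in this Gist.

```bash
#!/bin/bash
# Sketch only -- paths, file format, and measurement names are assumptions,
# not taken from the real Gist files.

# Raspi-Sump writes a daily CSV of "HH:MM:SS, <water level>" readings (assumed path/format)
LOGFILE="/home/pi/raspi-sump/csv/waterlevel-$(date +%Y%m%d).csv"
STATEFILE="/tmp/last_waterlevel"    # remembers the previous reading between runs

current=$(tail -n 1 "$LOGFILE" | awk -F', ' '{print $2}')
previous=$(cat "$STATEFILE" 2>/dev/null || echo "$current")

# Always emit the current level in InfluxDB line protocol for Telegraf to pick up
echo "waterlevel level=${current}"

# If the level dropped by at least 3 inches since the last reading,
# presume the pump ran and emit an extra "pump cycle" entry
if [ "$(echo "$previous - $current >= 3" | bc -l)" -eq 1 ]; then
    echo "pumpcycle drop=$(echo "$previous - $current" | bc -l)"
fi

echo "$current" > "$STATEFILE"
```

With Telegraf's exec input running something like this every minute (as in the config sketch above), both measurements land in InfluxDB, and the pump-cycle entries can be counted or graphed separately in Grafana.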
Grafana is a web UI for rendering the data pulled from InfluxDB. The image here is just a screenshot; unlike a static Matplotlib chart, the dashboard in my browser has interactive controls (change the time range, refresh rate, and labels; rearrange/resize each panel; set warning levels and alerts; etc.). Grafana will even email me when any given panel reaches a critical level.
In this Gist I've attempted to give you the pieces to copy/paste. There are links in the README for setting up Telegraf/InfluxDB/Grafana in a matter of minutes. Tuning always takes time, but if you're up for an experiment, it's a rewarding one!