This Gist outlines a basic extension of the Raspi-Sump project. It takes the data from the project's logfile and feeds it to the Telegraf agent (running locally on the Pi), which in turn writes it to an InfluxDB time-series database, where it is graphed with Grafana. All of these tools are freely available; their setup is beyond the scope of this Gist.
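To give an idea of the Telegraf side, here is a minimal config sketch. The file paths, measurement name, and CSV column layout are assumptions for illustration, not Raspi-Sump defaults; adjust them to match your install.

```toml
# Tail the Raspi-Sump CSV logfile and parse each reading.
# Paths and column names below are assumptions -- check your own logfile.
[[inputs.tail]]
  files = ["/home/pi/raspi-sump/csv/waterlevel-*.csv"]
  data_format = "csv"
  csv_column_names = ["time", "water_level"]
  name_override = "sump"

# Forward parsed readings to a local InfluxDB instance.
[[outputs.influxdb]]
  urls = ["http://127.0.0.1:8086"]
  database = "sump"
```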
I wanted to quickly capture what I did for future reference, and to benefit others who may have been equally inspired by Al's project.
I've enhanced the bash script to compare the previous and current water levels in an effort to detect run cycles. If the water level has dropped by at least 3 inches since the last reading, we write an additional entry to Telegraf, which lets us put some fancier data into Grafana. Yes, I'm making some presumptions here, but collecting data every minute makes it a fairly safe bet. We don't know definitively that the pump ran; we're presuming it did because the water level changed significantly.
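A minimal sketch of that comparison logic, assuming the Raspi-Sump CSV format of `HH:MM:SS,level` and a hypothetical `emit_metrics` helper name; the output is InfluxDB line protocol suitable for Telegraf's `exec` input:

```shell
#!/bin/bash
# Sketch only: emit_metrics and the 3-inch threshold mirror the idea
# described above, not the exact script. Requires bc for float math.
emit_metrics() {
    local prev=$1 curr=$2 threshold=3

    # Always report the current water level.
    echo "sump water_level=${curr}"

    # If the level dropped by at least $threshold inches since the
    # last reading, presume the pump ran and record a run cycle.
    local drop
    drop=$(echo "$prev - $curr" | bc)
    if [ "$(echo "$drop >= $threshold" | bc)" -eq 1 ]; then
        echo "sump pump_run=1i,drop=${drop}"
    fi
}

# Example usage: feed in the last two readings from today's logfile
# (path is an assumption -- adjust to your Raspi-Sump install):
#   LOGFILE="/home/pi/raspi-sump/csv/waterlevel-$(date +%Y%m%d).csv"
#   prev=$(tail -n 2 "$LOGFILE" | head -n 1 | cut -d, -f2)
#   curr=$(tail -n 1 "$LOGFILE" | cut -d, -f2)
#   emit_metrics "$prev" "$curr"
```

Run from cron (or Telegraf's `exec` input) on the same one-minute interval as the data collection, so "since the last run" lines up with a single reading.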
Correct, there is no reason at all to use the bundled matplotlib graphing I have included with Raspi-Sump. You can use any approach you like to graph your readings.
There are many other ways; I just wanted to use a pure Python approach as part of the application rather than relying on external services whose terms of service can change.
EDIT:
It's the same reason I don't use services like Twilio for notifications when a Python library like smtplib can handle it. :-)