We use openHistorian to store data calculated by a custom adapter. Unlike phasor data (for example, 50 values per second for frequency, voltage, current, etc.), these data are not regularly distributed over time: we may have 10 successive points over a range of 5 seconds, then no points for 10 minutes, then only 3… To display these data in Grafana without the default sampling mechanism (chart width / time range) omitting some of them, we use the Interval(0, [expression]) function to request them, and all the data is displayed fine. However, the values sometimes fluctuate significantly, which over 'significant' time periods can produce graphs that are difficult to read.
Does Grafana's openHistorian adapter provide a moving average function that would make the curves 'smoother'?
This function seems to exist for some other data sources, but I did not find it on the page gsf/GrafanaFunctions.md at master · GridProtectionAlliance/gsf · GitHub.
I do not think that function exists. The Interval function will downsample values to a target interval; however, the graphing/query engine also automatically downsamples to reduce the number of values going to the screen.
Not sure if this will help, but if you use Interval(0, query_expression), this will show "all" values in the database with no downsampling. Do use this with caution, i.e., only for very short query ranges, because of the magnitude of data that will be returned.
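To illustrate the difference between the two behaviors described above, here is a minimal sketch (not the actual openHistorian implementation, just an assumed interval-based thinning) showing how a target interval reduces irregular samples, while an interval of 0 passes every stored value through:

```python
from datetime import datetime, timedelta

def downsample(points, interval_seconds):
    """Keep at most one point per target interval; interval 0 keeps everything."""
    if interval_seconds <= 0:
        return list(points)  # no downsampling: all stored values pass through
    result = []
    next_allowed = None
    for ts, value in points:  # points assumed sorted by timestamp
        if next_allowed is None or ts >= next_allowed:
            result.append((ts, value))
            next_allowed = ts + timedelta(seconds=interval_seconds)
    return result

# Irregularly spaced samples: 4 points in 3 seconds, then a 10-minute gap.
base = datetime(2024, 1, 1)
points = [(base + timedelta(seconds=s), v)
          for s, v in [(0, 1.0), (1, 1.2), (2, 0.9), (3, 1.1), (600, 2.0)]]

print(len(downsample(points, 0)))  # 5 -> every stored value is returned
print(len(downsample(points, 2)))  # 3 -> thinned to one point per 2 s
```

This is why Interval(0, …) preserves sparse bursts that a fixed-width resampling pass would otherwise drop or merge.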
We are actively reworking the Grafana datasource for the openHistorian; we will look into adding a moving average interpolation function.
Thank you very much for your answer.
I am well aware that the Interval function does not meet the need for a moving average. I already use this Interval(0, …) function to query all the data calculated by an openPDC custom adapter (which is then stored in openHistorian) over a given time range. As noted, these data are not uniformly distributed over time, and if I do not use Interval(0, calculated_data_query) in my queries, then some of these values (most, in fact) are not displayed in Grafana.
Because I have to retrieve all of these values in Grafana, it happens that over a significant time range the graphs appear in a 'rough' form that is sometimes difficult to interpret at first glance. A 'moving average' function would make the display more 'readable' and 'smooth' by reducing the number of points displayed. For example, where my query INTERVAL(0, 'data_query') would return 30,000 data points, applying a moving average function MOVINGAVERAGE(3, INTERVAL(0, 'data_query')) would lower this to 10,000 points (the parameter '3' specifying that only the average of each group of 3 points should be kept for the graph). Another option would be a temporal parameter, such as MOVINGAVERAGE(10s, INTERVAL(0, 'data_query')), which would specify that we want the average of each 10-second slice of time.
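The two variants described above (count-based and time-window-based averaging) could be sketched as follows. This is only an illustration of the requested semantics, not an existing openHistorian function; the function names are hypothetical:

```python
from datetime import timedelta

def moving_average_by_count(points, n):
    """Average consecutive groups of n points, e.g. MOVINGAVERAGE(3, ...)."""
    out = []
    for i in range(0, len(points), n):
        chunk = points[i:i + n]
        ts = chunk[len(chunk) // 2][0]  # representative (middle) timestamp
        out.append((ts, sum(v for _, v in chunk) / len(chunk)))
    return out

def moving_average_by_window(points, window_seconds):
    """Average all points falling in each time slice, e.g. MOVINGAVERAGE(10s, ...)."""
    out, bucket, bucket_end = [], [], None
    for ts, value in points:  # points assumed sorted by timestamp
        if bucket_end is None:
            bucket_end = ts + timedelta(seconds=window_seconds)
        elif ts >= bucket_end:
            out.append((bucket[0][0], sum(v for _, v in bucket) / len(bucket)))
            bucket, bucket_end = [], ts + timedelta(seconds=window_seconds)
        bucket.append((ts, value))
    if bucket:  # flush the last partial slice
        out.append((bucket[0][0], sum(v for _, v in bucket) / len(bucket)))
    return out
```

With the count-based variant, 30,000 raw points and a parameter of 3 would indeed yield 10,000 plotted points; the window-based variant instead yields one point per occupied time slice, so gaps in irregular data produce no artificial points.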
I fully understand that reworking the openHistorian datasource/plugins/visualizations for Grafana (probably due to Grafana 10's breaking changes) represents a very significant workload. This is not an imperative need for my activity, but I thought that this kind of function would complement the already very extensive list of those available, and might also be useful to others.
On the subject of Grafana 10, I use the following plugins/visualizations, whose update for this new version 10 would be appreciated:
- OH Data Download
- OpenHistorian Alarm Panel
I do not use the SCADAvis Synoptic Panel at the moment, but I plan to do so in the more or less long term.
I wish you a great day,