I’m building out a Stat panel in Grafana and was wondering if there is a way to optimize large queries. Currently, the panel is querying 100+ binary points to establish an overall status for the location I’m looking at. Is there a better way to handle this amount of data?
It depends a little bit on what you are trying to achieve.
If you are using the latest datasource, it will downsample the data by default.
If you are only interested in looking at summaries (like Max, Total, etc.) you can use the provided GSF functions. Those are evaluated in the openHistorian/openPDC, which reduces the amount of data Grafana has to handle.
Similarly, if you mostly care about a summary of the device, you can combine data points using a similar approach (the SLICE and SET versions of the GSF functions), which may further reduce the number of tags being sent back to Grafana.
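For illustration, a query against the openHistorian datasource could look something like the sketch below (the filter expression, signal type, and slice tolerance are hypothetical placeholders — adjust them to match your own point tags):

```
# SET version: collapses all matched points into a single series,
# so Grafana receives one summarized value per interval instead of 100+ series
SetMaximum(FILTER ActiveMeasurements WHERE SignalType = 'DIGI')

# SLICE version: groups values into time slices (tolerance in seconds)
# and returns one maximum per slice across the matched points
SliceMaximum(0.033, FILTER ActiveMeasurements WHERE SignalType = 'DIGI')
```

Either way the aggregation happens server-side, so only the summarized series crosses the wire to the Stat panel.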
Thanks for the reply! I am currently using a SetMaximum function to summarize the points. It is good to know that the function is evaluated on the openHistorian side. Are there any other steps that can be taken?