Hello everyone, I need your advice and guidance.
The scope of what the head of my department has asked of me is the following:
Connect a PMU and receive alarms when the df/dt and frequency values fall below or rise above specific thresholds, so that values are only registered while an incident is occurring, but also capture the data from right before the alarm is triggered.
So I started working on the project with openPDC, but since some historical data might be needed, I switched to using solely openHistorian, backed by a local PostgreSQL server, with the plan to later migrate the database to the corporate SQL Server.
I have stumbled upon a couple of issues during my efforts:
- openHistorian’s archiving is filling up the VM server’s disk (20 GB used so far), which is probably because we have set the resolution to a high rate (200 samples/s), and since it is a historian it keeps all the daily data.
- I can’t seem to make use of the PostgreSQL database in the Grafana UI, because I don’t know which table I should query for those two values along with the alarm status in order to display them on my Grafana dashboard. Please note that I do get graphical representations of these on the dashboard, but that panel uses an “OHDATA” database as its source.
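For context on the first issue, here is the back-of-envelope math I used to sanity-check the disk growth. This is only a rough sketch: the ~12 bytes per stored sample is my assumption (timestamp plus value, before any compression), and the signal counts are hypothetical, so actual archive sizes will differ.

```python
# Rough estimate of archive growth at a fixed reporting rate.
# Assumption: ~12 bytes per stored sample (8-byte timestamp + 4-byte value),
# ignoring compression and per-block overhead.

SAMPLES_PER_SEC = 200       # the reporting rate we configured
BYTES_PER_SAMPLE = 12       # assumed raw size per archived sample
SECONDS_PER_DAY = 86_400

def daily_bytes(signals: int) -> int:
    """Uncompressed bytes archived per day for `signals` measurements."""
    return SAMPLES_PER_SEC * SECONDS_PER_DAY * BYTES_PER_SAMPLE * signals

print(f"{daily_bytes(1) / 1e6:.0f} MB/day per signal")    # ~207 MB/day
print(f"{daily_bytes(10) / 1e9:.1f} GB/day for 10 signals")  # ~2.1 GB/day
```

At those rates, a PMU streaming on the order of ten measurements would plausibly fill 20 GB in about ten days, which matches what I am seeing.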
Is openPDC or some other Grid Protection Alliance product more suitable for this scope? Could it be that openHistorian is writing to two databases, and that’s why I have such large archive files?
Any suggestions, advice, or feedback are welcome.