To enrich the F/df, U, and I data from my PMUs, I use openHistorian to compute the V+, V-, V0 and I+, I-, I0 sequence values for my three phases via the custom action adapter BulkSequenceCalculator.
I left its default settings except for the frame rate, which I forced to 50.
Once this adapter is initialized for the first time, it creates a record in the Measurement table for each calculated value: IA Zero Sequence Current Phase Angle, IA Zero Sequence Current Magnitude, and so on.
I manage to retrieve these values in Grafana and display them in graphs, so the values are indeed being archived in the .d2 and .d2i files.
However, I encounter the following problem:
- When the openHistorian service is stopped and restarted, the measurements created in the Measurement table to store the BULK adapter's data are automatically deleted. I then have to click the adapter's initialization button again to recreate them, but the recreated measurements do not have the same IDs as the previous ones: Grafana can then no longer display the earlier data, since openHistorian no longer has the references matching the values recorded in the database.
Is there a way to make these measurements permanent in the Measurement table, even after stopping/restarting the service?
FYI: I’m using openHistorian version 2.8.52, but I seem to have observed the same issue with the latest release.
Thanks in advance for any help you can provide.
I cannot think of a reason why the measurements would be deleted by the adapter itself. However, for tags that are not part of the original metadata set, STTP will remove them unless they are marked as CALC. For example, the bulk sequence calculator (if running in openHistorian) creates “new” local calculated tags marked as phasor types, e.g., IPHM, associated with the source devices. This is OK, except that at the next STTP metadata sync the system will recognize that these tags do not exist in the source metadata set and remove them: STTP attempts to make local and remote metadata match.
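To see whether this is what is happening, you can inspect the signal type assigned to the calculated tags. This is a minimal sketch assuming the standard GSF/openHistorian configuration schema (Measurement and SignalType tables); the LIKE pattern is a placeholder to adjust to your tag naming:

```sql
-- Sketch: list the signal types assigned to the bulk-calculated tags.
-- Tags whose Acronym is a phasor type (e.g., IPHM) rather than CALC are
-- candidates for removal at the next STTP metadata synchronization.
SELECT m.PointTag, st.Acronym AS SignalType
FROM Measurement m
JOIN SignalType st ON st.ID = m.SignalTypeID
WHERE m.PointTag LIKE '%SEQ%';  -- placeholder pattern, adjust to your naming
```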
Are you feeding data from openPDC via STTP/GEP into openHistorian? If so, you can set the ForceCalcSignalType property to True in the BulkSequenceCalculator settings so that the sequence calculation tags will be marked as CALC and not be automatically removed.
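For illustration, this flag goes into the adapter's connection string in the openHistorian Manager; a minimal sketch, assuming the only other setting you changed is the frame rate (keep your existing settings as they are):

```
framesPerSecond=50; forceCalcSignalType=true
```

Connection-string keys are not case-sensitive, so `ForceCalcSignalType=True` works equally well.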
What I would generally recommend in this scenario would be to do the calculation “upstream”, closer to the source input data - this way the calculations become part of the original source data.
That said, I realize that this is not always possible because of resource constraints since the sequence calculations can get expensive.
Thanks for your quick reply.
Yes, that’s exactly it: my main PDC sends data to openHistorian via STTP. I use this protocol because it provides data gap recovery (which works well on small data gaps, less well on gaps exceeding several minutes).
I calculate the V+ / V- / V0 … (as well as the powers) on the openHistorian side because all the computing capacity of my main PDC is reserved for running the WSU OMS oscillation detection modules, which are very CPU-intensive. (On that subject, I saw that we might soon have an adapter for detecting oscillatory events directly in openPDC?)
So I can confirm that after activating the ForceCalcSignalType=True option in the BULK adapter, the measurements are indeed persistent even after restarting the service.
Yes, they should not get deleted when the sequence tags are marked as CALC - in this scenario, STTP will ignore local CALC tags during synchronization.
One thing to consider in the future, assuming you have the hardware budget, would be a scenario like this:
Thanks for the suggestion.
I didn’t know it was possible to send two separate STTP streams to the same historian (two separate subscriptions).
So far I have only managed to set up an internal subscription (create internal subscription) between the main PDC (which is also the one doing the OMS calculations) and the historian. I have tried setting up subscriptions using certificates for TLS, but I can’t get them to work because of unrecognized-certificate issues on one side or the other.
There should be a topic on the method for making these subscriptions work, because for the moment I have given up on using TLS in production, even though it is a significant prerequisite for the overall security of the infrastructure.
I was also planning to open a topic on the subject, because I can’t work it out despite the discussions I have read here and there…
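For what it's worth, in the GSF-based subscribers the TLS settings are normally carried in the subscription connection string. The sketch below is from memory and should be verified against the documentation for your version; the certificate file names are placeholders:

```
securityMode=TLS; localCertificate=subscriber.cer; remoteCertificate=publisher.cer; validPolicyErrors=None; validChainFlags=NoFlag
```

The `remoteCertificate` on each side must match the certificate the other side actually presents; a mismatch there is the usual source of "unrecognized certificate" failures.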