I am new to openPDC, but I wondered if there is a way to easily output a user-definable delayed output stream of C37.118 synchrophasor data to another PDC (i.e., user-definable time shifting of the data)?
I would say the easiest way would be to create an output stream with a large lag time and turn off preemptive publishing. However, there are a few problems with this approach, since the system relies on RAM to hold real-time data in the data frame buffer. First, the delayed stream won’t start right away; it has to wait for real-time data to accumulate in the buffer, which is also a problem after restarts and configuration changes. Second, there are hardware limitations, since you have significantly less space in RAM than you would on your hard drive. Third, this approach could severely hurt the performance of the entire system because of the way the .NET garbage collector works.
I’m guessing the first issue would be a deal-breaker for that approach. Ideally, you could set the system up to retrieve delay-stream data from the archive. However, that is typically done through temporal sessions, and temporal sessions are generally used for on-demand data streams over fixed time ranges, not for ongoing data streams like this one. Unfortunately, I don’t think it’s going to be possible with a temporal session without some changes to the source code.
That brings me to my final suggestion: custom code. Theoretically, you could design an adapter to query temporal data, buffer it, and send it into a C37.118 concentrator using source code and/or objects from GSF.TimeSeries.Transport.DataPublisher and the GSF.PhasorProtocols namespace. In answer to your original question, I wouldn’t really call this an easy solution, but it seems doable.
We have a client wanting to buffer as much as 24 hours of data. I would assume that would be far too much for internal RAM. Am I understanding your last point correctly that the best option may be to stream the data down to a database, where it could then be read back via custom scripting and streamed back out to a secondary PDC using a selected offset from the clock? This would allow you to “play back” the queried data on an output stream with whatever delay you select. Am I understanding you correctly?
Well, 24 hours of data at 30 frames per second would be about 2.6 million frames that you’d have to store in memory. That works out to roughly 2.6 MB of memory for every byte in a frame; in other words, if a single frame of data takes 100 bytes of memory, the buffer would take about 260 MB. That may be an acceptable amount, but it does depend on the size of your frames.
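As a quick sanity check, the arithmetic above can be reproduced in a few lines (the 100-byte frame size is an assumption; real frame sizes vary with PMU count and phasor format):

```python
# Back-of-the-envelope memory estimate for a 24-hour delayed-stream buffer.
delay_hours = 24
frames_per_second = 30
bytes_per_frame = 100  # hypothetical; depends on your configuration

frames_buffered = delay_hours * 3600 * frames_per_second
buffer_bytes = frames_buffered * bytes_per_frame

print(f"{frames_buffered:,} frames")          # 2,592,000 frames
print(f"{buffer_bytes / 1e6:.0f} MB")         # ~259 MB (the ~260 MB figure above)
```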
Although it’s hard for me to say how this approach would affect system performance, I should think that the biggest problem would be that the system would need to buffer 24 hours of real-time data before the delayed stream can start. In other words, the stream wouldn’t start until 24 hours after the last time you restarted the service or initialized the output stream.
In answer to your question, it does sound like you are understanding me correctly. The openPDC already has the capability of streaming data from a historic archive, but it isn’t designed for a continuous delayed stream. So the custom script would really just need to periodically query historic data and then provide that buffer to a C37.118 concentrator.
Thank you for your insights. I’ll head down this path and see where it leads! Appreciate the help.
If you use the openHistorian as a “buffer” where your real-time data flows into the openHistorian, you can create a temporal subscription from a stand-alone openPDC where
startTimeConstraint=*-1D, i.e., now minus one day, and
endTimeConstraint=2030-00-00, i.e., some far-fetched future time.
Then, so long as the playback speed is set as close to the input speed as possible (e.g., 30 frames per second), you should maintain a delayed playback. There may be some drift since replay is based on a timer, but the drift should be slow.
FYI - all the data flows in this setup between the openHistorian and the openPDC assume use of the GEP protocol, which supports both real-time and historical data transfers.
Also, if you already have an archive with the openPDC, you can simply use another openPDC with a temporal subscription to the first without having to install openHistorian.
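To make the setup above concrete, the temporal constraints would go into the subscription’s connection-string settings, along the lines of the following sketch. Only startTimeConstraint and endTimeConstraint are taken from the description above; the processingInterval setting (milliseconds between published frames, 33 ms approximating 30 fps) is an assumption you should verify against the GSF subscriber documentation before relying on it.

```
; Hypothetical temporal-subscription connection string fragment --
; verify parameter names against the GSF/GEP subscriber documentation.
startTimeConstraint=*-1D; endTimeConstraint=2030-01-01; processingInterval=33
```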
Thanks for the great responses, really appreciate it! I’ll be giving this a try.