Hello!
We are planning to ingest data into OpenHistorian from a significant number of PMUs (72+). I need to estimate the disk space this new data will occupy in the ‘.d2’ archive format so that I can size the required storage correctly.
Is there a spreadsheet that can estimate this volume based on:
- The number of PMUs involved (here 72)
- The number of phasors per PMU to be stored, plus frequency and df/dt (here: V and I for phases A, B, C, plus V+)
- The number of analog and/or digital values: P, Q, S, V+, V-, V0, I+, I-, I0, etc. (here: all of those I mentioned)
- The frame rate (here: 100 fps)
- The desired retention period (ideally 366 days)
- Any other relevant variables/constants for the calculation that have not come to mind yet (I have sketched my own rough attempt below)
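In the meantime, here is a minimal back-of-envelope sketch (in Python) of the calculation I have in mind. The 10 bytes-per-point figure is an assumption on my part: it is the fixed point size of the old openPDC ‘.d’ format, and since the ‘.d2’ format compresses data, I am treating it as a conservative upper bound. Please correct any figure that is wrong.

```python
# Hypothetical back-of-envelope sizing; every figure marked ASSUMPTION
# should be checked against real data before trusting the result.

PMUS = 72                # number of PMUs
PHASORS_PER_PMU = 7      # V and I for phases A, B, C, plus V+ (per the list above)
MEAS_PER_PHASOR = 2      # magnitude + angle
FREQ_DFDT = 2            # frequency + df/dt
ANALOGS_PER_PMU = 9      # P, Q, S, V+, V-, V0, I+, I-, I0
FRAME_RATE = 100         # frames per second
RETENTION_DAYS = 366

# ASSUMPTION: ~10 bytes per archived point, the fixed point size of the
# old openPDC '.d' format; '.d2' compresses, so treat as an upper bound.
BYTES_PER_POINT = 10

points_per_pmu = PHASORS_PER_PMU * MEAS_PER_PHASOR + FREQ_DFDT + ANALOGS_PER_PMU
points_per_second = PMUS * points_per_pmu * FRAME_RATE
bytes_per_second = points_per_second * BYTES_PER_POINT

gib_per_day = bytes_per_second * 86_400 / 2**30
tib_total = gib_per_day * RETENTION_DAYS / 1024

print(f"points per second: {points_per_second:,}")
print(f"avg write rate:    {bytes_per_second / 2**20:.1f} MiB/s (upper bound)")
print(f"per day:           {gib_per_day:.0f} GiB")
print(f"{RETENTION_DAYS}-day total:    {tib_total:.1f} TiB")
```

With these numbers I land at roughly 145 GiB/day and about 52 TiB for 366 days before compression, which is exactly why I would like to confirm the real per-point cost for the ‘.d2’ format.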
I had found this type of spreadsheet some time ago, but I suspect that it was for the ‘.d’ archive format of OpenPDC, which I no longer use.
I also wonder at what point a significant volume of real-time data being archived means we need to worry about the read/write throughput of the storage medium, since we are using NAS network shares for this purpose rather than local NVMe-type drives. Is it possible to estimate the required read and, especially, sustained write rates from this data volume? (My own rough check follows below.)
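To make that second question concrete, here is the same arithmetic turned into a rough throughput check. The headroom factor is purely a guess on my part, meant to account for staging-file roll-overs and concurrent reads, and is not a documented openHistorian figure; the 110 MiB/s line is simply the practical ceiling of a single gigabit Ethernet link.

```python
# Rough throughput check for the NAS link, continuing the numbers above.
AVG_WRITE_B_S = 1_800_000   # uncompressed upper bound from the previous sketch

# ASSUMPTION: headroom factor for staging-file roll-overs and concurrent
# reads; purely a guess on my part, not a documented openHistorian figure.
HEADROOM = 10

GIGABIT_MIB_S = 110         # practical ceiling of a single 1 GbE link

avg_mib_s = AVG_WRITE_B_S / 2**20
peak_mib_s = avg_mib_s * HEADROOM

print(f"average write: {avg_mib_s:.1f} MiB/s, assumed peak: {peak_mib_s:.0f} MiB/s")
print(f"1 GbE margin at assumed peak: {GIGABIT_MIB_S / peak_mib_s:.1f}x")
```

If that is roughly right, even the assumed peak sits well below a single gigabit link, so I suspect NAS latency matters more than raw bandwidth here, but I would welcome confirmation on how bursty the historian's writes actually are.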
Thank you very much in advance for your attention to this question.
Regards!