OpenHistorian archive size estimate calculator

Hello!
We are considering ingesting data into OpenHistorian from a significant number of PMUs (72+). I need to estimate the volume this new data will occupy in the ‘.d2’ archive format so I can correctly size the required disk space.

Does a spreadsheet exist that can estimate this volume based on the following? (A rough sketch of the arithmetic appears after this list.)

  • The number of PMUs involved (here 72)
  • The number of phasors per PMU to be stored, plus frequency and df/dt (here, V and I for the A, B, and C phases, plus V+)
  • The number of analog and/or digital values: P, Q, S, V+, V-, V0, I+, I-, I0, etc. (here, all of those I mentioned)
  • The frame rate (here, 100 fps)
  • The desired retention period (ideally 366 days)
  • Any other relevant variables or constants for the calculation that have not come to mind at the moment
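
For what it's worth, here is a minimal back-of-the-envelope sketch of the calculation in Python. The per-PMU point tally is my own rough reading of the lists above (phasors are typically archived as two measurements each, magnitude and angle), and the bytes-per-measurement figure is the ~2.8-byte average that comes up later in this thread; all of these are assumptions to adjust, not values from an official calculator.

```python
# Back-of-the-envelope openHistorian .d2 archive size estimate.
# Every figure below is an assumption taken from the question; adjust freely.

PMUS = 72                     # number of PMUs
FPS = 100                     # frames per second
RETENTION_DAYS = 366          # desired retention period
BYTES_PER_MEASUREMENT = 2.8   # assumed average archived size per value

# Rough per-PMU point tally (my reading of the lists above):
#   7 phasors (VA, VB, VC, V+; IA, IB, IC), two points each
#   (magnitude + angle)                          = 14 points
#   frequency + df/dt                            =  2 points
#   analogs (P, Q, S, V-, V0, I+, I-, I0, ...)   ~  9 points
POINTS_PER_PMU = 14 + 2 + 9   # = 25; adjust to the actual configuration

points_per_second = PMUS * POINTS_PER_PMU * FPS
total_seconds = RETENTION_DAYS * 86_400
total_bytes = points_per_second * total_seconds * BYTES_PER_MEASUREMENT

print(f"{points_per_second:,} measurements/s")                    # 180,000
print(f"~{total_bytes / 1e12:.1f} TB over {RETENTION_DAYS} days")  # ~15.9 TB
```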

I found a spreadsheet of this type some time ago, but I suspect it was for the ‘.d’ archive format of OpenPDC, which I no longer use.

I also wonder at what point a significant volume of real-time data to archive means we need to consider the read/write capabilities of the storage medium, since we are using NAS network drives for this purpose rather than local NVMe-type drives. Is it possible to estimate the required read and, especially, write rates from this data volume?
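
On the throughput side, here is a rough conversion of the same assumed figures into a sustained write rate (again a sketch under those assumptions, not a measured number):

```python
# Sustained write-rate estimate from the same assumed figures as above.
PMUS, POINTS_PER_PMU, FPS = 72, 25, 100
BYTES_PER_MEASUREMENT = 2.8   # assumed average, discussed later in this thread

write_rate = PMUS * POINTS_PER_PMU * FPS * BYTES_PER_MEASUREMENT  # bytes/s
print(f"~{write_rate / 1e6:.2f} MB/s sustained")                  # ~0.50 MB/s
```

Note this is only an average: archive writes tend to land in bursts as files fill and roll over, so burst throughput and latency on the NAS matter as much as the sustained figure.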

Thank you very much for your attention to this message.

Regards!

See if this one helps:

https://gridprotectionalliance.org/NightlyBuilds/openHistorian/Temp/OH2.0SizeEstimates.zip

Thanks!
Ritchie

Hello Ritchie,
Thank you very much: this helps a lot!
Could I ask where the value of 2.8 bytes per measurement comes from? Is it an average I can adapt, or is it a scientifically defined value?
(I’ve compared it to the one year of data I have and found that 2.45 suits my case better.)

Regards,
Stephane

The number is just a conservative average, chosen to err on the side of overestimating.

Thanks!
Ritchie
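
Since that constant enters the estimate as a simple linear multiplier, a measured value scales the result directly; for example, using the 2.45 bytes/measurement figure reported above:

```python
# The size estimate scales linearly with the bytes-per-measurement constant.
conservative, measured = 2.8, 2.45
print(f"scale factor = {measured / conservative:.3f}")  # 0.875
# i.e. a measured 2.45 B/measurement trims the 2.8-based estimate by ~12.5%.
```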