Frame rate far lower than defined

Hi all,

I have run into a problem with the openECA frame rate.

Using the Client Web Tool, I generated the default ‘Hello World!’ C# project just to test the frame rate. The ‘Estimated mean frame rate’ is around 20 frames/sec, as shown in the following figure, even though I set it to 30 frames/second.

  • (a) What could be the possible reason(s) for this difference between the defined frame rate and the actual one?

  • (b) Is there any way I could create data streams with a frame rate of exactly 30, or at least around 29?

Thanks a lot!

Best,
Chen

Are you using test data or real data? The test data that comes with the system is not very reliable in terms of delivery of data - so I wouldn’t put too much importance on that statistic unless real data is flowing through your analytic…

Hi Ritchie,

Thanks for replying!

I am using data I generated myself. I created the Measurements and used a CSV adapter to integrate my own data, which was generated by PSSE and should be in ideal condition.

I have tried changing the frame rate of the device, but even when I change it to 50 frames/second, nothing changes in the statistics.

Thanks,
Chen

Are you using the CSV adapter that comes installed with the openECA or did you build your own?

Hi Ritchie,

I built the CSV adapter myself; I don’t think I have seen a CSV adapter that comes installed with openECA. When I went to the Adapters -> Input page after installing openECA, there was only one adapter, PPAREADER. I think that one is for the Historian data set, right?

Thanks,
Chen

openECA comes with a CSV adapter project, and the CSV input adapter can be configured through the Adapters > Inputs page. Here is a screenshot.

Did you configure openECA to use that adapter, or did you use a different approach to get the CSV data into your project? The estimated mean frame rate is determined by counting the number of frames published by the concentrator, so if the value is lower than the defined frame rate, it means the concentrator actually received fewer frames from the data source than it expected. In other words, the estimated mean frame rate depends on how the input adapter delivers data.
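As a back-of-the-envelope illustration (this is just arithmetic, not openECA’s internal code, and the counts are hypothetical), the statistic is essentially frames actually published divided by elapsed time:

```python
# Sketch (not openECA internals): the estimated mean frame rate is
# frames actually published over elapsed time, so any frames the
# concentrator never receives pull the estimate below the defined rate.
defined_rate = 30          # frames/sec the concentrator is configured for
elapsed_seconds = 60       # hypothetical observation window
frames_published = 1200    # hypothetical count of frames actually published

estimated_mean = frames_published / elapsed_seconds
expected = defined_rate * elapsed_seconds

print(estimated_mean)               # 20.0 frames/sec
print(frames_published / expected)  # roughly 0.67 of the defined rate
```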

Hi Stephen,

Thanks!

I used the CSV adapter you showed in the picture; I just wrote the connection string to point to my own .CSV file where the data was saved.

And thanks for the explanation of the estimated mean frame rate! I have a much clearer idea of it now. I am wondering what the reason(s) could be that the number of frames actually received from the data source is less than expected?

I talked to my colleagues about it, and they showed me their frame rates. Theirs seem normal, around 95% of the defined rate. We used different data, but still, the actual frame rate on my computer is only 67% of the defined one.

Thanks a lot!

Chen

Hi Chen,

Were you able to figure this out? Here’s some more information.

The timer interval can be adjusted using the inputInterval connection string parameter. The default of 33.333333 means it will wait at least 33.333333 milliseconds between CSV file reads (in other words, 30 reads per second). The actual timing mechanism can be controlled using the useHighResolutionInputTimer connection string parameter. You can try changing this setting to see if it has any effect, but both timing mechanisms can be affected by CPU load, so it may not matter.

In short, make sure inputInterval is set to the default 33.333333 milliseconds, try changing useHighResolutionInputTimer, and make sure your CPU isn’t working too hard.
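For reference, a CSV input adapter connection string using those parameters might look something like the fragment below (the file path and the autoRepeat setting are illustrative assumptions, not values from this thread):

```
fileName=C:\Data\MyPsseExport.csv; inputInterval=33.333333; useHighResolutionInputTimer=true; autoRepeat=true
```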

Thanks,
Stephen

Hi Stephen,

Thanks a lot! I will try this and provide an update later on.

Best,
Chen

Hi Stephen,

I have tried setting the connection string parameters you mentioned. The frame rate has improved considerably, from 20 frames/sec to 27 frames/sec. Thank you very much!

Best,
Chen

I have a similar problem, so I think it is better to post it under this topic.

The Frames Per Second setting in the openPDC Manager has been changed to 50 (from the default 30), but according to the PMU Connection Tester, the actual frame rate is around 30.

Since the transmission from openPDC to the Tester is over loopback (127.0.0.1), packet loss is unlikely, and Wireshark captures support this. Even with the Lag Time and Lead Time in openPDC set to 0.1, I still receive data frames, so I assume the data-frame latency is low enough that frames are not being filtered out.

I have also tried adding the connection string parameters, but nothing seems to have changed.

My question is: are there any other configurations I should check? What could be the potential reasons?

In the Connection Tester, the device shows only 30 frames per second as input.

The openPDC will not artificially inject frames where no source data exists.

When you select an output frame rate of 50 and the input source is 30, it tries to best match timestamps of 30fps inputs into 50fps buckets - but where no frame data exists, no frame will be published.
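To make the bucket-matching concrete, here is a small sketch (my own illustration, not openPDC code) of why a 30 fps source can fill at most 30 of the 50 output slots each second:

```python
# Sketch (not openPDC code): sort one second of 30 fps input timestamps
# into 50 fps output buckets by nearest timestamp, the way a concentrator
# aligns frames. Buckets that receive no source frame publish nothing.
FPS_IN, FPS_OUT = 30, 50

inputs = [i / FPS_IN for i in range(FPS_IN)]    # 30 input frame timestamps
buckets = {round(t * FPS_OUT) for t in inputs}  # nearest 20 ms output slot

published = len(buckets)       # output frames that actually receive data
empty = FPS_OUT - published    # slots with no source data -> no frame
print(published, empty)        # 30 20
```

Each input frame lands in its own bucket, so at most 30 of the 50 slots publish a frame; the other 20 have no source data and are skipped, which matches the ~30 fps seen in the Connection Tester.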

Thanks,
Ritchie

Assuming the source device is actually transmitting at 50 fps, you may need to use the Initialize button to apply the change you made to the output stream’s configuration.