I’m guessing the measurements don’t make it to the custom output adapter because the openHistorian doesn’t have any metadata associated with the data points entering the system via GEP. Your simplest option would be to manually define the metadata in openHistorian and then use matching signal IDs when generating measurement keys for the data publisher.
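For the manual option, the key detail is that every measurement you publish must carry the same signal ID (GUID) you entered when defining the metadata in openHistorian. A rough sketch of what that looks like on the publisher side, assuming GSF's `MeasurementKey.LookUpOrCreate(Guid, string, uint)` overload (check the overloads available in your GSF version), with a placeholder GUID:

```csharp
using System;
using GSF.TimeSeries;

// The same GUID you entered as the measurement's Signal ID in the
// openHistorian metadata (the value here is a hypothetical placeholder)
Guid signalID = new Guid("11111111-2222-3333-4444-555555555555");

// Tie the publisher-side measurement key to that signal ID; measurements
// generated with this key will match the metadata openHistorian already has
MeasurementKey key = MeasurementKey.LookUpOrCreate(signalID, "PPA", 1);
```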
Alternatively, you can attempt to go through the steps to automate the exchange of metadata between the application and openHistorian. I suspect that the process of getting that working would be prohibitively complicated, but here’s a rough outline of what you’d need to do:
- Organize the metadata in your application into devices and measurements.
- Assign each of your application’s devices and measurements a GUID that will never change. These need to be static metadata fields associated with those objects, because openHistorian will use them to match remote objects to local objects when synchronizing configuration.
- Create a subclass of DataPublisher. In this class, override the AcquireMetadata function to provide a DataSet with the configuration from your application. Try to match the schema of the result set defined by the DefaultMetadataTables expression, which is defined in DataPublisher.
- In your application, use an instance of your DataPublisher subclass instead of using the DataPublisher class directly.
- Check for error messages in the openHistorian console to troubleshoot issues with the metadata synchronization.
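Put together, the subclass from the outline above might look roughly like the following. This is a sketch, not working code: the exact signature of the overridden method and the required table schemas should be verified against the GSF source, and BuildDeviceTable/BuildMeasurementTable are hypothetical helpers that would map your application's objects into rows.

```csharp
using System.Data;
using GSF.TimeSeries.Transport;

public class ApplicationDataPublisher : DataPublisher
{
    // Called when a subscriber (openHistorian) requests a metadata refresh;
    // returns a DataSet whose tables match the DefaultMetadataTables schema
    protected override DataSet AcquireMetadata()
    {
        DataSet metadata = new DataSet("Metadata");

        // Hypothetical helpers: convert your application's devices and
        // measurements into rows matching the expected table schemas
        metadata.Tables.Add(BuildDeviceTable());       // "DeviceDetail"
        metadata.Tables.Add(BuildMeasurementTable());  // "MeasurementDetail"

        return metadata;
    }
}
```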
Here’s the definition of the DefaultMetadataTables expression mentioned above.
public const string DefaultMetadataTables =
"SELECT NodeID, UniqueID, OriginalSource, IsConcentrator, Acronym, Name, AccessID, ParentAcronym, ProtocolName, FramesPerSecond, CompanyAcronym, VendorAcronym, VendorDeviceName, Longitude, Latitude, InterconnectionName, ContactList, Enabled, UpdatedOn FROM DeviceDetail WHERE IsConcentrator = 0;" +
"SELECT DeviceAcronym, ID, SignalID, PointTag, SignalReference, SignalAcronym, PhasorSourceIndex, Description, Internal, Enabled, UpdatedOn FROM MeasurementDetail;" +
"SELECT ID, DeviceAcronym, Label, Type, Phase, DestinationPhasorID, SourceIndex, BaseKV, UpdatedOn FROM PhasorDetail;" +
"SELECT VersionNumber FROM SchemaVersion";
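To illustrate what matching this schema means for the MeasurementDetail query, here's a sketch that builds a table with the same columns and one hypothetical row. The column types are best guesses inferred from the column names; verify them against the openHistorian database schema before relying on this.

```csharp
using System;
using System.Data;

DataTable measurementDetail = new DataTable("MeasurementDetail");

// Columns mirror the MeasurementDetail SELECT in DefaultMetadataTables
measurementDetail.Columns.Add("DeviceAcronym", typeof(string));
measurementDetail.Columns.Add("ID", typeof(string));  // measurement key, e.g. "PPA:1"
measurementDetail.Columns.Add("SignalID", typeof(Guid));
measurementDetail.Columns.Add("PointTag", typeof(string));
measurementDetail.Columns.Add("SignalReference", typeof(string));
measurementDetail.Columns.Add("SignalAcronym", typeof(string));
measurementDetail.Columns.Add("PhasorSourceIndex", typeof(int));
measurementDetail.Columns.Add("Description", typeof(string));
measurementDetail.Columns.Add("Internal", typeof(bool));
measurementDetail.Columns.Add("Enabled", typeof(bool));
measurementDetail.Columns.Add("UpdatedOn", typeof(DateTime));

// One hypothetical row; the SignalID GUID must stay fixed across restarts
// so openHistorian can keep matching this measurement to its local copy
measurementDetail.Rows.Add("MYDEVICE", "PPA:1",
    new Guid("11111111-2222-3333-4444-555555555555"),
    "MYDEVICE:FREQ", "MYDEVICE-FQ", "FREQ", DBNull.Value,
    "Frequency from my application", false, true, DateTime.UtcNow);
```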
As for your question about performance, it really depends on the characteristics of the data and the protocols supported by your application. GEP was designed as a low-latency, high-throughput pub/sub streaming protocol, and stress testing indicates that it scales very well, so we consider it widely applicable in terms of performance. However, your application’s protocol may be better suited to your data or to the performance requirements of what you’re trying to accomplish.
Both solutions will use an input adapter to receive measurements and pass them to the routing engine, so there is no difference in terms of what the openHistorian has to do once the measurements arrive in the system.