Multiple Subscriptions over Internet

#1

Hi,
This is related to the previous thread “Subscription over internet”, but it’s a different question, so I made a new topic.

For a requested demo, we have a VM set up in our company domain with SIEGate and openHistorian installed on it. We have some PMUs sending data to this openHistorian, and SIEGate has an internal subscription to openHistorian. That’s the local setup.

We also have SIEGate and openHistorian installed on a VM on Azure Cloud Services. This SIEGate is subscribed to the local VM’s SIEGate using the Generate Authorization Request/Authorize Subscription procedure with certificates, and the Azure openHistorian has an internal subscription to the Azure SIEGate. That’s the cloud setup.

So far everything works, and all of this operates on port 6172. Data is moving up into the cloud and we can view the history with no issues. However, we can only view the data in Azure if we log onto the Azure VM, so we wanted to stream the data back to the local VM using SIEGate so we can view it locally for an upcoming demo.

The problem arises when I set up the local SIEGate to subscribe to the Azure SIEGate using the Generate Authorization Request/Authorize Subscription procedure, so that people in the corporation can view phasor history using a local openHistorian website. This is the same VM the data was originally sent from in the first paragraph. I know it sounds odd sending data back onto itself, but it’s being done this way just for demo purposes.

I’ve done the Subscriber Measurement Access and Measurements Subscription process, but no input devices show up on the input devices screen; only the concentrator device is showing.

No errors show up in the log, and the subscribing machine (the local VM) shows:

[5/31/2019 2:55:23 PM] [AZURE.SG] Attempting command channel connection to publisher...
[5/31/2019 2:55:23 PM] [AZURE.SG] Connection established.
[5/31/2019 2:55:23 PM] [AZURE.SG] Data subscriber command channel connection to publisher was established.
[5/31/2019 2:55:23 PM] [AZURE.SG] Success code received in response to server command "MetaDataRefresh": latest meta-data received.
[5/31/2019 2:55:23 PM] [AZURE.SG] Received a total of 132 records spanning 4 tables of meta-data that was uncompressed and deserialized in 3 milliseconds...
[5/31/2019 2:55:23 PM] [AZURE.SG] Meta-data synchronization is 20.0% complete...
[5/31/2019 2:55:23 PM] [AZURE.SG] Meta-data synchronization is 40.0% complete...
[5/31/2019 2:55:23 PM] [AZURE.SG] Meta-data synchronization is 60.0% complete...
[5/31/2019 2:55:23 PM] [AZURE.SG] Meta-data synchronization is 80.0% complete...
[5/31/2019 2:55:23 PM] [AZURE.SG] Meta-data synchronization completed successfully in 13 milliseconds

It’s acting as if I didn’t do the Subscriber Measurement Access procedure, but I have.

I’m wondering if it’s because the port cannot be used for multiple subscriptions, or because the software doesn’t like you routing phasor data back onto itself?

Any ideas why I can’t receive data back from the cloud?


#2

It sounds like you’re using the same openHistorian instance to receive data directly from the PMUs and also from the Azure SIEGate. The problem is that a unique identifier is generated for every PMU, and it cannot be duplicated in the database. The metadata synchronization process is probably detecting that the remote PMUs have the same unique IDs as the ones already in the database and is therefore not importing the metadata.

Signals also have a similar unique identifier that would cause the same problem when the openHistorian subscriber tries to synchronize metadata. You will need to set up another system or service to receive the data from the cloud.
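To illustrate the idea, here is a minimal sketch (not the actual openHistorian code; the names `Device` and `sync_metadata` are made up for this example) of how a metadata import that keys on globally unique IDs ends up silently skipping devices that loop back to their origin:

```python
# Hypothetical sketch: metadata sync skips records whose unique ID
# already exists locally, so a PMU routed back to its own database
# never appears as a new input device.
from dataclasses import dataclass

@dataclass(frozen=True)
class Device:
    unique_id: str   # GUID assigned when the PMU was first configured
    acronym: str

def sync_metadata(local_db: dict, incoming: list) -> list:
    """Import incoming device metadata, skipping duplicate unique IDs."""
    imported = []
    for device in incoming:
        if device.unique_id in local_db:
            # Same GUID already in the database: the record is treated
            # as already known, so no new input device is created.
            continue
        local_db[device.unique_id] = device
        imported.append(device)
    return imported

# The local database already holds the PMU that originally sent the data,
# so the same device arriving back from the Azure SIEGate is skipped.
local_db = {"guid-pmu-1": Device("guid-pmu-1", "PMU1")}
incoming = [Device("guid-pmu-1", "PMU1"), Device("guid-pmu-2", "PMU2")]
print([d.acronym for d in sync_metadata(local_db, incoming)])  # only PMU2 imported
```

This also matches what the log shows: the metadata refresh itself succeeds (132 records received and deserialized), but the duplicate records produce no new input devices.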


#3

Thanks, Stephen.
That makes sense. Thanks for explaining how that works.
