Can OpenHistorian Read data from another OpenHistorian Instance?

A debug build of openHistorian.exe will not run as a service unless you add the -RunAsService flag. It’s designed to run as an application by default to simplify debugging with Visual Studio.
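
For example, during development you would typically just launch the debug build directly from a console (or press F5 in Visual Studio) and it will run as an interactive application; the path below assumes the default installation folder and is only illustrative:

    "C:\Program Files\openHistorian\openHistorian.exe"

If you do need a debug build to run under the Windows Service Control Manager, the -RunAsService flag mentioned above would be supplied on the service’s command line (for example, as part of the registered service path) rather than typed at an interactive console.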

Hi Stephen,

Thanks for your information.

I’m getting the error below in the status log and am not able to see the page in the openHistorian Web Manager.

[1/28/2020 1:23:24 PM] Failed to pre-compile razor template “GSF.Web.Security.Views.Login.cshtml”: Errors while compiling a Template.
Please try the following to solve the situation:

Error 1:

Application Domain: openHistorian.exe
Assembly Codebase: C:/Program Files/openHistorian/openHistorian.exe
Assembly Full Name: openHistorian, Version=2.5.183.0, Culture=neutral, PublicKeyToken=null
Assembly Version: 2.5.183.0
Assembly Build Date: 1/28/2020 11:40:41 AM
.Net Runtime Version: 4.0.30319.42000

Exception Source: RazorEngine
Exception Type: RazorEngine.Templating.TemplateCompilationException
Exception Message: Errors while compiling a Template.

  • If the problem is about missing/invalid references or multiple defines either try to load
    the missing references manually (in the compiling appdomain!) or
    Specify your references manually by providing your own IReferenceResolver implementation.
    See https://antaris.github.io/RazorEngine/ReferenceResolver.html for details.
    Currently all references have to be available as files!
  • If you get ‘class’ does not contain a definition for ‘member’:
    try another modelType (for example ‘null’ to make the model dynamic).
    NOTE: You CANNOT use typeof(dynamic) to make the model dynamic!
    Or try to use static instead of anonymous/dynamic types.
    More details about the error:
  • error: (22, 21) The type or namespace name ‘Ajax’ does not exist in the namespace ‘Microsoft’ (are you missing an assembly reference?)
    Temporary files of the compilation can be found in (please delete the folder): C:\Users\LoganathanM\AppData\Local\Temp\RazorEngine_ojoqed35.m04
    The template we tried to compile is:

Error 2:
[1/28/2020 3:43:50 PM] [TLS!DATAPUBLISHER] Data publisher encountered an exception while connecting client to the command channel: Unable to authenticate connection to client [::ffff:10.10.151.141]: No matching certificate found in the list of trusted certificates.

I’m getting the error below while launching the web page.

This localhost page can’t be found. No web page was found for the web address: http://localhost:8180/@GSF/Web/Security/Views/Login.cshtml?redir=Lw%3D%3D

Could you please explain the cause of both errors?
Are there any settings I need to change, or am I missing anything?

If there are errors compiling the Login.cshtml template, then you certainly would not be able to get to the web interface. There should be more details about Error 1 in the error log that would help determine the cause.

Error 2 is indicating that you created a subscription to the TLS data publisher, but the data publisher has not been properly configured to allow access for that subscriber. If you are using an internal subscription, you need to change the port either to 6175 for GEP or 7175 for STTP. If you actually are creating a TLS subscription, you will need to import the subscription request (.SRQ file) on the publisher system via the Actions > Data Publisher Configuration > Authorize Subscribers screen.
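
For a rough sketch of how those options differ in the subscriber’s connection string, using a placeholder host name (the real connection strings carry many more parameters, as shown later in this thread, and the screens normally generate them for you):

    server=publisherHost:6175; securityMode=None    (internal GEP subscription over a trusted channel)
    server=publisherHost:7175                       (internal STTP subscription)
    server=publisherHost:6177; securityMode=TLS     (TLS subscription, requires the .SRQ authorization above)

The securityMode values shown here are assumptions based on the TLS example that appears later in this thread.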

Hi Stephen,

I keep getting the error below, even though I placed the subscriber and publisher certificates in the certs\remotes folder on each machine.

“Data subscriber encountered an exception while attempting command channel publisher connection: Unable to authenticate connection to server: No matching certificate found in the list of trusted certificates.”

Could you please explain the cause of this error?

I followed the steps below:

  1. On the subscriber machine (S), I created the subscription request (.srq) file.
  2. On the publisher machine (P), I imported the .srq file, authorized the subscriber, and saved the subscriber details as per the procedure.
  3. On the publisher machine, a certificate was created while authorizing the subscriber and stored under the /certs/remotes folder.
  4. On the subscriber machine, no certificate was created under the publisher’s name, so I copied the certificate from the publisher machine and put it under the /certs/remotes folder of the subscriber machine.
  5. I imported the certificates on both machines into the Trusted Root Certification Authorities store using MMC.

Result:
On the publisher machine, no error is logged, but the status log says the subscriber connection is closed.
On the subscriber machine: “Data subscriber encountered an exception while attempting command channel publisher connection: Unable to authenticate connection to server: No matching certificate found in the list of trusted certificates.”

I’m not sure where to put the certificate.
Do I need to create a self-signed certificate using makecert or IIS tools?

The documents we have, including those you provided, do not describe the certificate part clearly.

Please help.

Thanks & Regards
Logu

Hi Logu,

First of all, let me clarify that my recommendation was to use an internal subscription if the communications channel was trusted. An internal subscription does not require a certificate exchange, so the material I provided would not have explained what to do. The process for setting up a secure connection using TLS can be significantly more complicated. The basic procedure for a unidirectional data flow is as follows.

  1. Locate openHistorian.cer in the openHistorian installation folder on the publisher system.
  2. Transfer openHistorian.cer to the subscriber system.
  3. Use the Create Authorization Request page to configure the subscriber. When you create the request, make sure to click the Advanced... button and use the Import CER... button to import the publisher’s certificate into the trusted certificate store on the subscriber system. Also make sure to check the Self-signed? checkbox.
  4. After generating the subscription request (the .SRQ file), transfer that file to the publisher system and then use the Authorize Subscribers page to configure the publisher. The first thing you should do is import the SRQ file. You can edit the information from there. Make sure to check the Self-signed? checkbox.
  5. Go back to the subscriber and enable the device that represents the subscription.
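
To illustrate where the certificate files typically end up after these steps, assuming the default installation folder and placeholder file names:

    Publisher system:
        C:\Program Files\openHistorian\openHistorian.cer                  (the publisher’s own certificate from step 1)
        C:\Program Files\openHistorian\Certs\Remotes\SubscriberName.cer   (created/imported when the .SRQ is authorized in step 4)

    Subscriber system:
        C:\Program Files\openHistorian\openHistorian.cer                  (the subscriber’s own certificate)
        C:\Program Files\openHistorian\Certs\Remotes\PublisherName.cer    (the publisher’s certificate imported via Import CER... in step 3)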

For more detailed information, you can refer to the following discussion.
https://gridbits.gridprotectionalliance.org/t/subscriptions-using-tls/445

Thanks,
Stephen

Hi Stephen,

Thanks for your help. I’m able to subscribe to the publisher after importing the certificate.

Please clarify the points below with respect to the TLS publisher/subscription.

Query 1: Measurement details are not appearing on the subscriber machine under Inputs > Subscription Based Inputs > Measurement Subscriptions.

We have the following environment:

  1. We have 3 machines (A, B and C).
  2. Machine A is connected with synchrophasors & a PDC.
  3. Machines B & C are running the PDC (openHistorian) only.
  4. I enabled the TLS publisher between A & B and between A & C.
  5. A is successfully connected with B and C; stream statistics are good.
  6. On A, subscriber measurement access is configured under the Actions > Data Publisher Configuration menu.
  7. In B, I’m able to see all the measurements for which access is enabled from A.
  8. In C, I’m not able to see the measurements for which access is enabled from A.
  9. Between A & B and A & C, streaming statistics are fine and showing green, but measurements are not appearing between A & C.

Could you please explain if I need to do anything else?

Query 2:

To perform TLS publishing between PDCs, we are using the default certificate “openHistorian.cer”, which comes with the setup.

In the future, we need to use our own certificate received from a certificate authority.
What is the procedure for this? Could you please explain?

Thanks & Regards
Logu

Query 1: Please ensure that you have configured measurement access for subscriber C on publisher A by selecting the appropriate option in the combo box on the Measurement Access screen.

Query 2: The procedure is pretty simple. Install the certificate into the certificate store, adjust permissions to give openHistorian access to the private key, use Windows tools to export the .cer file, then override the appropriate setting in openHistorian configuration to tell the system to use that .cer file to identify the local certificate. Note that GEP/STTP uses mutual authentication, so every system should have its own certificate whether subscriber or publisher. Please refer to the link in my previous post for information about the configuration settings that need to be overridden in order to use your own certificates. Here is the link again.

https://gridbits.gridprotectionalliance.org/t/subscriptions-using-tls/445
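
As a very rough sketch of the override step: the TLS data publisher is pointed at your exported .cer file instead of the default openHistorian.cer, and each subscriber likewise references its own local certificate. The setting name below is an assumption for illustration only; the linked post lists the actual settings and categories to override for your version:

    <!-- openHistorian.exe.config (hypothetical setting name; confirm against the linked discussion) -->
    <categorizedSettings>
      <tlsdatapublisher>
        <add name="CertificateFile" value="C:\Certs\MyCompanyIssued.cer" />
      </tlsdatapublisher>
    </categorizedSettings>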

Hi Stephen,

Please clarify the following.

I’m using TLS publishing and trying it in the environment below.

3 systems running openHistorian (call them A, B and C):

  1. ‘A’ is connected with a PMU and running openHistorian.
  2. ‘B’ and ‘C’ are both running openHistorian only.
  3. Here ‘B’ is subscribed to ‘A’.
  4. ‘C’ is subscribed to ‘B’.

Result:

“A” is publishing all measurement details (PPA) to “B” through the Gateway Exchange Protocol (GEP) by default.
But B is not publishing any information to C, even though it is connected successfully. The expectation is that we should see A’s measurement details in C through B.

I observed that B is publishing A’s details to ‘C’ through the gateway protocol by default.
Also, by default it is exposing all measurements through STAT, not through PPA.

Could you please clarify where I’m going wrong and why I’m not able to see A’s measurements in ‘C’ through ‘B’?

If you need any other information, please let me know.

Thanks & Regards
Logu

Ah, if C is connected to B then it makes more sense. Your issue has to do with internal and external data flows. The short answer is that you need to tweak the connection string on B’s subscriber to make sure internal=true is applied. For details, please refer to the following post.
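
For a rough illustration using placeholder host names, the tweak is just an extra key appended to the existing connection string of B’s subscription to A; everything else stays as generated:

    server=machineA:6177; securityMode=TLS; ...existing parameters...; internal=true

With internal=true, B treats the measurements it receives from A as its own internal data, so B’s data publisher will in turn make them available to C.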

Dear Stephen,

Thanks for your great information. It is working fine.

I have a few more queries; please kindly clarify.
In the current scenario, all devices (A, B, C) are in the same network (within the corporate network).

Now suppose C is a subscriber to B, but C is located on another network (public or cloud).
B is located within the corporate network.

We need to forward the data to C through B.
Currently, we are using internal forwarding (an internal subscription) since all machines are in the same network.
How would you configure this when C is in another network?

Thanks & Regards
Logu

For data coming from another network, you should probably be using receiveInternalMetadata=true; receiveExternalMetadata=false; internal=false. This will bring in the remote network’s internal data, but your subscriber will consider it to be external data within your network.
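
For example, the connection string of C’s subscription to B would carry these keys in addition to the usual parameters (placeholder host name; compare the full TLS example given later in this thread):

    server=machineB:6177; securityMode=TLS; receiveInternalMetadata=true; receiveExternalMetadata=false; internal=false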

In general, you should do what seems appropriate to maintain the necessary data flows. Internal/external is really just a tool to help simplify the configuration of data flows for a typical synchrophasor architecture. If it makes sense to mark another network’s data as internal because you’re defining a unidirectional data flow in order to forward that data along to other networks, then it’s perfectly reasonable to do so. If you have very complicated data flows that can’t be managed with just internal/external, then you have other options such as marking everything as internal and using subscriber measurement access in the TLS data publisher configuration to filter things down. The important thing is to know how the concept of internal and external works in order to get your data flows right.

Thanks,
Stephen

Hi Stephen,

I’m performing a TLS subscription between an Azure VM running openHistorian and a PC running openHistorian on my office network.

I configured all the required settings between the Azure VM and my PC.
Here my PC is the publisher and the Azure VM is the subscriber.

I’m not able to find any error message in the publisher’s log file, and the subscriber status is showing green.
However, on the Azure VM (subscriber), the stream statistics are gray and the details below are being logged in the statuslog.txt file.

Could you please explain why this is happening?

[2/13/2020 3:55:12 PM] [ABC] Attempting connection to tcp://ABC IPAddress:6177…

[2/13/2020 3:55:12 PM] [ABC] Attempting command channel connection to publisher…

[2/13/2020 3:55:12 PM] [ABC] Connection established.

[2/13/2020 3:55:12 PM] [ABC] Data subscriber command channel connection to publisher was established.

[2/13/2020 3:55:12 PM] [ABC] Failure code received in response to server command “Unsubscribe”: Subscriber not authenticated - Unsubscribe request denied.

[2/13/2020 3:55:12 PM] [ABC] Failure code received in response to server command “MetaDataRefresh”: Subscriber not authenticated - MetaDataRefresh request denied.

Thanks & Regards
Logu

When the subscriber connects, what information does the publisher’s status log provide?

Hello Stephen,

Good Morning

I’m trying to perform a TLS subscription between two machines (a cloud machine and a non-cloud machine). I’m able to perform the subscription successfully; however, I have a query, please clarify.

• Machine-A: Cloud
• Machine-B: Non-cloud, running within the corporate network.

Machine details

Machine A:
  • Running openHistorian version 2.6.x
  • Not connected with a PMU
  • Located in the Azure cloud environment
  • It’s a subscriber to Machine B
  • Firewall is configured to allow all relevant ports.

Machine A’s subscriber configuration:
interface=0.0.0.0; compression=true; autoConnect=true; securityMode=TLS; server=Public IP of Corporatenetwork:6177; remoteCertificate=C:\Program Files\openHistorian\Certs\Remotes\ABC.cer; validPolicyErrors=RemoteCertificateChainErrors, RemoteCertificateNameMismatch; validChainFlags=UntrustedRoot; checkCertificateRevocation=False; receiveInternalMetadata=True; receiveExternalMetadata=True

Machine B:
  • Running openHistorian version 2.6.x
  • Connected with a PMU
  • It’s a publisher to the cloud machine.
  • Running within the corporate network.
  • Subscriber configuration is enabled with “Self-signed” and “Enable PG connection”.
  • The relevant certificate is placed at C:\Program Files\openHistorian\Certs\Remotes\SubscriberName.cer

My Queries:

  1. In the subscriber machine’s server configuration parameter, we have specified the public IP of the corporate network with port 6177, whereas on the local network we give the exact publisher IP address.

Here, if I have 2 publishers that will be subscribed to by one cloud machine, how can I specify each publisher’s details? We have only specified the public IP of the corporate network.

This may be an infrastructure-related question, but if you have any suggestions for this use case, please clarify.

What is the purpose of having two publishers? In other words, is the data redundant between the two publishers or partitioned? Are you looking to set up something like a fail-over connection or a redundant connection?

Hello Stephen,

Please see my requirement below:

I have 2 test instances of openHistorian (running inside the corporate firewall), both connected with PMUs, and I want to subscribe to both PDCs from one cloud VM to monitor them.

E.g.

Machine-A is a cloud VM running openHistorian.

Machine-B is a PC running openHistorian and connected with a PMU.

Machine-C is a PC running openHistorian and connected with a PMU.

Here B and C are inside the corporate firewall.

Now I want to see the time-series data of both Machine-B and Machine-C in the cloud VM (Machine-A).

According to this requirement,

What is the solution in openHistorian?

Is this requirement correct?

If this requirement is not correct, then how can a user see the time-series data of all PMU units through a single PDC?

Moreover, I would like to know the real-time topologies for openHistorian; if you have any documents related to this use case, please help.

Thanks & Regards
Logu

Ah, I see. With both B and C behind the corporate firewall, they share the same public IP address. In that case, you will need to put the publishers on B and C on different ports. This can be accomplished using Network Address Translation (NAT) in the corporate network’s firewall to map an external port (for example, 6178) to an internal endpoint (for example, Machine-C:6177).
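
To make that concrete, with such a NAT rule in place the two subscriptions configured on the cloud machine (A) would differ only in the external port they target; the public IP and certificate paths below are placeholders:

    Subscription to Machine-B:  server=<corporate public IP>:6177; securityMode=TLS; remoteCertificate=C:\Program Files\openHistorian\Certs\Remotes\MachineB.cer; ...
    Subscription to Machine-C:  server=<corporate public IP>:6178; securityMode=TLS; remoteCertificate=C:\Program Files\openHistorian\Certs\Remotes\MachineC.cer; ...

The firewall then forwards external port 6177 to Machine-B:6177 and external port 6178 to Machine-C:6177.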

It shouldn’t be necessary to do this, but you can also change the publisher port in the openHistorian configuration file (openHistorian.exe.config) by stopping the service, locating the /configuration/categorizedSettings/tlsdatapublisher/add[@name=ConfigurationString] setting, changing the port value, then restarting the service.
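
For illustration, the relevant element in openHistorian.exe.config looks roughly like the sketch below; on a real installation the add element carries additional attributes and the configuration string contains additional parameters, all of which should be left untouched, with only the port value changed:

    <configuration>
      <categorizedSettings>
        <tlsdatapublisher>
          <add name="ConfigurationString" value="port=6178" />
        </tlsdatapublisher>
      </categorizedSettings>
    </configuration>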

Thanks,
Stephen

Hi Stephen,

Thanks for your information.

I have some queries about the PDC subscription/publishing concepts in openHistorian; please clarify.

In the openHistorian Manager, we have the following subscription options:

  • Input -> Create Internal Subscription
  • Input -> Create Authorization Request
    • TLS
    • Gateway

Queries:

When should I use the “Input -> Create Internal Subscription” option?

My understanding is that if both the subscriber and publisher are located within the same corporate network, then “Input -> Create Internal Subscription” is suitable. Please correct me if my understanding is wrong.

When should I use the “Input -> Create Authorization Request” option?

My understanding is that this option is used to subscribe to a publisher located anywhere (within the corporate network or on an external network such as the cloud). Please correct me if I’m wrong.

Here,

Option (a), TLS mode, is used with an SSL certificate to perform secure communication.

This is applicable both within the intranet (corporate network) and when connecting from a public network to the corporate network (anywhere).

  • Option (b), Gateway: what is the purpose of this option? How does it differ from “Input -> Create Internal Subscription”, or are they the same?
  • When should I use option (b), Gateway?

Additional Query:
Could you please explain the logic of the subscriber/publisher mechanism in openHistorian?

Especially for TLS publishing

  • The publisher is listening on port 6177.
  • The subscriber connects to PublisherIP:6177.
  • Certificate authentication is performed and the connection gets established.

My query is:

Does all data transmission happen only through TCP port 6177, or are other ports involved?

Please explain all the transaction details, so that I can produce better documentation covering configuration details such as which ports to enable in the firewall, etc.

Please kindly clarify.

Internal subscriptions are intended for trusted network paths in which authentication, encryption, and ACLs are considered to be unnecessary. It merely simplifies the setup of the gateway connection for transfer of data on an internal network or VPN by eliminating the certificate exchange, subscriber authorization, and subscriber measurement access steps of the configuration. Instead, all the requested data is automatically forwarded to any subscriber that can reach the publisher’s internal port (6175), and the subscriber can choose to filter the data set down to what it needs using filter expressions. For example, this type of gateway connection is used extensively by the visualization tools in openHistorian Manager, which does not need to be authorized since it exists on the local system.
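
As a hedged example of such filtering, the requested measurements of an internal subscription can be narrowed with a filter expression along these lines; the outputMeasurements key and the specific expression are illustrative and should be adapted to the signal types and fields present in your metadata:

    outputMeasurements={FILTER ActiveMeasurements WHERE SignalType = 'FREQ' OR SignalType LIKE 'VPH%'}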

The Gateway option on the Inputs > Subscription Based Inputs > Create Authorization Request page should never be used for anything. This is a legacy option that has long been deprecated, and we’ve recently dropped the option from the STTP standard (IEEE P2664) that is based on GEP.

All data is, indeed, transferred over the connection made by the subscriber to publisher port 6177. The only exception to this rule is if you select the options to transfer data over a separate UDP port instead. This option is typically not recommended, as the compression options for UDP are not as good as they are for TCP. However, if you do select this option, data packets will be sent from the publisher to a UDP port on the subscriber system. This is a unidirectional UDP data stream so the publisher expects no UDP return packets.
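
In practical terms, for the default TCP-only setup the publisher side needs just one inbound firewall rule for TCP 6177. The command below is a generic Windows firewall example rather than an openHistorian-specific requirement; adjust the rule name and port to your configuration:

    netsh advfirewall firewall add rule name="openHistorian TLS data publisher" dir=in action=allow protocol=TCP localport=6177

If you opt into the separate UDP data channel, the subscriber side additionally needs an inbound UDP allow rule for whichever port you choose for that channel.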

Thanks,
Stephen

Hello Stephen,

I’ve raised a separate query, “OpenHistorian Archive file reading/writing programmatically”; please kindly help as soon as possible.