The CSV import service is included in the Canary Collectors installer. During installation, watch for the feature-selection screen. If a Complete install was not originally performed, customize the existing install and enable this feature.
Once installed, two config files will need to be created: SAF_ImportService.config and SAF_Import.config.
SAF_ImportService.config
This configuration designates how often, in seconds, the service checks a directory for new CSV files (<Interval>) and which directory to monitor for the import (<InputDirectory>). The service can be set up to monitor multiple directories if need be. SAF_ImportService.config must reside in the same folder as SAF_ImportService.exe, which by default is located in C:\Program Files\Canary Labs\Collectors. If upgrading to or installing v20.2, the install path is C:\Program Files\Canary\Collectors, and the config file already resides with the collector; the parameters will still need to be copied over to the new location. Any changes made to this config file require a restart of the Import service.
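As a sketch of what this file might look like: only the <Interval> and <InputDirectory> elements are named above, so the wrapper elements shown here are illustrative assumptions; check the installed sample file or the configuration PDF for the exact schema.

```xml
<!-- SAF_ImportService.config (illustrative sketch):
     check C:\CSVImport for new CSV files every 60 seconds.
     <Interval> and <InputDirectory> are described in this article;
     the surrounding wrapper elements are assumptions. -->
<Settings>
  <ImportConfig>
    <Interval>60</Interval>
    <InputDirectory>C:\CSVImport</InputDirectory>
  </ImportConfig>
  <!-- Additional <ImportConfig> entries could monitor more directories. -->
</Settings>
```

Remember that the Import service must be restarted after any change to this file.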
SAF_Import.config
The SAF_Import.config is used to match the format of the CSV file being imported into the historian. This file must reside in the same directory in which the CSV files will be placed. The location should also be a public folder; if it is a user folder, the collector will not be able to process the files. There are over 30 parameters to choose from. A list of these parameters and their descriptions can be found in C:\Program Files\Canary Labs\Collectors\Canary SAF Import Service Configuration.pdf and at the bottom of this article.
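As an illustrative sketch: the parameter names below come from the list in this article, but the XML wrapper element, the dataset name, and the zero-based offset numbering are assumptions, so consult the configuration PDF for the authoritative syntax.

```xml
<!-- SAF_Import.config (illustrative sketch) for a CSV whose columns are
     tag name, timestamp, value. Parameter names come from the list in
     this article; the <Settings> wrapper, the dataset name, and the
     zero-based offsets are assumptions. -->
<Settings>
  <InputDelimiter>,</InputDelimiter>
  <HeaderCount>1</HeaderCount>                      <!-- skip one header row -->
  <Historian>localhost</Historian>
  <DataSet>CSVImport</DataSet>                      <!-- hypothetical dataset name -->
  <TagNameOffset>0</TagNameOffset>
  <TimeStampOffset>1</TimeStampOffset>              <!-- date and time in one column -->
  <ValueOffset>2</ValueOffset>
  <KeepProcessedFiles>true</KeepProcessedFiles>
  <ProcessedDirectory>C:\CSVImport\Processed</ProcessedDirectory>
</Settings>
```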
SAF_Import.config Parameters
• InputDelimiter – Character used to parse each line of data. Default value is “,”
• HeaderCount – Number of rows to skip when reading a file to either get to the beginning of the data for row based processing or to the record layout for table based processing.
• SubHeaderCount – The number of rows between the header and the actual data in the file.
• Historian – Name of Historian to route data to. Default value is localhost.
• DataSet – Name of the dataset to route data to in the Historian. DataSet or DataSetOffset is required if the dataset name is not encoded in the tag name being imported.
• DataSetOffset – Column position to be used as the dataset name while building a tag name. DataSet or DataSetOffset is required if the dataset name is not encoded in the tag name being imported.
• BranchOffset – Column position to be used as a branch name while building a tag name.
• BranchPosition – Row/Column offset of the name to be used as a branch name while building a tag name.
• UseFileNameAsBranch – Uses the file name, without its extension, as a branch name in the historian.
• TagOffsets – Used in table based processing to define a comma separated list of column offsets for tag names. Each entry may contain a range such as 2-10.
• TagNameOffset - Column position to be used as a tag name.
• TagNameMatch – (Used in conjunction with TagNameReplace) Allows the user to define a
regular expression in order to parse column headers.
• TagNameReplace – (Used in conjunction with TagNameMatch) Allows the user to define which
capture group to use as the tagname.
• ValueOffset – Column position to be used for the value.
• DateOffset – Column position to be used for the date. Can be used in conjunction with the
TimeOffset parameter if date and time are located in separate columns. Otherwise the
TimeStampOffset parameter would be used when date and time are combined in a column.
• DateFormat – Format of date being imported if a non-standard format is used that .Net can’t
convert automatically.
• TimeOffset - Column position to be used for the time. Can be used in conjunction with the
DateOffset parameter if date and time are located in separate columns. Otherwise the
TimeStampOffset parameter would be used when date and time are combined in a column.
• TimeFormat - Format of time being imported if a non-standard format is used that .Net can’t
convert automatically.
• TimeStampOffset - Column position to be used for the date and time. Can be used if date and
time are located in the same column. Otherwise the DateOffset and TimeOffset parameters
would be used when date and time are in separate columns.
• DateTimeFormat - Format of datetime being imported if a non-standard format is used that .Net
can’t convert automatically.
• FileDate – Allows the user to define the date of the hdb file within the historian.
• StartDatePosition – For file formats that are interval-based this gives the user an initial datetime
to use to calculate the timestamp of each value.
• IntervalOffset – For file formats that are interval-based, this gives the user the delta between each row.
• IntervalOffsetUnits – The units used for the interval between each row.
• QualityOffset - Column position to be used for the quality.
• DataTypeOffset - Column position to be used if the data type is included with the value.
• KeepProcessedFiles – Determines if files are copied to another directory or deleted when
processing is complete.
• ProcessedDirectory – Directory to copy files to when processing is complete if the
KeepProcessedFiles is set to true.
• AutoCreateDataSet – Determines if SAF will create a dataset in the Historian if the dataset doesn't currently exist. Default value is true.
• AllowInsertedData – Determines if SAF will insert out-of-order data or if data must be received in chronological order. Default value is true.
• AutoWriteNoData – Determines if SAF will insert a no data quality for each tag when a session
disconnects. Default value is false.
• NoDataValues – A comma separated list that will cause SAF to insert a no data quality for a
particular TVQ if the value supplied in the file matches one of the values in the list.
• UseSystemCultureForValueParsing – A Boolean value that tells the service to use the machine's culture instead of the invariant culture when parsing values.
• UTCtimeStamp – A Boolean value that indicates whether the timestamps are in UTC. The default value is false, which assumes the timestamps are local time.
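To tie the parameters together, a CSV file laid out for row-based processing might look like this (the tag names, values, and column layout are invented for illustration):

```csv
TagName,TimeStamp,Value,Quality
Pump01.Flow,2024-01-15 00:00:00,12.5,192
Pump01.Flow,2024-01-15 00:01:00,12.7,192
Pump02.Flow,2024-01-15 00:00:00,8.3,192
```

Assuming zero-based offsets (the list above does not state the numbering base), this file would be described by HeaderCount = 1, TagNameOffset = 0, TimeStampOffset = 1, ValueOffset = 2, and QualityOffset = 3. Because the dataset name is not encoded in the tag names, DataSet (or DataSetOffset) would also be required.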