Adding Standard File Data Feeds

Complete the following tasks to add a Standard File Data Feed.

Task 1: Add a data feed

  1. From the menu bar, click Admin menu > Integration > Data Feeds.

  2. Click Add to create a new data feed.
  3. In the General Information section, do the following:
    1.  Enter the name and description of the data feed.

      Note: The alias populates automatically when you set the name of the data feed for the first time and save the data feed. You can modify the alias after you save the data feed. The remaining fields in the General Information section are read-only and populate when the data feed is created, updated, and run.

    2. Select the Active checkbox to make the data feed active.
  4. In the Feed Information section, do the following:
    1. In the Feed Type field, select Standard. To create a Transport Only data feed instead, see Adding Transport Only Data Feeds.
    2. In the Target Application field, select the application or questionnaire into which you want to import the data.
    3. In the Service Account Name field, enter the user account associated with the data feed. If the user does not exist, create a new user and enter that username. For more information on the Service Account Name, see Data Feeds Service Account.
  5. Do one of the following:

    • To continue configuring the data feed, go to the next task.
    • To finish setting up the feed later, click Save or Save and Close.

Task 2: Define the transport method

  1. Go to the Source Connection tab of the data feed.

  2. From the Source Locale list, select a locale.
  3. From the Transport Method list, select a transport type.

  4. In the Transport Configuration section, complete the configuration options.

    • File Type: Select one of the following options.

      • Single Data File: References a single data file. This option requires you to specify a path in the Path field.

        Note: The JSON Iterator only supports the Single Data File type.

      • Manifest File: Points the Data Feed Manager to a file that contains a list of data files to process. This option requires you to specify a path in the Path field.

      • Zip File: References a ZIP file. This option requires you to specify a path in the Path field. Do not include files for Attachment fields.

        Note: All file names within a ZIP file must consist of characters from the code page 437 character set.

    • Path: The path to the external source from which data is imported when you run the data feed.
    • File Filter: Specifies which files in the path the data feed processes. This field can contain a single file filter or a list of file filters separated by semicolons. The data feed only processes data included in the File Filter field.
    • Note: When you set the File Filter for a ZIP file, enter the ZIP file as the first file type in the path. For ZIP files, the data feed searches the parent directory first and then searches the files inside the specified ZIP file, based on the filter criteria.

      Example: Archer.zip;data.csv

    • Encryption Type: This option is only available for ZIP files. Select an Encryption Type from the list and enter a password to complete the encryption. Choose one of the following Encryption Types for your data feed:
      • None
      • WinZip
      • Rijndael / Advanced Encryption Standard (AES)
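The File Filter matching described above (semicolon-separated patterns, with the ZIP file listed first for ZIP sources) can be sketched in Python; the file names and patterns are hypothetical, and the exact pattern syntax the product uses may differ from `fnmatch` globbing.

```python
from fnmatch import fnmatch

def matching_files(file_filter: str, available_files: list[str]) -> list[str]:
    """Return the files that match any of the semicolon-separated filters."""
    patterns = [p.strip() for p in file_filter.split(";") if p.strip()]
    return [f for f in available_files if any(fnmatch(f, p) for p in patterns)]

# For a ZIP source, list the ZIP file first, then the file inside it:
print(matching_files("Archer.zip;data.csv", ["Archer.zip", "data.csv", "readme.txt"]))
# ['Archer.zip', 'data.csv']
```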
  5. (Optional) In the Post Processing - Source file section, determine how the data feed should handle the source data when the integration is complete.

    • Do nothing: Does not alter the source file when the data feed successfully completes. The data feed also deletes any local copy of the source information.

    • Rename: Saves the source file under a new name when the data feed successfully completes. In File Path and Name, specify the new name for the file and the location to save the file.

      If you select this option, use filename tokens to specify the location or name of the file. Filename tokens are available for post processing when you want to save the source information and specify a location or name for the file. When you select the Rename option, you can use tokens to generate unique names for the files automatically.

      The following tokens are available for renaming data files:

      • Now. Inserts a user-defined date format within the new filename. Possible formats include Now(MM/dd/yyyy) or Now(MM-dd-yyyy). See the Microsoft .NET Framework Developer Center for available custom date/time formats.
      • DataFileDirectoryName. Inserts the directory name, including the drive, of your file.
      • DataFileName. Inserts the original filename, excluding the directory name and extension.
      • DataFileExtension. Inserts the file extension, such as .csv, in the new filename.
      • DataFileFullName. Inserts the fully qualified filename, including the drive, directory, filename, and extension of the original file.

      For example, if the data file came from C:\DataFeed\Source\ESL\processed\ThreatData.csv, renaming with tokens produces the following output.

      Example 1

      • Input Tokens: {DataFileDirectoryName}\success\{DataFileName}_{Now(MM.dd.yyyy)}.{DataFileExtension}
      • Output: C:\DataFeed\Source\ESL\processed\success\ThreatData_01.31.2008.csv

      Example 2

      • Input Tokens: \\DFSRepository\{Now(yyyy)}\{Now(MM)}\{DataFileName}_success.{DataFileExtension}
      • Output: \\DFSRepository\2008\01\ThreatData_success.csv

    • Delete: Deletes the source file when the data feed completes successfully. This option is available only for the File and FTP transport methods.
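A short Python sketch can reproduce the filename token expansion shown in the Rename examples above; the .NET-to-strftime date translation covers only the format pieces used here and is an assumption for illustration.

```python
import ntpath
import re
from datetime import datetime

def expand_tokens(template: str, data_file: str, now: datetime) -> str:
    """Expand {DataFile*} and {Now(...)} tokens in a rename template."""
    directory, base = ntpath.split(data_file)   # ntpath handles Windows paths
    name, ext = ntpath.splitext(base)
    replacements = {
        "DataFileDirectoryName": directory,
        "DataFileName": name,
        "DataFileExtension": ext.lstrip("."),
        "DataFileFullName": data_file,
    }
    def sub(match: re.Match) -> str:
        token = match.group(1)
        if token.startswith("Now(") and token.endswith(")"):
            # Translate the .NET-style pieces used in the examples to strftime
            fmt = token[4:-1].replace("MM", "%m").replace("dd", "%d").replace("yyyy", "%Y")
            return now.strftime(fmt)
        return replacements[token]
    return re.sub(r"\{([^{}]+)\}", sub, template)

source = r"C:\DataFeed\Source\ESL\processed\ThreatData.csv"
template = r"{DataFileDirectoryName}\success\{DataFileName}_{Now(MM.dd.yyyy)}.{DataFileExtension}"
print(expand_tokens(template, source, datetime(2008, 1, 31)))
# C:\DataFeed\Source\ESL\processed\success\ThreatData_01.31.2008.csv
```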

  6. (Optional) The data feed creates a local copy of the source data for further processing. In the Post Processing - Local copy of source file section, select from the following options to specify how the data feed handles the local copy of the source data after processing the source data.

    The following options describe how the data feed handles the local copy of the source data.

    • Delete: Deletes the processed source file when the data feed successfully completes. The data feed also deletes any local copy of the source information.

    • Rename: Saves the source file under a new name when the data feed successfully completes. In File Path and Name, specify the new name for the file and the location to save the file.

      To save the data, ensure that the account running the Job Engine service can access the path of the destination file.

      If you select this option, use filename tokens to specify the location or name of the file. The same filename tokens and examples described for post processing the source file in the previous step also apply here.
  7. If you selected the Rename post-processing option, enter the location and name for the new file in the File Path and Name field.

  8. Do one of the following:

    • To continue configuring the data feed, go to the next task.
    • To finish setting up the feed later, click Save or Save and Close.

Task 3: Define the file format of the source data

This task applies only to Standard data feed types, and only if you need to transform the structure of the source file.

Option 1: Define the XML format

  1. Go to the Source Parsing tab of the data feed.
  2. In the Source Format field, select XML.

  3. In the File Definition section, if you want to modify the structure of your source data, either:
    • Enter the XSLT you want to use to transform your source data.
    • Click Load XSLT to use either a pre-written file or your own custom file. For more information on XML formatting guidelines and samples, see XML Formatting Used in Field Results.
  4. Do one of the following:

    • To continue configuring the data feed, go to the next task.
    • To finish setting up the feed later, click Save or Save and Close.

Option 2: Define the JSON format

The File, FTP, HTTP, and JavaScript transporters support the JSON Iterator. JSON processing supports only the Single Data File type.

Requirements

The following requirements apply to using JSON source files:

  • A source file can be either valid JSON or JSON enclosed in XML tags.

The following requirements apply to the XML input:

  • For JSON that is not enclosed in an XML node, add the input inside the <root> node.
  • For JSON that is enclosed in an XML node, encode the input according to the XML specifications, for example, escaping < and > as &lt; and &gt;.

Examples

The following examples show that a source file can be either valid JSON or JSON enclosed in XML tags:

  • Valid JSON

    {"Assets": [ { "Asset": {"Name": "IP Phone","Description": "<my description>","Status": "Active"}}, { "Asset": {"Name": "Laptop","Description": "My Laptop","Status": "Active"}}] }

  • JSON enclosed in XML tags

    <data> {"Assets": [ { "Asset": {"Name": "IP Phone","Description": "&lt;my description&gt;","Status": "Active"}}, { "Asset": {"Name": "Laptop","Description": "My Laptop","Status": "Active"}}] }</data>
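A Python sketch of how either input form could be read (the function name is hypothetical): it first tries plain JSON, then falls back to JSON wrapped in an XML element, letting the XML parser decode entities such as &lt; and &gt; as the requirements above describe.

```python
import json
import xml.etree.ElementTree as ET

def load_feed_json(raw: str):
    """Parse a source file that is either plain JSON or JSON inside XML tags."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        # ElementTree decodes XML entities such as &lt; and &gt; automatically
        return json.loads(ET.fromstring(raw).text)

wrapped = ('<data> {"Assets": [ { "Asset": {"Name": "IP Phone",'
           '"Description": "&lt;my description&gt;","Status": "Active"}}] }</data>')
assets = load_feed_json(wrapped)
print(assets["Assets"][0]["Asset"]["Description"])  # <my description>
```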

Process

  1. Go to the Source Parsing tab of the data feed.
  2. In the Source Format field, select JSON.

    Note: If your source files are JSON, you can only upload a single file.

  3. In the File Definition section, if you want to convert the JSON source data to XML, either:
    • Enter the XSLT you want to use to transform your source data.
    • Click Load XSLT to use a pre-written file.
  4. Do one of the following:

    • To continue configuring the data feed, go to the next task.
    • To finish setting up the feed later, click Save or Save and Close.

Option 3: Define delimited text files

The Delimited Text option indicates the format of your source data and allows you to configure how the feed parses your source data. This option assumes that you have prior knowledge about how to parse data, including delimiters and the escape sequence.

  1. Go to the Source Parsing tab of the data feed.
  2. In the Source Format field, select Delimited Text.
  3. In the File Definition section, select the encoding and delimiters to match the source file.
  4. Skip Record Count indicates the number of lines that the Data Feed Manager ignores in your source data before parsing the data. For example, if the first row in your source data contains column names, enter 1 so that the Data Feed Manager ignores this row and starts reading data at the next row.

  5. Do one of the following:

    • To continue configuring the data feed, go to the next task.
    • To finish setting up the feed later, click Save or Save and Close.
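A minimal Python sketch of the parsing behavior Option 3 describes, assuming a comma delimiter and a Skip Record Count of 1; the sample rows are hypothetical.

```python
import csv
import io

def parse_delimited(text: str, delimiter: str = ",", skip_record_count: int = 1):
    """Skip the header lines, then parse the remaining rows with the delimiter."""
    lines = text.splitlines()[skip_record_count:]
    return list(csv.reader(io.StringIO("\n".join(lines)), delimiter=delimiter))

source = "Name,Status\nIP Phone,Active\nLaptop,Active"
print(parse_delimited(source))  # [['IP Phone', 'Active'], ['Laptop', 'Active']]
```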

Task 4: Configure the source data

Use the Source Definition tab to ensure that only the data you want is imported from the source data. The Source Data tab is available only for Standard data feed types.

Data options

  • Import the source data using its current format into Archer.
  • Convert the source data to a format that matches the requirements of the target application or questionnaire, using advanced options such as lookup translations and calculations.
  • Use the Source Filter tab to only import certain records into the target application or questionnaire.
    • Leave the field values blank to return all records from the source data.
    • Enter field values and use Advanced Operator Logic to only return specific records from the source data.
  • Capture data tokens from the latest run of a data feed to configure the next data feed run.

Process

  1. Go to the Source Definition tab > Source Data tab.

  2. To add a source field, in the Actions column, click Ellipsis and select Add Child.
  3. Enter a Source Field Name.
  4. From the list, select a Field Type.
  5. Select the checkbox to include a token to use for the next data feed run. To configure the token, see Task 9: Define data tokens.

    Note: Only child source fields support tokens.

  6. Click Add New.
  7. Do one of the following:

    • To continue configuring the data feed, go to the next task.
    • To finish setting up the feed later, click Save or Save and Close.

Task 5: Define data filters

Use data filters to limit the number of records retrieved from your source data. If no filters are defined, the Data Feed Manager returns all records. After a filter has been added, only those records meeting the defined criteria are included in the data feed. You can combine your data filters through advanced operator logic to provide additional filters to your data.

Important:

  • Archer-to-Archer data feeds: Do not use this option. It is recommended that you filter the report data instead.

  • Database Query data feeds: It is recommended that you filter the report data from the Source Definition tab by modifying the SQL query set on that tab. For example, add a WHERE clause to the SELECT statement: select * from tblcontent becomes select * from tblcontent where status = 'Active'.

  • Mail Monitor data feeds: It is recommended that you filter the mail by defining filters on the Source Definition tab.

  1. Go to the Source Definition tab > Source Filter tab.

  2. In the Source Fields column, select the source name to which you want to apply a filter.
  3. Note: You can only filter by a source field whose child fields have no subsequent child fields. For example:

    • If Field A has a child Field B, and Field B has a child Field C, you cannot filter by Field A because Field B has a child field.

    • If Field A has a child Field B, and Field B has no child fields, you can filter by Field A.

  4. From the Field Name list, select the field name from your data source to which you want to apply a filter.
  5. From the Operator list, select an operator to define which type of filter you want to apply to the source data.
  6. In the Values column, enter a value based on your selection in the Operator column.
  7. (Optional) In the Advanced operator logic field, enter custom operator logic to form relationships between the individual filters.
  8. Complete either of the following optional tasks:
    • To add an additional data filter, click Add located in the Source Filter section title bar.
    • To remove a data filter, in the Actions column of the filter you want to remove, click Remove.
  9. Do one of the following:

    • To continue configuring the data feed, go to the next task.
    • To finish setting up the feed later, click Save or Save and Close.
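The advanced operator logic in step 7 combines the results of the individual filters. The exact syntax is product-specific, so this Python sketch only illustrates the idea with hypothetical filter numbers and an assumed expression "(1 AND 2) OR 3".

```python
# Suppose three filters were evaluated for one source record:
filter_results = {1: True, 2: False, 3: True}

# Advanced operator logic such as "(1 AND 2) OR 3" decides whether
# the record passes the combined filter:
passes = (filter_results[1] and filter_results[2]) or filter_results[3]
print(passes)  # True
```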

Task 6: Map the source fields to the target fields

  1. Go to the Data Map tab > Field Map tab.

  2. Map the source field to the applicable application or questionnaire. Do one of the following:
    • In the Target Field panel Actions column, click Data feeds map mode to enter Map Mode, and select a source field to map to the target field.

    • In the Source Field panel, select a source field and click Map a New Target Field to enter Map Mode. In the Target Field panel Actions column, click Map source field to map the source field. To map additional target fields, select a source field and click Map Another Target Field.

    • Note: To resize the Source Fields column and display long source field names, drag the left or right scroll bar.

  3. (Optional) To configure the mapped fields, in the Actions column, select Inline edit and complete the settings for the selected field.
  4. (Optional) When you activate the Trust Level option, you can assign a trust level to your source data for a mapped field. Enter a value from 0 to 99 in the Trust Level field of the target field settings. The highest trust level is 0 and the lowest is 99.
  5. Note: A data feed cannot overwrite a field set by a previous feed with a higher trust level. For example, a data feed with a trust level of 75 cannot overwrite a data feed with a trust level of 20.

  6. (Optional) Do one or more of the following:
    • To delete a mapping for a single field, in the Actions column, click Delete.
    • To delete the mappings for all fields, in the Source Field panel, click Ellipsis, and select Clear Mappings.

    Note: Deleting a mapping also deletes any child field mappings.

  7. Do one of the following:

    • To continue configuring the data feed, go to the next task.
    • To finish setting up the feed later, click Save or Save and Close.
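The trust level rule from steps 4 and 5 of this task can be sketched as a simple comparison; whether equal trust levels may overwrite is an assumption here.

```python
def can_overwrite(incoming_trust: int, existing_trust: int) -> bool:
    """Lower numbers mean higher trust: 0 is highest, 99 is lowest.
    Equal trust is assumed to allow the overwrite (an assumption)."""
    return incoming_trust <= existing_trust

print(can_overwrite(75, 20))  # False: trust 75 cannot overwrite trust 20
print(can_overwrite(20, 75))  # True
```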

Task 7: Define key fields

To define key fields, complete the following steps:

  1. Go to the Data Map tab > Key Field Definition tab.

  2. In the Reference Fields section, select the field that requires a key field definition.

    Note: The Reference Fields section contains the target application or questionnaire and any mapped cross-reference, related records, CAST, or sub-form fields that require the creation of a key field definition.

  3. In the Key Field Definitions title bar, click Add.

    Note: You can use the Key Field Definitions section to define the simple keys and the data feed actions during the feed run.

  4. In the Field Name field, select a target application or questionnaire field that uniquely identifies the record.
  5. (Optional) Assign combination keys for the record. Do the following:
    1. In the Actions column, click Add.
    2. In the Field Name field, select a field.
  6. (Optional) In the Actions column, click Add simple key to add simple keys in a hierarchical structure for sub-form field types.

    Note: After setting the order of key fields, the Data Feed Manager scans the data source file for matches to each simple key in the specified order. When any key field is found as a match to a field in the target application, the record is considered matched.

  7. In the Operator column, select the applicable option for the matching criteria for the simple key.
  8. Do one of the following:

    • To continue configuring the data feed, go to the next task.
    • To finish setting up the feed later, click Save or Save and Close.
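The note in step 6 describes first-match semantics: simple keys are checked in the specified order, and the first key that matches identifies the record. A Python sketch with hypothetical field names:

```python
def find_match(source_row: dict, target_records: list[dict], key_fields: list[str]):
    """Scan simple keys in order; the first key that matches identifies the record."""
    for key in key_fields:
        for record in target_records:
            if record.get(key) == source_row.get(key):
                return record
    return None

targets = [{"Asset ID": "A-17", "Serial": "XYZ"}, {"Asset ID": "A-42", "Serial": "QRS"}]
row = {"Asset ID": "A-42", "Serial": "UNKNOWN"}
print(find_match(row, targets, ["Asset ID", "Serial"]))
# {'Asset ID': 'A-42', 'Serial': 'QRS'}
```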

Task 8: Define the data feed schedule

You can set up data feeds to run automatically at regular intervals. This reduces the time and effort required to import data from an external file. You can initiate data feeds at various times and configure them to run in regular increments for an indefinite period of time.

You can also run the data feed immediately.

To prevent excess server load, schedule data feeds on a staggered basis. You can schedule a maximum of 10 data feeds to run at a time. If more than 10 data feeds are scheduled, each remaining data feed runs as the previous one completes.

A reference feed allows you to specify another feed that must complete first: the Data Feed Service starts this feed as soon as the referenced feed completes successfully. A successful data feed run processes all input data, completes all expected record updates, and does not report any failures in the Run Details Report.

  1. Go to the Run Configuration tab > Schedule section.

  2. Do one of the following to schedule your data feed.

    • Run on Schedule: Configures your data feed to run on a defined schedule. The following fields define the schedule:

      • Start Date: Specifies the date on which the data feed schedule begins.
      • Start Time: Specifies the time the data feed starts running.
      • Time Zone: Specifies the time zone in which the data feed schedule begins.
      • Recurring: Specifies the interval at which the data feed runs:
        • Minutely. Runs the data feed at the set interval. For example, if you specify 45 in the Every list, the data feed runs every 45 minutes.
        • Hourly. Runs the data feed at the set interval, for example, every hour (1), every other hour (2), and so forth.
        • Daily. Runs the data feed at the set interval, for example, every day (1), every other day (2), and so forth.
        • Weekly. Runs the data feed on a specified day of the week, for example, every other Monday (2), every third Monday (3), and so forth.
        • Monthly. Runs the data feed in a specified week of the month, for example, on the first Monday of every month, on the second Tuesday of every third month, and so forth.
      • Every: Specifies the interval of the frequency at which the data feed runs.
      • On: Specifies the frequency of the days of the week on which the data feed runs.
      • Weekday: Specifies the days of the week on which the data feed runs.

    • Run After: Runs the current data feed after another specified data feed completes successfully. For example, you can have a Threats data feed run immediately after your Assets data feed finishes. From the Reference Feed dropdown, select the data feed that runs before the current data feed.

    • Run Now: To run the data feed manually, click the Run Now button in the toolbar on the Manage Data Feed page.

  3. Do one of the following:

    • To continue configuring the data feed, go to the next task.
    • To finish setting up the feed later, click Save or Save and Close.

Task 9: Define data tokens

  1. Go to the Run Configuration tab > Tokens section.
  2. (Optional) Click Add to add an additional token.
  3. In the Value field of the token that you want to modify, enter the updated value.
  4. (Optional) In the Actions column, click Remove in the row of the token that you want to remove.
  5. Do one of the following:

    • To continue configuring the data feed, go to the next task.
    • To finish setting up the feed later, click Save or Save and Close.

Task 10: Set rules for archiving and updating records

You can use the update and archive options to update existing records, create new records, or both. In addition, when target records in Archer do not match records in the external data source, you can choose to modify or delete those records. This option is useful if you defer to the external system for the accuracy and currency of your data. By deleting or modifying records in the system that are not in your external data source, you keep the external source and the system synchronized.

  1. Go to the Run Configuration tab > Target Record Processing section.
  2. Select whether to create, update, or ignore records in the target application.
  3. The following options are available:

    • Create a new record, if no matching record is found: Creates new records for data found in the source file but not in the target application or questionnaire.

    • Update existing record, if a matching record is found: Updates records in the target application or questionnaire when a simple key match exists in the source file. If you also select the Delete option for unmatched records and want to retain existing records, make sure to select the Update existing record option.

    • Ignore data validation rules for the target application: Ignores data validation rules when importing source data into required fields in the target application.

      For example, if a values list in the target application requires a minimum of two selections, but the source data row only provides one selection, the source row still imports into the target application.

  4. Select what happens when the data feed does not find a matching record in the target application.
  5. The following options are available:

    • Ignore: Does nothing when a matching record is not found.

    • Delete: Deletes records in the target application or questionnaire when a matching record is not in the source data. If you want to retain existing records, also select the Update existing record option.

      Data feeds match records using simple keys only during the update process. If you want to retain existing records while archiving the target application, also select Update existing record in the Target Record Processing Options. If you do not select the Update existing record option, the data feed permanently deletes all records in the target application.

    • Set a value from a target values list: Sets a value in a Values List field in a record whenever the external data file does not contain a matching record.

      Use this option to set a Values List value that identifies the record as Inactive or Not Current. For example, suppose a Devices application contains a record for a specific laptop, but the external data file has no matching record for that laptop. You can use this option to set a Values List field in the laptop record to the value Inactive.

      When you select this option, you also select the Values List field in the target application or questionnaire and the value that you want to set in that field. This option only sets the specified value in the Values List field; it does not trigger the normal content save behaviors, such as changing the last updated date or triggering calculations and notifications.

      You cannot set the value in the Values List field of the target leveled application under the following conditions:

      • The field is at level 3 or lower in a leveled application.
      • You are modifying the data feed configuration.

      In most scenarios, select this option and flag unmatched records with a specific value rather than deleting them. For example, you can add a field called Status to your application with the values Current and Archived. If a data feed cannot match a system record to a record in the data source, the system record can be updated to have a Status value of Archived.

  6. Do one of the following:

    • To continue configuring the data feed, go to the next task.
    • To finish setting up the feed later, click Save or Save and Close.
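Taken together, the matched-record and unmatched-record options behave like a one-way sync. A minimal Python sketch, with hypothetical field names and "Status" standing in for the values list field:

```python
def sync(source_rows, target_records, key="ID", archive_value="Archived"):
    """Create new records, update matched ones, and flag unmatched targets."""
    by_key = {r[key]: r for r in target_records}
    for row in source_rows:
        if row[key] in by_key:
            by_key[row[key]].update(row)      # update existing record
        else:
            target_records.append(dict(row))  # create a new record
    source_keys = {row[key] for row in source_rows}
    for record in target_records:
        if record[key] not in source_keys:
            record["Status"] = archive_value  # set a value from a values list
    return target_records

targets = [{"ID": 1, "Status": "Current"}, {"ID": 2, "Status": "Current"}]
print(sync([{"ID": 2}, {"ID": 3}], targets))
# [{'ID': 1, 'Status': 'Archived'}, {'ID': 2, 'Status': 'Current'}, {'ID': 3}]
```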

Task 11: Define optimization and notification settings

Use this task to configure email and job status notifications, and whether to optimize related calculations until after the data feed completes.

  1. Go to the Run Configuration tab > Post Processing section.
  2. (Optional) Select whether to delay running calculations until the data feed completes and to condense calculations for optimal processing. If this option is enabled, the optimized calculation jobs run after the data feed completes, which improves the performance of both data feed processing and calculations.
  3. Note: Condense jobs that are in the queue during an upgrade fail after the upgrade. Verify that there are no Condense jobs in the queue before upgrading.

  4. (Optional) Specify whether to send an email notification when records are created or updated and when the job status changes.

    The following options are available:

    • Send email notifications when the data feed publishes and updates records: Select whether the data feed triggers notification emails when records are published or updated. If notifications are not enabled in the selected target application, no notification emails are sent when the data feed runs.

    • Send job status notifications to selected users when job status changes: Select whether job status notifications are sent to selected users or groups. You can also select email addresses to receive job status notifications. If selected, job status notifications show whether a job succeeded or failed to run.

  5. To save the data feed, click Save or Save and Close.