Import user profiles

You can import user profiles from the ReachFive Console. The instructions below guide you step by step through importing your users. The visual below shows a high-level flow of the process.

I just want to see a video

Skip ahead to a short video tutorial that shows you how to configure an import job using a CSV file on an SFTP server.

import jobs user flow

Import users from the console

Follow the instructions below to import user profiles from the ReachFive Console.

Not all User Profile Object fields can be imported. See the table on the User Profile data model page to see which fields can be imported.

Prerequisites

  • You must have access to the ReachFive Console.

  • You must have a Developer, Manager, or Administrator role.

  • You must have the Import Jobs feature enabled.

  • Unless you choose the Force data update from file option, the following applies to importing user profiles:

    • Updating an existing user: The updated_at value in the import file must be more recent than the one on the existing profile.

    • Importing a new user: The updated_at value is not mandatory.

Multi-threaded imports

Our import jobs are multi-threaded. This means that many profiles are imported simultaneously. Therefore, it’s important that you ensure all profiles are unique; otherwise, you may get duplicate profiles.

We strongly encourage you to check the following fields to ensure each profile is unique:

  • email

  • external_id

  • custom_identifier

  • phone_number (if SMS feature is enabled)

See the User Profile data model for more on these fields.
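For example, a quick pre-import check for duplicates in a CSV file could look like the sketch below (Python; the file name, delimiter, and column names are assumptions to adapt to your own export):

  example
  import csv
  from collections import Counter

  # Fields to check for uniqueness; phone_number only matters if the SMS feature is enabled.
  UNIQUE_FIELDS = ["email", "external_id", "custom_identifier", "phone_number"]

  # Adjust the file name and delimiter to match your import file.
  with open("profiles_import.csv", newline="", encoding="utf-8") as f:
      rows = list(csv.DictReader(f, delimiter=","))

  for field in UNIQUE_FIELDS:
      values = [row[field] for row in rows if row.get(field)]
      duplicates = [value for value, count in Counter(values).items() if count > 1]
      if duplicates:
          print(f"Duplicate {field} values: {duplicates}")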

Managed vs Lite profiles

Separate import jobs

Whether you are importing new profiles or updating existing ones, managed profiles and lite profiles should be imported separately as different jobs. If you try to import a mixed group of managed and lite profiles, it could cause production issues.

Created and updated fields
  • If created_at is present in the import file, the value from the file is kept.

  • If created_at is not present in the import file, it is set to the execution date of the import.

  • If updated_at is present in the import file, the value from the file is kept, with a maximum tolerance of +10 minutes compared to the execution date. If the date exceeds this tolerance, it is brought back to the execution date +10 minutes.

  • If updated_at is not present in the import file, it is set to the execution date of the import.
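The updated_at tolerance rule can be pictured with the small sketch below (illustrative only, not ReachFive's implementation):

  example
  from datetime import datetime, timedelta, timezone

  def effective_updated_at(updated_at, execution_date):
      # Apply the +10 minute tolerance described above (illustration only).
      if updated_at is None:
          return execution_date  # missing value: the execution date is used
      return min(updated_at, execution_date + timedelta(minutes=10))

  execution = datetime(2021, 6, 4, 14, 0, tzinfo=timezone.utc)
  print(effective_updated_at(datetime(2021, 6, 4, 15, 30, tzinfo=timezone.utc), execution))
  # a value 90 minutes ahead of the execution date is brought back to execution date + 10 minutes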

Instructions

The instructions below apply to both creating and editing an import job definition.

If editing an existing import job, be sure to select the edit icon instead of creating a new definition.
  1. Go to Settings → Import definitions.

  2. Select New definition.

  3. Under General, give the import job a name and description. Don’t forget to Enable the job.

  4. Under Source, choose the protocol you wish to use to import the file.

    • SFTP

    • S3

    • GCS

    SFTP

    1. Specify the Server host for the secure FTP site.

    2. Specify the Server port.

    3. Under Authentication method, choose the authentication method type:

      Username and password:

      1. Enter the Username for the server.

      2. Enter the Password for the server.

      OpenSSH:

      1. Enter the Username for the server.

      2. Enter the OpenSSH private key.

        example
        -----BEGIN ENCRYPTED PRIVATE KEY-----
        MIIBpjBABgkqhkiG9w0BBQ0wMzAbBgkqhkiG9w0BBQwwDgQI5yNCu9T5SnsCAggA
        MBQGCCqGSIb3DQMHBAhJISTgOAxtYwSCAWDXK/a1lxHIbRZHud1tfRMR4ROqkmr4
        kVGAnfqTyGptZUt3ZtBgrYlFAaZ1z0wxnhmhn3KIbqebI4w0cIL/3tmQ6eBD1Ad1
        nSEjUxZCuzTkimXQ88wZLzIS9KHc8GhINiUu5rKWbyvWA13Ykc0w65Ot5MSw3cQc
        w1LEDJjTculyDcRQgiRfKH5376qTzukileeTrNebNq+wbhY1kEPAHojercB7d10E
        +QcbjJX1Tb1Zangom1qH9t/pepmV0Hn4EMzDs6DS2SWTffTddTY4dQzvksmLkP+J
        i8hkFIZwUkWpT9/k7MeklgtTiy0lR/Jj9CxAIQVxP8alLWbIqwCNRApleSmqtitt
        Z+NdsuNeTm3iUaPGYSw237tjLyVE6pr0EJqLv7VUClvJvBnH2qhQEtWYB9gvE1dS
        BioGu40pXVfjiLqhEKVVVEoHpI32oMkojhCGJs8Oow4bAxkzQFCtuWB1
        -----END ENCRYPTED PRIVATE KEY-----
    4. Specify the Path where the import file is located.

      For example

      <serverhost>/path-to-file/file.csv.

    S3

    1. Specify the URL for the S3 bucket.

    2. Specify the Bucket name.

    3. Enter the Region for the server.

    4. Enter the Access key for AWS.

    5. Enter the Secret key for AWS.

    6. Specify the Path where the import file is located.

      For example

      <serverhost>/path-to-file/file.csv.

    GCS

    1. Specify the Project ID for Google Cloud Storage.

    2. Specify the App name.

    3. Enter the User name for the server.

    4. Specify the Bucket name.

    5. Enter the Credentials in JSON format (an example of the expected shape follows these steps).

    6. Specify the Path where the import file is located.

      For example

      <serverhost>/path-to-file/file.csv.
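      The Credentials field expects a Google Cloud service account key in JSON format. Its general shape looks like the following; all values here are placeholders, and the exact set of fields comes from the key file Google generates for you:

      example
      {
        "type": "service_account",
        "project_id": "my-project-id",
        "private_key_id": "0123456789abcdef",
        "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
        "client_email": "import-job@my-project-id.iam.gserviceaccount.com",
        "client_id": "123456789012345678901",
        "token_uri": "https://oauth2.googleapis.com/token"
      }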

  5. If you are importing files with additional encryption:

    1. Select Encrypt.

    2. Enter the password that was used as part of the encryption.

    3. Specify the number of PBKDF2 iterations you used.

      PBKDF2 encryption command
      openssl aes-256-cbc -salt -pbkdf2 -iter 10000 -in file_to_import -out file_to_import.enc

      See encrypting an import file for more details.
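      To check locally that the password and iteration count you enter here match the ones used to encrypt the file, you can run the inverse openssl command on a copy of the file (a local sanity check only):

      PBKDF2 decryption command
      openssl aes-256-cbc -d -pbkdf2 -iter 10000 -in file_to_import.enc -out file_to_import.check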

  6. Under Schedule, if desired, use a cron expression for scheduling the job.
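    For example, with standard five-field cron syntax (confirm the exact syntax the Console expects; this expression is only an illustration), the following runs the job every day at 02:00:

      example
      0 2 * * *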

  7. Under File format, select the file format type you wish to import. This will be either JSON or CSV.

    • JSON

    • CSV

    JSON

    Choose the Encoding standard for your JSON file.

    {
      "external_id": "1",
      "email": "foo@gmail.com"
    } \n (1)
    {
      "email": "bar@gmail.com",
      "name": "Joe",
      "gender": "M",
      "custom_fields": { (2)
          "has_loyalty_card": true
      },
      "consents": { (3)
        "exampleConsent": {
          "date": "2021-11-23T11:42:40.858Z",
          "consent_version": {
            "language": "fr",
            "version_id": 1
          },
          "granted": true,
          "consent_type": "opt-in",
          "reporter": "managed"
        }
      },
      "addresses": [ (4)
        {
          "id": 0,
          "default": true,
          "address_type": "billing",
          "street_address": "10 rue Chaptal",
          "address_complement": "4 étage",
          "locality": "Paris",
          "postal_code": "75009",
          "region": "Île-de-France",
          "country": "France",
          "recipient": "Matthieu Winoc",
          "phone_number": "0723538943",
          "custom_fields": {
            "custom_field_example": "custom_field_example_value",
            "custom_field_example_2": 42,
          }
        }
      ]
    }
    1 Each profile must be separated by a line feed character (\n).
    2 Import custom_fields as an object containing a series of fields.
    3 Import consents as an object or as a flattened field with the format consents.<consent>.<parameter>, as shown below.
    ...
      "consents.cgu.date": "2021-09-03T19:08:01Z",
      "consents.cgu.granted": true,
      "consents.cgu.consent_version.version_id": 2,
      "consents.cgu.consent_version.language": "fr",
      "consents.cgu.consent_type": "opt-in",
      "consents.cgu.reporter": "managed"
    ...
    4 You can import addresses including custom address fields.
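
    If you generate the import file programmatically, a minimal sketch for producing profiles separated by a line feed could look like this (Python; the file name and profile values are only illustrations):

    example
    import json

    # Profile dictionaries built elsewhere; field names follow the example above.
    profiles = [
        {"external_id": "1", "email": "foo@gmail.com"},
        {"email": "bar@gmail.com", "name": "Joe", "gender": "M"},
    ]

    # Write one JSON object per profile, separated by a line feed (\n) as required.
    with open("profiles_import.json", "w", encoding="utf-8") as f:
        f.write("\n".join(json.dumps(p, ensure_ascii=False) for p in profiles))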

    When deleting profiles, only four fields are accepted:

    • id

    • external_id

    • email

    • phone_number

    Only one of the four fields should be present and have a value per line; this ensures the uniqueness of the targeted profile. If multiple fields have values on the same line, only one is used to target the profile to delete and the other values are discarded.

    The order of priority to choose the value is:

    1. id

    2. external_id

    3. email

    4. phone_number

    For example: If both id and email are present, only id is used.

    If any additional field is present on a line, the line is omitted with a warning, but the other valid lines are processed.
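    For illustration, a minimal delete file in this format could look like the following, with exactly one identifying field per line (the values are hypothetical):

    example
    { "email": "foo@gmail.com" }
    { "external_id": "42" }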
    CSV

    1. Choose the Encoding standard for your CSV file.

    2. Enter your Delimiter. The default is ;.

    3. Enter your Quote char. The default is ".

    4. Enter your Escape character. The default is \.

    email,name,gender,custom_fields.has_loyalty_card,consents.newsletter.consent_type,consents.newsletter.granted,consents.newsletter.date,consents.newsletter.reporter,addresses.0.custom_fields.custom_field_example
    bar@gmail.com,Joe,M,true,opt-in,true,2018-05-25T15:41:09.671Z,managed,custom_field_example_value
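    If you build the CSV programmatically, a minimal sketch could look like this (Python; the column names are copied from the example above, and the delimiter should match your Delimiter setting):

    example
    import csv

    # Column names follow the flattened format shown in the example above.
    fieldnames = [
        "email", "name", "gender",
        "custom_fields.has_loyalty_card",
        "consents.newsletter.granted", "consents.newsletter.date",
    ]
    rows = [{
        "email": "bar@gmail.com", "name": "Joe", "gender": "M",
        "custom_fields.has_loyalty_card": "true",
        "consents.newsletter.granted": "true",
        "consents.newsletter.date": "2018-05-25T15:41:09.671Z",
    }]

    with open("profiles_import.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames, delimiter=";")  # match the Delimiter setting
        writer.writeheader()
        writer.writerows(rows)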

    When deleting profiles, only four columns are accepted:

    • id

    • external_id

    • email

    • phone_number

    Only one of the four columns should contain a value per line; this ensures the uniqueness of the targeted profile. If multiple columns contain values, only one is used to target the profile to delete and the other values are discarded.

    The order of priority to choose the value is:

    1. id

    2. external_id

    3. email

    4. phone_number

    For example: If both id and email are present, only id is used.

    If any other column is in the file, the whole file is rejected.
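    For illustration, a minimal delete file in CSV form could contain a single identifying column (the values are hypothetical):

    example
    email
    foo@gmail.com
    bar@gmail.com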
  8. If desired, enable the End Job Notification Webhook.

    For more information on the webhook, see the End Job Notification Webhook page.
  9. Under Advanced:

    1. Select Import/Update for the Type.

      For the Delete type, see Delete user profiles in bulk.

    2. If desired, select Testing mode.

      import profiles test mode

      When you import profiles in Testing mode, ReachFive tests:

      • access to the server

      • access to the import file

      • whether the import file content is valid

      Testing mode does not make any changes to the database.
    3. If importing Lite profiles only, select Lite profile only.

      When you import Lite profiles only, you are not importing fully managed profiles, but rather the lite profiles registered with your site. You must import Lite profiles separately from managed profiles.

      Separate import jobs

      Whether you are importing new profiles or updating existing ones, managed profiles and lite profiles should be imported separately as different jobs. If you try to import a mixed group of managed and lite profiles, it could cause production issues.

      Created and updated fields
      • If created_at is present in the import file, the value from the file is kept.

      • If created_at is not present in the import file, it is set to the execution date of the import.

      • If updated_at is present in the import file, the value from the file is kept, with a maximum tolerance of +10 minutes compared to the execution date. If the date exceeds this tolerance, it is brought back to the execution date +10 minutes.

      • If updated_at is not present in the import file, it is set to the execution date of the import.

    4. If you want to force an update from file, select Force data update from file.

      Fields that have a value in the current ReachFive profile but are not present in the import file retain their current value. However, every field that has a value in the import file replaces the existing value in ReachFive.
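      As a rough mental model of this behaviour (a sketch only, not ReachFive's implementation; deleting a field by passing a null value is covered in the next section):

      example
      def merge_profile(existing: dict, imported: dict) -> dict:
          # Fields with a value in the import replace existing values;
          # fields absent from the import keep their current value.
          merged = dict(existing)
          merged.update({key: value for key, value in imported.items() if value is not None})
          return merged

      print(merge_profile({"name": "Joe", "gender": "M"}, {"name": "Joseph"}))
      # {'name': 'Joseph', 'gender': 'M'}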

Delete existing fields

If you want to delete an existing field, you need to pass a null value as part of the import file.

  • The updated_at value in the import file must be more recent than the one on the existing profile.

  • JSON

  • CSV

JSON

{
  "updated_at": "2021-06-04T14:16:34.658Z",
  "email": "bar@gmail.com",
  "name": "Joe",
  "family_name": null, (1)
  "gender": "M"
}
1 Pass null to delete the existing family_name field.
CSV

external_id,updated_at,email,name,family_name,gender
1,2021-06-04T14:16:34.658Z,bar@gmail.com,Joe,__null__,M (1)
1 Pass __null__ to delete the existing family_name field.

Video tutorial