Delete user profiles in bulk

You can delete user profiles in bulk by using the Import job definition. Although this is the import feature, the job's logic in this case deletes the matched user profiles: the import job definition handles importing, updating, and, here, removing unwanted or inactive user profiles. The section below explains how to delete profiles in bulk via the ReachFive Console.

How does it work? 🤔

The import job definition matches the user profiles on your ReachFive account by their unique ID.

This can be:

  • email

  • phone_number

  • ReachFive ID

  • external ID

A boolean, forDeletion, is then passed to the import job definition to let it know that the updates occurring in this job are, in fact, removing the matched users.
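As a minimal sketch (the file name and identifier values here are hypothetical), a newline-delimited JSON file for a bulk-deletion job only needs the unique identifier used to match each profile:

```shell
# Hypothetical deletion file: one JSON object per profile, matched here by email.
# The forDeletion flag itself is set on the job definition, not inside the file.
cat > users_to_delete.json <<'EOF'
{"email": "inactive-user@example.com"}
{"email": "old-account@example.com"}
EOF

# Each line is one profile queued for deletion.
wc -l < users_to_delete.json
```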

File limitations

We support files of up to 1 million profiles. That’s a lot. If your file has more than that, split it into multiple files, run a job for the first one, and wait for it to finish before running the next.
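One way to stay under the limit is to split the source file before uploading. A sketch with GNU coreutils (file names are hypothetical, and a tiny chunk size is used for illustration; in practice you would use -l 1000000):

```shell
# Five profiles, chunks of 2 lines for demonstration purposes only.
printf '%s\n' u1 u2 u3 u4 u5 > all_users.txt
split -l 2 all_users.txt chunk_

# Produces chunk_aa (u1,u2), chunk_ab (u3,u4), chunk_ac (u5);
# each chunk can then be run as its own import job, one after the other.
ls chunk_*
```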

Delete user profiles from console

You may wish to remove unwanted or inactive user profiles from your system. Do this by removing the profiles via the ReachFive Console.


Prerequisites

  • You must have access to the ReachFive Console.

  • You must have a Developer, Manager, or Administrator role.

  • You must have the Import Jobs feature enabled.

Multi-threaded imports

Our import jobs are multi-threaded. This means that many profiles are imported simultaneously. Therefore, it’s important that you ensure all profiles are unique; otherwise, you may get duplicate profiles.

We strongly encourage you to check the following fields to ensure each profile is unique:

  • email

  • external_id

  • custom_identifier

  • phone_number (if SMS feature is enabled)

See the User Profile for more on these fields.
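Before launching a job, you can check the file for duplicates yourself. A sketch over a hypothetical CSV, checking the email column:

```shell
# Hypothetical import file with a duplicated email in column 2.
cat > dup_check.csv <<'EOF'
external_id;email
1;a@example.com
2;b@example.com
3;a@example.com
EOF

# Print any email that appears more than once (header row excluded).
cut -d';' -f2 dup_check.csv | tail -n +2 | sort | uniq -d
```

Any address printed by the last command would produce duplicate profiles if imported as-is.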


The instructions below apply to both creating and editing an import job definition.

If editing an existing import job, be sure to select the edit icon instead of creating a new definition.
  1. Go to Settings → Import definitions.

  2. Select New definition.

  3. Under General, give the import job a name and description. Don’t forget to Enable the job.

  4. Under Source, choose the protocol you wish to use to import the file.

    • SFTP

    • S3

    • GCS

    SFTP

    1. Specify the Server host for the secure FTP site.

    2. Specify the Server port.

    3. Under Authentication method, choose the authentication method type:

      Username and password:

      1. Enter the Username for the server.

      2. Enter the Password for the server.


      Username and private key:

      1. Enter the Username for the server.

      2. Enter the OpenSSH private key.

    4. Specify the Path where the import file is located.



    S3

    1. Specify the URL for the S3 bucket.

    2. Specify the Bucket name.

    3. Enter the Region for the server.

    4. Enter the Access key for AWS.

    5. Enter the Secret key for AWS.

    6. Specify the Path where the import file is located.



    GCS

    1. Specify the Project ID for the Google Cloud Storage.

    2. Specify the App name.

    3. Enter the User name for the server.

    4. Specify the Bucket name.

    5. Enter the Credentials in JSON format.

    6. Specify the Path where the import file is located.



  5. If you are importing files with additional encryption:

    1. Select Encrypt.

    2. Enter your password that is used as part of the encryption.

    3. Specify the number of PBKDF2 iterations you used.

      PBKDF2 encryption command
      openssl aes-256-cbc -salt -pbkdf2 -iter 10000 -in file_to_import -out file_to_import.enc

      See encrypting an import file for more details.
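To sanity-check your encryption settings before uploading, you can round-trip a file locally with the same command (the file names and passphrase below are hypothetical; -iter must match the iteration count entered in the console):

```shell
PASSPHRASE='my-secret-passphrase'
printf 'external_id;email\n1;user@example.com\n' > file_to_import

# Encrypt with the same parameters the job will use to decrypt.
openssl aes-256-cbc -salt -pbkdf2 -iter 10000 \
  -in file_to_import -out file_to_import.enc -pass pass:"$PASSPHRASE"

# Decrypt locally and confirm the original content is restored.
openssl aes-256-cbc -d -salt -pbkdf2 -iter 10000 \
  -in file_to_import.enc -out file_check -pass pass:"$PASSPHRASE"
cmp file_to_import file_check && echo "round-trip OK"
```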

  6. Under Schedule, if desired, use a cron expression for scheduling the job.
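For example (an illustrative schedule, not a required value), a standard five-field cron expression that runs the job every day at 02:00:

```
0 2 * * *
```

The fields are, in order: minute, hour, day of month, month, and day of week.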

  7. Under File format, select the file format type you wish to import. This will be either JSON or CSV.

    • JSON

    • CSV

    JSON

    1. Choose the Encoding standard for your JSON file.

      Example JSON import file
      {
        "external_id": "1",
        "email": ""
      } \n (1)
      {
        "email": "",
        "name": "Joe",
        "gender": "M",
        "custom_fields": { (2)
            "has_loyalty_card": true
        },
        "consents": { (3)
          "exampleConsent": {
            "date": "2021-11-23T11:42:40.858Z",
            "consent_version": {
              "language": "fr",
              "version_id": 1
            },
            "granted": true,
            "consent_type": "opt-in",
            "reporter": "managed"
          }
        }
      }
    1 Each profile must be separated by a line feed character (\n).
    2 Import custom_fields as an object containing a series of fields.
    3 Import consents as an object or as a flattened field with the format consents.<consent>.<parameter> shown below.
      {
        "consents.cgu.date": "2021-09-03T19:08:01Z",
        "consents.cgu.granted": true,
        "consents.cgu.consent_version.version_id": 2,
        "consents.cgu.consent_version.language": "fr",
        "consents.cgu.consent_type": "opt-in",
        "consents.cgu.reporter": "managed"
      }
    CSV

    1. Choose the Encoding standard for your CSV file.

    2. Enter your Delimiter. The default is ;.

    3. Enter your Quote char. The default is ".

    4. Enter your Escape character. The default is \.
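As a sketch, a CSV import file that matches the default delimiter, quote, and escape settings above (all values hypothetical):

```shell
# Default settings: delimiter ';', quote '"', escape '\'.
cat > users_import.csv <<'EOF'
external_id;email;name
1;user-one@example.com;"Joe"
2;user-two@example.com;"Anne; the second"
EOF

# The quotes in row 2 protect the delimiter inside the value.
head -n 1 users_import.csv
```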

  8. If desired, enable the End Job Notification Webhook.

    For more information on the webhook, see the End Job Notification Webhook page.

Job reports

Job reports provide a snapshot of the different export and import jobs that you have run. Information such as the ID, definition, job type, progress, status, and last run time is provided so you can get a quick look at the latest jobs. To find more information about a job, you can view the job logs.

Only job reports run in the last 6 months are displayed on the Job reports page in the ReachFive Console. Older jobs are automatically deleted.

To access job reports:

  1. Open your ReachFive Console.

  2. Go to Settings → Job reports.


    The following statuses are possible for job reports:

    • FAILURE which indicates the job has failed. Check out the logs for more information.

    • WAITING which indicates the job has yet to finish.

    • SUCCESS which indicates the job was successfully run.

    For Bulk deletion jobs:

    • If the job runs completely through the file, the Succeeded: <N> and Failed: <N> counts displayed on the Job Report accurately reflect the number of profiles that were and weren’t successfully deleted.

    • If the job is stopped during execution, the Succeeded: <N> and Failed: <N> counts displayed on the Job Report are only an estimate of those numbers.

Job logs

To view more information on the job itself or download the logs, click Show logs on the far right-hand side of the table.

In the job logs, you can see the information Level, the Content of the information, and the Date on which it occurred.

Category Description Example

Level
  Denotes the level of the log information.

  Possible values:

  • ERROR indicates an error has occurred while running the job, typically related to incorrect credentials or corrupted files (CSV/JSON).

  • WARNING indicates there is something incorrect about the log, typically related to syntax or duplication.

  • LOG indicates standard log content, typically denoting profiles, or the operation itself (export/import).

Content
  Gives a description of the log information.

  Example: "Export profiles from beginning"

Date
  The timestamp for the log information.

  Example: 2023-04-01 19:00:06

Download logs

You can download full logs or just the errors by clicking the desired dropdown and choosing the format for the download.

Example JSON download
{"Level":"LOG","Content":"Export profiles from beginning","Date":"2023-04-02T00:00:00.899Z"}
{"Level":"LOG","Content":"Total profiles to export: 47","Date":"2023-04-02T00:00:01.296Z"}
{"Level":"LOG","Content":"Uploading to path: /uploads/weekly-export.csv","Date":"2023-04-02T00:00:01.298Z"}
{"Level":"ERROR","Content":"Error occurred: Invalid SFTP credentials.","Date":"2023-04-02T00:00:06.991Z"} (1)
1 Shows where the ERROR occurred and gives details on what the error was.
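Since the download is newline-delimited JSON, you can filter it with standard tools. A sketch over a hypothetical downloaded file:

```shell
# Hypothetical downloaded log file (one JSON object per line).
cat > job_logs.ndjson <<'EOF'
{"Level":"LOG","Content":"Export profiles from beginning","Date":"2023-04-02T00:00:00.899Z"}
{"Level":"ERROR","Content":"Error occurred: Invalid SFTP credentials.","Date":"2023-04-02T00:00:06.991Z"}
EOF

# Keep only the ERROR entries.
grep '"Level":"ERROR"' job_logs.ndjson
```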

Export logs

You can export logs for a specific job using the Management API’s Export job logs endpoint.