How to upload data and documents to staging location


Before following this guide, ensure you have received AWS credentials that can be used to upload data into Prompt's environment.

To migrate data from an existing system into Prompt, new clients must format their data to match a target schema and upload all of their data and documents to a staging location in AWS. The data is then processed by a migration tool that runs in Prompt's AWS environment. The staging location is an Amazon S3 bucket, which is a simple key-value blob store.

Step 1: Prepare the data

The data to be uploaded includes a JSON data file and a collection of binary files. The JSON data file must conform to a target schema, and specifies the properties and configuration of various entities in the Prompt system, such as Documents, Departments, and Users. The binary files should be the policy documents that are to be migrated into Prompt.

Work out ahead of time which key each document will be stored under in the staging bucket, so that the key can be specified in the data file for later retrieval. If your documents are placed in a folder structure before uploading, the file path will form part of the S3 object key. For example, if a document is stored at:

./documents/procedures/Procedure-1052.pdf

Then uploading it to S3 will result in a key like:

prod-prompt-migrationtool-import/clientprefix/documents/procedures/Procedure-1052.pdf
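The mapping from local path to object key can be sketched in shell. This is illustrative only: clientprefix is a placeholder for the prefix provided to you, and the path is the example above.

```shell
# Sketch: derive the S3 location a local file will receive after `aws s3 sync`.
BUCKET="prod-prompt-migrationtool-import"
CLIENT_PREFIX="clientprefix"                          # placeholder: use your assigned prefix
LOCAL_PATH="./documents/procedures/Procedure-1052.pdf"

# Strip the leading "./" so the relative path becomes the key suffix.
echo "s3://${BUCKET}/${CLIENT_PREFIX}/${LOCAL_PATH#./}"
```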

To easily migrate the data, collect all the binary document files in a single directory on disk. You will use the AWS CLI to recursively search and upload all files in the directory. The JSON data file should also be placed in this directory so everything can be uploaded together.
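A layout that satisfies the step above might look like the following sketch. The directory and file names are hypothetical; the point is that the JSON data file and all document subfolders sit under one directory that will be synced as a whole.

```shell
# Hypothetical staging layout: one directory containing the data file and all documents.
mkdir -p migration-upload/documents/procedures
touch migration-upload/MigrationDataFile.json                    # the JSON data file
touch migration-upload/documents/procedures/Procedure-1052.pdf   # a policy document

# List everything that will be uploaded.
find migration-upload -type f | LC_ALL=C sort
```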

Step 2: Set up the AWS CLI

First, download and install the AWS CLI.

Then, run the following command to configure your local credentials:

aws configure --profile promptmigration

You will be prompted to enter your AWS Access Key ID and AWS Secret Access Key. These should have been provided to you. For the default region, use ap-southeast-2 (the Sydney region), and for the default output format, use json.
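The configure command prompts for each value in turn; a session looks roughly like this (the key values shown are placeholders, not real credentials):

```shell
$ aws configure --profile promptmigration
AWS Access Key ID [None]: AKIAXXXXXXXXXXXXXXXX
AWS Secret Access Key [None]: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Default region name [None]: ap-southeast-2
Default output format [None]: json
```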

Check your credentials by running the following command:

aws sts get-caller-identity --profile promptmigration

If you receive a response containing your UserId, then you have successfully configured the AWS CLI.
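A successful call returns a small JSON document. The exact values will differ (the user name shown here is hypothetical), but the shape of the response is:

```json
{
    "UserId": "AIDAXXXXXXXXXXXXXXXXX",
    "Account": "123456789012",
    "Arn": "arn:aws:iam::123456789012:user/clientprefix-migration"
}
```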

Step 3: Perform upload

Open a terminal and navigate to the directory containing the data file and documents. Run the following command to begin the upload, replacing clientprefix with the prefix provided to you:

aws s3 sync . s3://prod-prompt-migrationtool-import/clientprefix --acl bucket-owner-full-control --profile promptmigration

This command will recursively search through the current directory (.) and copy files into the S3 bucket. The command only uploads files that are new or have changed since the last sync with the S3 target location. If the operation stops for any reason, or if data needs to be corrected or otherwise modified, the command can be re-run without repeating a full upload of the data.
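To preview what the sync will do before transferring anything, the AWS CLI supports a --dryrun flag on the same command; it prints the copy operations that would be performed without executing them:

```shell
aws s3 sync . s3://prod-prompt-migrationtool-import/clientprefix \
    --acl bucket-owner-full-control \
    --profile promptmigration \
    --dryrun
```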

Step 4: Validate upload

Once the upload completes, validate its success by downloading one of the files that was uploaded, for example:

aws s3 cp s3://prod-prompt-migrationtool-import/clientprefix/MigrationDataFile.json ./ --profile promptmigration

This command will copy a file from the S3 bucket to the current directory (./). If the file downloads successfully, the upload worked.
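Alternatively, you can list the uploaded objects to confirm everything arrived. The ls command with --recursive walks the whole client prefix:

```shell
aws s3 ls s3://prod-prompt-migrationtool-import/clientprefix/ --recursive --profile promptmigration
```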
