Export a SQL file

This page describes how to use the pg_dump tool to export an AlloyDB database to a plain-text SQL file that you can later import using the psql tool.

The procedure to perform the export involves these tasks:

  1. Create a Cloud Storage bucket to store the SQL file in.

  2. Prepare a client host to perform the export operation.

  3. Export the database to the SQL file.

  4. Clean up the resources created to perform the procedure.

Before you begin

  • You must have the Owner (roles/owner) or Editor (roles/editor) basic IAM role in the Google Cloud project you are using, or you must have these predefined IAM roles:
    • AlloyDB Admin (roles/alloydb.admin) or AlloyDB Viewer (roles/alloydb.viewer)
    • Storage Admin (roles/storage.admin)
    • Compute Instance Admin (v1) (roles/compute.instanceAdmin.v1)

Create a Cloud Storage bucket

Create a regional bucket that uses the Standard storage class, in the project and region where your AlloyDB database is located.
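You can create such a bucket from the command line with the gcloud CLI; a minimal sketch, where the bucket name and region shown are hypothetical placeholders:

```shell
# Hypothetical values -- substitute a globally unique bucket name and
# the region where your AlloyDB cluster runs.
BUCKET_NAME=my-alloydb-exports
REGION=us-central1

# Create a regional bucket with the Standard storage class.
gcloud storage buckets create gs://${BUCKET_NAME} \
    --location=${REGION} \
    --default-storage-class=STANDARD
```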

Prepare a client host

To prepare a client host to perform the export operation, you create a Compute Engine VM that can connect to the AlloyDB primary instance where your database is located, and install the pg_dump tool and the Google Cloud CLI on that VM.

  1. Follow the instructions in Connect a psql client to an instance to create a Compute Engine VM with the proper connectivity and the pg_dump tool installed. When following these instructions, make sure to allocate enough local storage on the Compute Engine VM to accommodate the SQL file you will create.

  2. Install the gcloud CLI to provide command-line access to create the SQL file in the Cloud Storage bucket.
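If the gcloud CLI is not already present on the VM (many Compute Engine images ship with it preinstalled), one documented way to install and initialize it is the interactive installer; a sketch:

```shell
# Download and run the interactive installer, then restart the shell
# so that the gcloud command is on the PATH.
curl https://sdk.cloud.google.com | bash
exec -l $SHELL

# Authorize the CLI and choose a default project.
gcloud init
```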

Export the database

To export the database to a SQL file, you get the IP address of the AlloyDB primary instance where your database is located and then use the pg_dump tool.

  1. Get the IP address of the AlloyDB primary instance where your database is located by viewing its details.
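Besides viewing the instance details in the console, you can read the address with gcloud; a sketch with hypothetical instance, cluster, and region names (this assumes the instance resource exposes the address in its `ipAddress` field):

```shell
# Hypothetical names -- replace with your instance, cluster, and region.
gcloud alloydb instances describe my-primary \
    --cluster=my-cluster \
    --region=us-central1 \
    --format="value(ipAddress)"
```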
  2. SSH into the Compute Engine VM.

    Console

    1. In the Google Cloud console, go to the VM instances page.

      Go to VM instances

    2. In the list of virtual machine instances, click SSH in the row of the instance you created.


    gcloud

    Use the gcloud compute ssh command to connect to the instance you created.

    gcloud compute ssh --project=PROJECT_ID --zone=ZONE VM_NAME

    Replace the following:

    • PROJECT_ID: The ID of the project that contains the instance.
    • ZONE: The name of the zone in which the instance is located.
    • VM_NAME: The name of the instance.
  3. Export the database to a SQL file:
    pg_dump -h IP_ADDRESS -U postgres -F plain \
      DB_NAME > SQL_FILE_NAME
    

    You will be prompted to enter the password of the postgres user.

    Replace the following:

    • IP_ADDRESS: The IP address of the primary instance.
    • DB_NAME: The name of the database you want to export.
    • SQL_FILE_NAME: A file name for the SQL file to create on the client host's local file system.

    The -F plain flag sets the output format to a plain-text SQL script file that you can later import using the psql tool.

    The above shows a simple pg_dump command. For information about the wide range of options the command supports, see the pg_dump reference in the PostgreSQL documentation.
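As a concrete illustration with hypothetical values filled in, the command might look like the following; the optional --no-owner flag omits ownership commands that often fail when the file is restored into a different instance:

```shell
# Hypothetical values: instance IP 10.1.2.3, database mydb.
pg_dump -h 10.1.2.3 -U postgres -F plain --no-owner \
    mydb > mydb-export.sql
```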

  4. Copy the SQL file to the Cloud Storage bucket you created earlier:
    gcloud storage cp SQL_FILE_NAME gs://BUCKET_NAME
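After the copy finishes, you can confirm that the object landed in the bucket; a quick check, reusing the same hypothetical names:

```shell
# Full listing (-L) prints the object's size and creation time.
gcloud storage ls -L gs://my-alloydb-exports/mydb-export.sql
```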

Clean up resources

After successfully exporting the SQL file, you can delete the Compute Engine VM you used during the export procedure.
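Deleting the VM can also be done from the command line; a sketch with hypothetical names (--quiet skips the confirmation prompt, so double-check the VM name first):

```shell
# Hypothetical values -- replace with your VM name, zone, and project.
gcloud compute instances delete my-export-vm \
    --zone=us-central1-a \
    --project=my-project \
    --quiet
```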