Embed text with pretrained TensorFlow models

This tutorial shows you how to generate NNLM, SWIVEL, and BERT text embeddings in BigQuery by using pretrained TensorFlow models. A text embedding is a dense vector representation of a piece of text such that if two pieces of text are semantically similar, then their respective embeddings are close together in the embedding vector space.
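
Closeness in the embedding space is typically measured with a metric such as cosine distance, which BigQuery exposes through the ML.DISTANCE function. The following query is only an illustration with made-up three-dimensional vectors (the models in this tutorial produce 20- to 768-dimensional embeddings); it returns a small distance for the similar pair and a larger distance for the dissimilar pair:

-- Illustrative only: cosine distance between hand-picked vectors.
-- A value near 0 means that the vectors, and the texts they represent, are similar.
SELECT
  ML.DISTANCE([0.1, 0.8, 0.3], [0.12, 0.79, 0.28], 'COSINE') AS similar_pair,
  ML.DISTANCE([0.1, 0.8, 0.3], [-0.9, 0.05, -0.4], 'COSINE') AS dissimilar_pair;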

The NNLM, SWIVEL, and BERT models

The NNLM, SWIVEL, and BERT models vary in size, accuracy, scalability, and cost. Use the following table to help you determine which model to use:

+--------+------------+---------------------+--------------------------------------------------------+----------------------------------------------------------+
| Model  | Model size | Embedding dimension | Use case                                               | Description                                              |
+--------+------------+---------------------+--------------------------------------------------------+----------------------------------------------------------+
| NNLM   | <150MB     | 50                  | Short phrases, news, tweets, reviews                   | Neural Network Language Model                            |
| SWIVEL | <150MB     | 20                  | Short phrases, news, tweets, reviews                   | Submatrix-wise Vector Embedding Learner                  |
| BERT   | ~200MB     | 768                 | Short phrases, news, tweets, reviews, short paragraphs | Bidirectional Encoder Representations from Transformers  |
+--------+------------+---------------------+--------------------------------------------------------+----------------------------------------------------------+

In this tutorial, the NNLM and SWIVEL models are imported TensorFlow models, and the BERT model is a remote model on Vertex AI.

Required permissions

  • To create the dataset, you need the bigquery.datasets.create Identity and Access Management (IAM) permission.

  • To create the bucket, you need the storage.buckets.create IAM permission.

  • To upload the model to Cloud Storage, you need the storage.objects.create and storage.objects.get IAM permissions.

  • To create the connection resource, you need the following IAM permissions:

    • bigquery.connections.create
    • bigquery.connections.get
  • To load the model into BigQuery ML, you need the following IAM permissions:

    • bigquery.jobs.create
    • bigquery.models.create
    • bigquery.models.getData
    • bigquery.models.updateData
  • To run inference, you need the following IAM permissions:

    • bigquery.tables.getData on the object table
    • bigquery.models.getData on the model
    • bigquery.jobs.create

Costs

In this document, you use the following billable components of Google Cloud:

  • BigQuery: You incur costs for the queries that you run in BigQuery.
  • BigQuery ML: You incur costs for the model that you create and the inference that you perform in BigQuery ML.
  • Cloud Storage: You incur costs for the objects that you store in Cloud Storage.
  • Vertex AI: If you follow the instructions for generating the BERT model, then you incur costs for deploying the model to an endpoint.

To generate a cost estimate based on your projected usage, use the pricing calculator. New Google Cloud users might be eligible for a free trial.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Make sure that billing is enabled for your Google Cloud project.

  4. Enable the BigQuery, BigQuery Connection, and Vertex AI APIs.

    Enable the APIs

Create a dataset

To create a dataset named tf_models_tutorial to store the models that you create, select one of the following options:

SQL

Use the CREATE SCHEMA statement:

  1. In the Google Cloud console, go to the BigQuery page.

    Go to BigQuery

  2. In the query editor, enter the following statement:

    CREATE SCHEMA `PROJECT_ID.tf_models_tutorial`;
    

    Replace PROJECT_ID with your project ID.

  3. Click Run.

For more information about how to run queries, see Run an interactive query.

bq

  1. In the Google Cloud console, activate Cloud Shell.

    Activate Cloud Shell

  2. To create the dataset, run the bq mk command:

    bq mk --dataset --location=us PROJECT_ID:tf_models_tutorial
    

    Replace PROJECT_ID with your project ID.
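
Either option creates the same dataset. To optionally confirm that the dataset exists, you can query INFORMATION_SCHEMA. The following check is a sketch; it assumes the dataset is in the US multi-region, as in the bq command above:

SELECT schema_name
FROM `region-us`.INFORMATION_SCHEMA.SCHEMATA
WHERE schema_name = 'tf_models_tutorial';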

Generate and upload a model to Cloud Storage

For more detailed instructions on generating text embeddings using pretrained TensorFlow models, see the Colab notebook. Otherwise, select one of the following models:

NNLM

  1. Install the bigquery-ml-utils library using pip:

    pip install bigquery-ml-utils
    
  2. Generate an NNLM model. The following Python code loads an NNLM model from TensorFlow Hub and prepares it for BigQuery:

    from bigquery_ml_utils import model_generator
    
    # Establish an instance of TextEmbeddingModelGenerator.
    text_embedding_model_generator = model_generator.TextEmbeddingModelGenerator()
    
    # Generate an NNLM model.
    text_embedding_model_generator.generate_text_embedding_model('nnlm', OUTPUT_MODEL_PATH)
    

    Replace OUTPUT_MODEL_PATH with a path to a local folder where you can temporarily store the model.

  3. Optional: Print the generated model's signature:

    import tensorflow as tf
    
    reload_embedding_model = tf.saved_model.load(OUTPUT_MODEL_PATH)
    print(reload_embedding_model.signatures["serving_default"])
    
  4. To copy the generated model from your local folder to a Cloud Storage bucket, use the Google Cloud CLI:

    gcloud storage cp OUTPUT_MODEL_PATH gs://BUCKET_PATH/nnlm_model --recursive
    

    Replace BUCKET_PATH with the name of the Cloud Storage bucket to which you are copying the model.

SWIVEL

  1. Install the bigquery-ml-utils library using pip:

    pip install bigquery-ml-utils
    
  2. Generate a SWIVEL model. The following Python code loads a SWIVEL model from TensorFlow Hub and prepares it for BigQuery:

    from bigquery_ml_utils import model_generator
    
    # Establish an instance of TextEmbeddingModelGenerator.
    text_embedding_model_generator = model_generator.TextEmbeddingModelGenerator()
    
    # Generate a SWIVEL model.
    text_embedding_model_generator.generate_text_embedding_model('swivel', OUTPUT_MODEL_PATH)
    

    Replace OUTPUT_MODEL_PATH with a path to a local folder where you can temporarily store the model.

  3. Optional: Print the generated model's signature:

    import tensorflow as tf
    
    reload_embedding_model = tf.saved_model.load(OUTPUT_MODEL_PATH)
    print(reload_embedding_model.signatures["serving_default"])
    
  4. To copy the generated model from your local folder to a Cloud Storage bucket, use the Google Cloud CLI:

    gcloud storage cp OUTPUT_MODEL_PATH gs://BUCKET_PATH/swivel_model --recursive
    

    Replace BUCKET_PATH with the name of the Cloud Storage bucket to which you are copying the model.

BERT

  1. Install the bigquery-ml-utils library using pip:

    pip install bigquery-ml-utils
    
  2. Generate a BERT model. The following Python code loads a BERT model from TensorFlow Hub and prepares it for BigQuery:

    from bigquery_ml_utils import model_generator
    
    # Establish an instance of TextEmbeddingModelGenerator.
    text_embedding_model_generator = model_generator.TextEmbeddingModelGenerator()
    
    # Generate a BERT model.
    text_embedding_model_generator.generate_text_embedding_model('bert', OUTPUT_MODEL_PATH)
    

    Replace OUTPUT_MODEL_PATH with a path to a local folder where you can temporarily store the model.

  3. Optional: Print the generated model's signature:

    import tensorflow as tf
    
    reload_embedding_model = tf.saved_model.load(OUTPUT_MODEL_PATH)
    print(reload_embedding_model.signatures["serving_default"])
    
  4. To copy the generated model from your local folder to a Cloud Storage bucket, use the Google Cloud CLI:

    gcloud storage cp OUTPUT_MODEL_PATH gs://BUCKET_PATH/bert_model --recursive
    

    Replace BUCKET_PATH with the name of the Cloud Storage bucket to which you are copying the model.

Load the model into BigQuery

Select one of the following models:

NNLM

Use the CREATE MODEL statement:

  1. In the Google Cloud console, go to the BigQuery page.

    Go to BigQuery

  2. In the query editor, enter the following statement:

    CREATE OR REPLACE MODEL tf_models_tutorial.nnlm_model
    OPTIONS (
      model_type = 'TENSORFLOW',
      model_path = 'gs://BUCKET_PATH/nnlm_model/*');
    

    Replace BUCKET_PATH with the name of the bucket to which you previously copied the model.

  3. Click Run.

For more information about how to run queries, see Run an interactive query.
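
Optionally, you can confirm that the imported model serves predictions with a quick single-row test. The following query is a sketch; the input column is named content to match the model's serving signature, the same alias that the full inference example later in this tutorial uses:

SELECT *
FROM
  ML.PREDICT(
    MODEL `tf_models_tutorial.nnlm_model`,
    (SELECT 'This is a quick smoke test.' AS content));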

SWIVEL

Use the CREATE MODEL statement:

  1. In the Google Cloud console, go to the BigQuery page.

    Go to BigQuery

  2. In the query editor, enter the following statement:

    CREATE OR REPLACE MODEL tf_models_tutorial.swivel_model
    OPTIONS (
      model_type = 'TENSORFLOW',
      model_path = 'gs://BUCKET_PATH/swivel_model/*');
    

    Replace BUCKET_PATH with the name of the bucket to which you previously copied the model.

  3. Click Run.

For more information about how to run queries, see Run an interactive query.

BERT

To load the BERT model into BigQuery, import the BERT model to Vertex AI, deploy the model to a Vertex AI endpoint, create a connection, and then create a remote model in BigQuery.

To import the BERT model to Vertex AI, follow these steps:

  1. In the Google Cloud console, go to the Vertex AI Model registry page.

    Go to Model registry

  2. Click Import, and then do the following:

    • For Name, enter BERT.
    • For Region, select a region that matches your Cloud Storage bucket's region.
  3. Click Continue, and then do the following:

    • For Model framework version, select 2.8.
    • For Model artifact location, enter the path to the Cloud Storage bucket where you stored the model file. For example, gs://BUCKET_PATH/bert_model.
  4. Click Import. After the import is complete, your model appears on the Model registry page.

To deploy the BERT model to a Vertex AI endpoint and connect it to BigQuery, follow these steps:

  1. In the Google Cloud console, go to the Vertex AI Model registry page.

    Go to Model registry

  2. Click on the name of your model.

  3. Click Deploy & test.

  4. Click Deploy to endpoint.

  5. For Endpoint name, enter bert_model_endpoint.

  6. Click Continue.

  7. Select your compute resources.

  8. Click Deploy.

  9. Create a BigQuery Cloud resource connection and grant access to the connection's service account.

To create a remote model based on the Vertex AI endpoint, use the CREATE MODEL statement:

  1. In the Google Cloud console, go to the BigQuery page.

    Go to BigQuery

  2. In the query editor, enter the following statement:

    CREATE OR REPLACE MODEL tf_models_tutorial.bert_model
    INPUT(content STRING)
    OUTPUT(embedding ARRAY<FLOAT64>)
    REMOTE WITH CONNECTION `PROJECT_ID.CONNECTION_LOCATION.CONNECTION_ID`
    OPTIONS (
      ENDPOINT = "https://ENDPOINT_LOCATION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/ENDPOINT_LOCATION/endpoints/ENDPOINT_ID");
    

    Replace the following:

    • PROJECT_ID: the project ID
    • CONNECTION_LOCATION: the location of your BigQuery connection
    • CONNECTION_ID: the ID of your BigQuery connection

      When you view the connection details in the Google Cloud console, this is the value in the last section of the fully qualified connection ID that is shown in Connection ID. For example, for projects/myproject/locations/connection_location/connections/myconnection, the connection ID is myconnection.

    • ENDPOINT_LOCATION: the location of your Vertex AI endpoint. For example: "us-central1".
    • ENDPOINT_ID: the ID of your model endpoint

  3. Click Run.

For more information about how to run queries, see Run an interactive query.

Generate text embeddings

In this section, you use the ML.PREDICT() inference function to generate text embeddings of the review column from the public dataset bigquery-public-data.imdb.reviews. The query limits the table to 500 rows to reduce the amount of data processed.

NNLM

SELECT
  *
FROM
  ML.PREDICT(
    MODEL `tf_models_tutorial.nnlm_model`,
    (
    SELECT
      review AS content
    FROM
      `bigquery-public-data.imdb.reviews`
    LIMIT
      500)
  );

The result is similar to the following:

+-----------------------+----------------------------------------+
| embedding             | content                                |
+-----------------------+----------------------------------------+
|  0.08599445223808289  | Isabelle Huppert must be one of the... |
| -0.04862852394580841  |                                        |
| -0.017750458791851997 |                                        |
|  0.8658871650695801   |                                        |
| ...                   |                                        |
+-----------------------+----------------------------------------+

SWIVEL

SELECT
  *
FROM
  ML.PREDICT(
    MODEL `tf_models_tutorial.swivel_model`,
    (
    SELECT
      review AS content
    FROM
      `bigquery-public-data.imdb.reviews`
    LIMIT
      500)
  );

The result is similar to the following:

+----------------------+----------------------------------------+
| embedding            | content                                |
+----------------------+----------------------------------------+
|  2.5952553749084473  | Isabelle Huppert must be one of the... |
| -4.015787601470947   |                                        |
|  3.6275434494018555  |                                        |
| -6.045154333114624   |                                        |
| ...                  |                                        |
+----------------------+----------------------------------------+

BERT

SELECT
  *
FROM
  ML.PREDICT(
    MODEL `tf_models_tutorial.bert_model`,
    (
    SELECT
      review AS content
    FROM
      `bigquery-public-data.imdb.reviews`
    LIMIT
      500)
  );

The result is similar to the following:

+--------------+---------------------+----------------------------------------+
| embedding    | remote_model_status | content                                |
+--------------+---------------------+----------------------------------------+
| -0.694072425 | null                | Isabelle Huppert must be one of the... |
|  0.439208865 |                     |                                        |
|  0.99988997  |                     |                                        |
| -0.993487895 |                     |                                        |
| ...          |                     |                                        |
+--------------+---------------------+----------------------------------------+
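
In each result, the embedding column is an ARRAY<FLOAT64> whose length equals the model's embedding dimension, for example 20 values per row for SWIVEL. A common next step is to store the embeddings in a table and compare them with a distance metric. The following queries are a sketch rather than part of the tutorial: they assume the SWIVEL model that you created earlier and use a hypothetical destination table named tf_models_tutorial.review_embeddings.

-- Persist the embeddings so that you don't recompute them for every query.
CREATE OR REPLACE TABLE tf_models_tutorial.review_embeddings AS
SELECT
  content,
  embedding
FROM
  ML.PREDICT(
    MODEL `tf_models_tutorial.swivel_model`,
    (
    SELECT
      review AS content
    FROM
      `bigquery-public-data.imdb.reviews`
    LIMIT
      500)
  );

-- Find the five stored reviews whose embeddings are closest, by cosine
-- distance, to an arbitrary reference review.
SELECT
  b.content,
  ML.DISTANCE(a.embedding, b.embedding, 'COSINE') AS cosine_distance
FROM
  tf_models_tutorial.review_embeddings AS a,
  tf_models_tutorial.review_embeddings AS b
WHERE
  a.content = (SELECT content FROM tf_models_tutorial.review_embeddings LIMIT 1)
  AND a.content != b.content
ORDER BY
  cosine_distance
LIMIT 5;

Ordering by ascending cosine distance returns the most semantically similar reviews first.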

Clean up

  1. In the Google Cloud console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.
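
Alternatively, to keep the project and delete only the resources that this tutorial created, you can drop the BigQuery dataset. The following statement is a sketch of that cleanup; it deletes the tf_models_tutorial dataset together with the models in it. Delete the Cloud Storage bucket, and, if you created the BERT remote model, the connection, the Vertex AI endpoint, and the uploaded Vertex AI model separately.

DROP SCHEMA IF EXISTS tf_models_tutorial CASCADE;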