Integrating Google Cloud data into cost management

Cost Management Service 1-latest

Learn how to add and configure your Google Cloud integration

Red Hat Customer Content Services

Abstract

Learn how to add a Google Cloud integration to cost management. Cost management is part of the Red Hat Insights portfolio of services. The Red Hat Insights suite of advanced analytical tools helps you to identify and prioritize impacts on your operations, security, and business.

Chapter 1. Integrating Google Cloud data into cost management

To add a Google Cloud account to cost management, you must configure your Google Cloud account to provide metrics, then add your Google Cloud account as an integration from the Red Hat Hybrid Cloud Console user interface.

Note

You must have a user with Cloud Administrator entitlements before you can add integrations to cost management.

Before you can add your Google Cloud account to cost management as a data integration, you must configure the following services on your Google Cloud account to allow cost management access to metrics:

  • A Google Cloud project for cost management.
  • A billing service account member with the correct role to export your data to Red Hat Hybrid Cloud Console.
  • A BigQuery dataset to contain the cost data.
  • A billing export that sends the cost data to your BigQuery dataset.

Because you will complete some of the following steps in the Google Cloud Console, and some steps in the cost management user interface, keep both applications open in a web browser.

Add your Google Cloud integration to cost management from the Integrations page.

Note

Because third-party products and documentation can change, instructions for configuring the third-party integrations provided are general and correct at the time of publishing. For the most up-to-date information, see the Google Cloud Platform documentation.

1.1. Adding your Google Cloud account as an integration

You can add your Google Cloud account as an integration. After adding a Google Cloud integration, the cost management application processes the cost and usage data from your Google Cloud account and makes it viewable.

Prerequisites

  • To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.

Procedure

  1. From Red Hat Hybrid Cloud Console, click the Settings icon > Integrations.
  2. On the Settings page, in the Cloud tab, click Add integration.
  3. In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click Next.
  4. Enter a name for your integration. Click Next.
  5. In the Select application step, select Cost management and click Next.

1.2. Creating a Google Cloud project

Create a Google Cloud project to gather and send your cost reports to cost management.

Prerequisites

  • Access to Google Cloud Console with resourcemanager.projects.create permission

Procedure

  1. In the Google Cloud Console, click IAM & Admin > Create a Project.
  2. Enter a Project name in the new page that appears and select your billing account.
  3. Select the Organization.
  4. Enter the parent organization in the Location box.
  5. Click Create.

Verification steps

  1. Navigate to the Google Cloud Console Dashboard.
  2. Verify that the project is in the menu bar.
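
If you script your Google Cloud setup, the following sketch creates an equivalent project with the google-cloud-resource-manager Python client. The project ID and organization number are placeholders for your own values.

    # Minimal sketch, assuming the google-cloud-resource-manager package is
    # installed (pip install google-cloud-resource-manager) and that
    # application default credentials are configured. All IDs are placeholders.
    from google.cloud import resourcemanager_v3

    client = resourcemanager_v3.ProjectsClient()
    project = resourcemanager_v3.Project(
        project_id="customer-data-project",    # hypothetical project ID
        display_name="customer-data-project",
        parent="organizations/123456789012",   # your parent organization
    )
    # create_project returns a long-running operation; result() blocks until
    # the project exists.
    new_project = client.create_project(project=project).result(timeout=120)
    print(new_project.name)  # for example, projects/415104041262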

1.3. Creating a Google Cloud Identity and Access Management role

A custom Identity and Access Management (IAM) role for cost management gives access to specific cost related resources required to enable a Google Cloud Platform integration and prohibits access to other resources.

Prerequisites

  • Access to Google Cloud Console with these permissions:

    • resourcemanager.projects.get
    • resourcemanager.projects.getIamPolicy
    • resourcemanager.projects.setIamPolicy
  • Google Cloud project

Procedure

  1. In the Google Cloud Console, click IAM & Admin > Roles.
  2. Select the cost management project from the dropdown in the menu bar.
  3. Click + Create role.
  4. Enter a Title, Description and ID for the role. In this example, use customer-data-role.
  5. Click + ADD PERMISSIONS.
  6. Use the Enter property name or value field to search and select these four permissions for your custom role:

    • bigquery.jobs.create
    • bigquery.tables.getData
    • bigquery.tables.get
    • bigquery.tables.list
  7. Click ADD.
  8. Click CREATE.
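
If you prefer to create the role programmatically, the IAM REST API exposes the same operation through the google-api-python-client library. The sketch below assumes a placeholder project ID; note that custom role IDs permit only letters, numbers, periods, and underscores, so it uses customer_data_role as the ID.

    # Minimal sketch, assuming google-api-python-client is installed
    # (pip install google-api-python-client) and application default
    # credentials are configured. The project ID is a placeholder.
    from googleapiclient import discovery

    iam = discovery.build("iam", "v1")
    iam.projects().roles().create(
        parent="projects/customer-data-project",
        body={
            "roleId": "customer_data_role",  # hyphens are not allowed in role IDs
            "role": {
                "title": "customer-data-role",
                "description": "Read access for cost management exports",
                "includedPermissions": [
                    "bigquery.jobs.create",
                    "bigquery.tables.getData",
                    "bigquery.tables.get",
                    "bigquery.tables.list",
                ],
                "stage": "GA",
            },
        },
    ).execute()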

1.4. Adding a billing service account member to your Google Cloud project

You must create a billing service account member that can export cost reports to Red Hat Hybrid Cloud Console in your project.

Prerequisites

  • Access to Google Cloud Console with these permissions:

    • resourcemanager.projects.get
    • resourcemanager.projects.getIamPolicy
    • resourcemanager.projects.setIamPolicy
  • Google Cloud project
  • A cost management Identity and Access Management (IAM) role

Procedure

  1. In the Google Cloud Console, click IAM & Admin > IAM.
  2. Select the cost management project from the dropdown in the menu bar.
  3. Click ADD.
  4. Paste the following Red Hat service account address into the New principals field:

    billing-export@red-hat-cost-management.iam.gserviceaccount.com
  5. In the Assign roles section, assign the IAM role you created. In this example, use customer-data-role.
  6. Click SAVE.

Verification steps

  1. Navigate to IAM & Admin > IAM.
  2. Verify the new member is present with the correct role.
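
To grant the same binding from code, you can read and update the project IAM policy with the Resource Manager Python client. This sketch assumes a placeholder project ID and the custom role ID customer_data_role; unlike the console, the API requires the serviceAccount: prefix on the member.

    # Minimal sketch, assuming google-cloud-resource-manager is installed.
    from google.cloud import resourcemanager_v3

    PROJECT = "projects/customer-data-project"  # hypothetical project ID
    MEMBER = ("serviceAccount:"
              "billing-export@red-hat-cost-management.iam.gserviceaccount.com")
    ROLE = "projects/customer-data-project/roles/customer_data_role"

    client = resourcemanager_v3.ProjectsClient()
    policy = client.get_iam_policy(request={"resource": PROJECT})
    policy.bindings.add(role=ROLE, members=[MEMBER])  # append the new binding
    client.set_iam_policy(request={"resource": PROJECT, "policy": policy})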

1.5. Creating a Google Cloud BigQuery dataset

Create a BigQuery dataset to collect and store the billing data for cost management.

Prerequisites

  • Access to Google Cloud Console with bigquery.datasets.create permission
  • Google Cloud project

Procedure

  1. In the Google Cloud Console, click Big Data > BigQuery.
  2. Select the cost management project in the Explorer panel.
  3. Click CREATE DATASET.
  4. Enter a name for your dataset in the Dataset ID field. In this example, use CustomerData.
  5. Click CREATE DATASET.
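
The dataset can also be created with the google-cloud-bigquery Python client. A minimal sketch, assuming a placeholder project ID:

    # Minimal sketch, assuming google-cloud-bigquery is installed.
    from google.cloud import bigquery

    client = bigquery.Client(project="customer-data-project")  # placeholder ID
    dataset = client.create_dataset("CustomerData", exists_ok=True)
    print(dataset.full_dataset_id)  # for example, customer-data-project:CustomerData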

1.6. Exporting Google Cloud billing data to BigQuery

Enabling a billing export to BigQuery sends your Google Cloud billing data (such as usage, cost estimates, and pricing data) automatically to the cost management BigQuery dataset.

Prerequisites

  • Google Cloud project
  • A BigQuery dataset for cost management

Procedure

  1. In the Google Cloud Console, click Billing > Billing export.
  2. Click the Billing export tab.
  3. Click EDIT SETTINGS in the Detailed usage cost section.
  4. From the dropdown menus, select the cost management Project and the Billing export dataset that you created.
  5. Click SAVE.

Verification steps

  1. Verify that the Detailed usage cost section shows Enabled with a checkmark, and that the correct Project name and Dataset name appear.

1.6.1. Viewing billing tables in BigQuery

You may want to review the metrics collected and sent to cost management. This can also assist with troubleshooting incorrect or missing data in cost management.

Note

Google may take several hours to export billing data to your BigQuery dataset.

Prerequisites

  • Access to Google Cloud Console with the bigquery.dataViewer role

Procedure

  1. Navigate to Big Data > BigQuery in the Google Cloud Console.
  2. Select the cost management project in the Explorer panel.
  3. Click the gcp_billing_export_v1_xxxxxx_xxxxxx_xxxxxx table under the cost management dataset.
  4. Click the Preview tab to view the metrics.
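
You can run the same preview from the google-cloud-bigquery Python client. A sketch, assuming placeholder project and dataset names; the export table suffix is specific to your billing account:

    # Minimal sketch, assuming google-cloud-bigquery is installed.
    from google.cloud import bigquery

    client = bigquery.Client(project="customer-data-project")  # placeholder ID
    table = ("customer-data-project.CustomerData."
             "gcp_billing_export_v1_xxxxxx_xxxxxx_xxxxxx")
    for row in client.list_rows(table, max_results=10):
        print(row["service"], row["cost"], row["currency"])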

Chapter 2. Integrating filtered Google Cloud data into cost management

To share only a subset of your billing information with Red Hat, you can configure a function script in Google Cloud that copies your billing exports to object storage buckets and filters the data.

Note

To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.

To configure your Google Cloud account to be a cost management integration:

  • Create a Google Cloud project for your cost management data.
  • Create a bucket for filtered reports.
  • Add a billing service account member with the correct role to export your data to cost management.
  • Create a BigQuery dataset to contain the cost data.
  • Create a billing export that sends the cost management data to your BigQuery dataset.

Because you will complete some of the following steps in the Google Cloud Console, and some steps in the cost management user interface, keep both applications open in a web browser.

Note

Because third-party products and documentation can change, instructions for configuring the third-party integrations provided are general and correct at the time of publishing. For the most up-to-date information, see the Google Cloud Platform documentation.

Add your Google Cloud integration to cost management from the Integrations page.

2.1. Adding your Google Cloud account as an integration

You can add your Google Cloud account as an integration. After adding a Google Cloud integration, the cost management application processes the cost and usage data from your Google Cloud account and makes it viewable.

Prerequisites

  • To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.

Procedure

  1. From Red Hat Hybrid Cloud Console, click the Settings icon > Integrations.
  2. On the Settings page, in the Cloud tab, click Add integration.
  3. In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click Next.
  4. Enter a name for your integration. Click Next.
  5. In the Select application step, select Cost management and click Next.

2.2. Creating a Google Cloud project

Create a Google Cloud project to gather and send your cost reports to cost management.

Prerequisites

  • Access to Google Cloud Console with resourcemanager.projects.create permission

Procedure

  1. In the Google Cloud Console, click IAM & Admin > Create a Project.
  2. Enter a Project name in the new page that appears and select your billing account.
  3. Select the Organization.
  4. Enter the parent organization in the Location box.
  5. Click Create.
  6. In the cost management Add a cloud integration wizard, on the Project page, enter your Project ID.
  7. To configure Google Cloud to filter your data before it sends the data to Red Hat, select I wish to manually customize the data set sent to cost management, and then click Next.

Verification steps

  1. Navigate to the Google Cloud Console Dashboard.
  2. Verify that the project is in the menu bar.

2.3. Creating a Google Cloud bucket

Create a bucket for filtered reports that you will create later. Buckets are containers that store data.

Procedure

  1. In the Google Cloud Console, click Buckets.
  2. Click Create bucket.
  3. Enter your bucket information. Name your bucket. In this example, use customer-data.
  4. Click Create, then click Confirm in the confirmation dialog.
  5. In the cost management Add a cloud integration wizard, on the Create cloud storage bucket page, enter your Cloud storage bucket name.
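
If you script your setup, the bucket can also be created with the google-cloud-storage Python client. A sketch, assuming a placeholder project ID and location:

    # Minimal sketch, assuming google-cloud-storage is installed.
    from google.cloud import storage

    client = storage.Client(project="customer-data-project")  # placeholder ID
    bucket = client.create_bucket("customer-data", location="US")
    print(bucket.name)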

Additional resources

  • For additional information about creating buckets, see the Google Cloud documentation on Creating buckets.

2.4. Creating a Google Cloud Identity and Access Management role

A custom Identity and Access Management (IAM) role for cost management gives access to specific cost related resources required to enable a Google Cloud Platform integration and prohibits access to other resources.

Prerequisites

  • Access to Google Cloud Console with these permissions:

    • resourcemanager.projects.get
    • resourcemanager.projects.getIamPolicy
    • resourcemanager.projects.setIamPolicy
  • Google Cloud project

Procedure

  1. In the Google Cloud Console, click IAM & Admin > Roles.
  2. Select the cost management project from the dropdown in the menu bar.
  3. Click + Create role.
  4. Enter a Title, Description and ID for the role. In this example, use customer-data-role.
  5. Click + ADD PERMISSIONS.
  6. Use the Enter property name or value field to search and select these three permissions for your custom role:

    • storage.objects.get
    • storage.objects.list
    • storage.buckets.get
  7. Click ADD.
  8. Click CREATE.

2.5. Adding a billing service account member to your Google Cloud project

You must create a billing service account member that can export cost reports to Red Hat Hybrid Cloud Console in your project.

Prerequisites

  • Access to Google Cloud Console with these permissions:

    • resourcemanager.projects.get
    • resourcemanager.projects.getIamPolicy
    • resourcemanager.projects.setIamPolicy
  • Google Cloud project
  • A cost management Identity and Access Management (IAM) role

Procedure

  1. In the Google Cloud Console, click IAM & Admin > IAM.
  2. Select the cost management project from the dropdown in the menu bar.
  3. Click ADD.
  4. Paste the following Red Hat service account address into the New principals field:

    billing-export@red-hat-cost-management.iam.gserviceaccount.com
  5. In the Assign roles section, assign the IAM role you created. In this example, use customer-data-role.
  6. Click SAVE.

Verification steps

  1. Navigate to IAM & Admin > IAM.
  2. Verify the new member is present with the correct role.

2.6. Creating a Google Cloud BigQuery dataset

Create a BigQuery dataset to collect and store the billing data for cost management.

Prerequisites

  • Access to Google Cloud Console with bigquery.datasets.create permission
  • Google Cloud project

Procedure

  1. In the Google Cloud Console, click Big Data > BigQuery.
  2. Select the cost management project in the Explorer panel.
  3. Click CREATE DATASET.
  4. Enter a name for your dataset in the Dataset ID field. In this example, use CustomerFilteredData.
  5. Click CREATE DATASET.

2.7. Exporting Google Cloud billing data to BigQuery

Enabling a billing export to BigQuery sends your Google Cloud billing data (such as usage, cost estimates, and pricing data) automatically to the cost management BigQuery dataset.

Prerequisites

  • Google Cloud project
  • A BigQuery dataset for cost management

Procedure

  1. In the Google Cloud Console, click Billing > Billing export.
  2. Click the Billing export tab.
  3. Click EDIT SETTINGS in the Detailed usage cost section.
  4. From the dropdown menus, select the cost management Project and the Billing export dataset that you created.
  5. Click SAVE.

Verification steps

  1. Verify that the Detailed usage cost section shows Enabled with a checkmark, and that the correct Project name and Dataset name appear.

2.8. Creating a function to post filtered data to your storage bucket

Create a function that filters your data and adds it to the storage account that you created to share with Red Hat. You can use the example Python script to gather the cost data from your cost exports related to your Red Hat expenses and add it to the storage account. This script filters the cost data you created with BigQuery, removes non-Red Hat information, then creates .csv files, stores them in the bucket you created, and sends the data to Red Hat.

Procedure

  1. In the Google Cloud Console, search for secret and select the Secret Manager result to set up a secret that authenticates your function to Red Hat without storing your credentials in your function.

    1. On the Secret Manager page, click Create Secret.
    2. Name your secret, add your Red Hat username, and click Create Secret.
    3. Repeat this process to save a secret for your Red Hat password.
  2. In the Google Cloud Console search bar, search for functions and select the Cloud Functions result.
  3. On the Cloud Functions page, click Create function.
  4. Name the function. In this example, use customer-data-function.
  5. In the Trigger section, click Save to accept the HTTP Trigger type.
  6. In the Runtime, build, connections and security settings section, click the Security and image repository tab, reference the secrets you created, click Done, and then click Next.
  7. On the Cloud Functions Code page, set the runtime to Python 3.9.
  8. Open the requirements.txt file. Paste the following lines to the end of the file.

    requests
    google-cloud-bigquery
    google-cloud-storage
    python-dateutil
  9. Open the main.py file.

    1. Set the Entry Point to get_filtered_data.
    2. Paste the following Python script. Change the values in the section marked # Required vars to update to the values for your environment.

      import csv
      import datetime
      import uuid
      import os
      import requests
      from google.cloud import bigquery
      from google.cloud import storage
      from itertools import islice
      from dateutil.relativedelta import relativedelta
      
      query_range = 5
      now = datetime.datetime.now()
      delta = now - relativedelta(days=query_range)
      year = now.strftime("%Y")
      month = now.strftime("%m")
      day = now.strftime("%d")
      report_prefix=f"{year}/{month}/{day}/{uuid.uuid4()}"
      
      # Required vars to update
      USER = os.getenv('username')         # Cost management username
      PASS = os.getenv('password')         # Cost management password
      INTEGRATION_ID = "<integration_id>"  # Cost management integration_id
      BUCKET = "<bucket>"                  # Filtered data GCP Bucket
      PROJECT_ID = "<project_id>"          # Your project ID
      DATASET = "<dataset>"                # Your dataset name
      TABLE_ID = "<table_id>"              # Your table ID
      
      gcp_big_query_columns = [
          "billing_account_id",
          "service.id",
          "service.description",
          "sku.id",
          "sku.description",
          "usage_start_time",
          "usage_end_time",
          "project.id",
          "project.name",
          "project.labels",
          "project.ancestry_numbers",
          "labels",
          "system_labels",
          "location.location",
          "location.country",
          "location.region",
          "location.zone",
          "export_time",
          "cost",
          "currency",
          "currency_conversion_rate",
          "usage.amount",
          "usage.unit",
          "usage.amount_in_pricing_units",
          "usage.pricing_unit",
          "credits",
          "invoice.month",
          "cost_type",
          "resource.name",
          "resource.global_name",
      ]
      table_name = ".".join([PROJECT_ID, DATASET, TABLE_ID])
      
      BATCH_SIZE = 200000
      
      def batch(iterable, n):
          """Yields successive n-sized chunks from iterable"""
          it = iter(iterable)
          while chunk := tuple(islice(it, n)):
              yield chunk
      
      def build_query_select_statement():
          """Helper to build query select statement."""
          columns_list = gcp_big_query_columns.copy()
          columns_list = [
              f"TO_JSON_STRING({col})" if col in ("labels", "system_labels", "project.labels") else col
              for col in columns_list
          ]
          columns_list.append("DATE(_PARTITIONTIME) as partition_date")
          return ",".join(columns_list)
      
      def create_reports(query_date):
          query = f"SELECT {build_query_select_statement()} FROM {table_name} WHERE DATE(_PARTITIONTIME) = {query_date} AND sku.description LIKE '%RedHat%' OR sku.description LIKE '%Red Hat%' OR  service.description LIKE '%Red Hat%' ORDER BY usage_start_time"
          client = bigquery.Client()
          query_job = client.query(query).result()
          column_list = gcp_big_query_columns.copy()
          column_list.append("partition_date")
          daily_files = []
          storage_client = storage.Client()
          bucket = storage_client.bucket(BUCKET)
          for i, rows in enumerate(batch(query_job, BATCH_SIZE)):
              csv_file = f"{report_prefix}/{query_date}_part_{str(i)}.csv"
              daily_files.append(csv_file)
              blob = bucket.blob(csv_file)
              with blob.open(mode='w') as f:
                  writer = csv.writer(f)
                  writer.writerow(column_list)
                  writer.writerows(rows)
          return daily_files
      
      def post_data(files_list):
          # Post CSV's to console.redhat.com API
          url = "https://console.redhat.com/api/cost-management/v1/ingress/reports/"
          json_data = {"source": INTEGRATION_ID, "reports_list": files_list, "bill_year": year, "bill_month": month}
          resp = requests.post(url, json=json_data, auth=(USER, PASS))
          return resp
      
      def get_filtered_data(request):
          files_list = []
          query_dates = [delta + datetime.timedelta(days=x) for x in range(query_range)]
          for query_date in query_dates:
              files_list += create_reports(query_date.date())
          resp = post_data(files_list)
          return f'Files posted! {resp}'
  10. Click Deploy.
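
Before scheduling the function, you can send it a test request. The sketch below is hypothetical: it assumes you copied the trigger URL from the function's Trigger tab, and that your caller presents an identity token, which is required unless you allowed unauthenticated invocations.

    # Hypothetical smoke test for the deployed function; both values are
    # placeholders.
    import requests

    TRIGGER_URL = "https://<region>-<project_id>.cloudfunctions.net/customer-data-function"
    ID_TOKEN = "<identity_token>"  # for example, from: gcloud auth print-identity-token

    resp = requests.post(
        TRIGGER_URL,
        json={"name": "manual-test"},
        headers={"Authorization": f"Bearer {ID_TOKEN}"},
    )
    print(resp.status_code, resp.text)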

2.9. Triggering your function to post filtered data to your storage bucket

Create a scheduler job to run the function you created to send filtered data to Red Hat on a schedule.

Procedure

  1. Copy the Trigger URL for the function you created to post the cost reports. You will need to add it to the Google Cloud Scheduler.

    1. In the Google Cloud Console, search for functions and select the Cloud Functions result.
    2. On the Cloud Functions page, select your function, and click the Trigger tab.
    3. In the HTTP section, click Copy to clipboard.
  2. Create the scheduler job. In the Google Cloud Console, search for cloud scheduler and select the Cloud Scheduler result.
  3. Click Create job.

    1. Name your scheduler job. In this example, use CustomerFilteredDataSchedule.
    2. In the Frequency field, set the cron expression for when you want the function to run. In this example, use 0 9 * * * to run the function daily at 9 AM.
    3. Set the timezone and click Continue.
  4. Configure the execution on the next page.

    1. In the Target type field, select HTTP.
    2. In the URL field, paste the Trigger URL you copied.
    3. In the body field, paste the following code that passes into the function to trigger it.

      {"name": "Scheduler"}
    4. In the Auth header field, select Add OIDC token.
    5. Click the Service account field and click Create to create a service account and role for the scheduler job.
  5. In the Service account details step, name your service account. In this example, use scheduler-service-account. Accept the default Service account ID and click Create and Continue.

    1. In the Grant this service account access to project section, select two roles for your account.
    2. Search for and select Cloud Scheduler Job Runner, then click ADD ANOTHER ROLE and select Cloud Functions Invoker.
    3. Click Continue.
    4. Click Done to finish creating the service account.
  6. Return to the scheduler job that you were creating.
  7. On the Configure the execution page, click the Service account field and select the scheduler-service-account you just created.
  8. Click Continue and then click Create.
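
As an alternative to the console steps above, the same job can be created with the google-cloud-scheduler Python client. A sketch with placeholder project, location, trigger URL, and service account values:

    # Minimal sketch, assuming google-cloud-scheduler is installed
    # (pip install google-cloud-scheduler). All names are placeholders.
    from google.cloud import scheduler_v1

    parent = "projects/customer-data-project/locations/us-central1"
    job = scheduler_v1.Job(
        name=f"{parent}/jobs/CustomerFilteredDataSchedule",
        schedule="0 9 * * *",  # daily at 9 AM, matching the console example
        time_zone="Etc/UTC",
        http_target=scheduler_v1.HttpTarget(
            uri="https://<trigger-url>",  # the trigger URL you copied
            http_method=scheduler_v1.HttpMethod.POST,
            body=b'{"name": "Scheduler"}',
            oidc_token=scheduler_v1.OidcToken(
                service_account_email=(
                    "scheduler-service-account@"
                    "customer-data-project.iam.gserviceaccount.com"
                )
            ),
        ),
    )
    scheduler_v1.CloudSchedulerClient().create_job(parent=parent, job=job)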

Chapter 3. Next steps for managing your costs

After adding your OpenShift Container Platform and Google Cloud integrations, on the cost management Overview page, your cost data is sorted into OpenShift and Infrastructure tabs. Select Perspective to toggle through different views of your cost data.

You can also use the global navigation menu to view additional details about your costs by cloud provider.

3.1. Limiting access to cost management resources

After you add and configure integrations in cost management, you can limit access to cost data and resources.

You might not want users to have access to all of your cost data. Instead, you can grant users access only to data that is specific to their projects or organizations. With role-based access control, you can limit the visibility of resources in cost management reports. For example, you can restrict a user’s view to only AWS integrations, rather than the entire environment.

To learn how to limit access, see the more in-depth guide Limiting access to cost management resources.

3.2. Configuring tagging for your integrations

The cost management application tracks cloud and infrastructure costs with tags. Tags are also known as labels in OpenShift.

You can refine tags in cost management to filter and attribute resources, organize your resources by cost, and allocate costs to different parts of your cloud infrastructure.

Important

You can only configure tags and labels directly on an integration. You can choose the tags that you activate in cost management; however, you cannot edit tags and labels in the cost management application.

To learn more about the following topics, see Managing cost data using tagging:

  • Planning your tagging strategy to organize your view of cost data
  • Understanding how cost management associates tags
  • Configuring tags and labels on your integrations

3.3. Configuring cost models to accurately report costs

Now that you configured your integrations to collect cost and usage data in cost management, you can configure cost models to associate prices to metrics and usage.

A cost model is a framework that uses raw costs and metrics to define calculations for the costs in cost management. You can record, categorize, and distribute the costs that the cost model generates to specific customers, business units, or projects.

In Cost Models, you can complete the following tasks:

  • Classifying your costs as infrastructure or supplementary costs
  • Capturing monthly costs for OpenShift nodes and clusters
  • Applying a markup to account for additional support costs

To learn how to configure a cost model, see Using cost models.

3.4. Visualizing your costs with Cost Explorer

Use cost management Cost Explorer to create custom graphs of time-scaled cost and usage information and ultimately better visualize and interpret your costs.

To learn more about the following topics, see Visualizing your costs using Cost Explorer:

  • Using Cost Explorer to identify abnormal events
  • Understanding how your cost data changes over time
  • Creating custom bar charts of your cost and usage data
  • Exporting custom cost data tables

Providing feedback on Red Hat documentation

If you find an error or have a suggestion for improving this documentation, open an issue in the cost management Jira board and add the Documentation label.

We appreciate your feedback!

Legal Notice

Copyright © 2024 Red Hat, Inc.
The text of and illustrations in this document are licensed by Red Hat under a Creative Commons Attribution–Share Alike 3.0 Unported license ("CC-BY-SA"). An explanation of CC-BY-SA is available at http://creativecommons.org/licenses/by-sa/3.0/. In accordance with CC-BY-SA, if you distribute this document or an adaptation of it, you must provide the URL for the original version.
Red Hat, as the licensor of this document, waives the right to enforce, and agrees not to assert, Section 4d of CC-BY-SA to the fullest extent permitted by applicable law.
Red Hat, Red Hat Enterprise Linux, the Shadowman logo, the Red Hat logo, JBoss, OpenShift, Fedora, the Infinity logo, and RHCE are trademarks of Red Hat, Inc., registered in the United States and other countries.
Linux® is the registered trademark of Linus Torvalds in the United States and other countries.
Java® is a registered trademark of Oracle and/or its affiliates.
XFS® is a trademark of Silicon Graphics International Corp. or its subsidiaries in the United States and/or other countries.
MySQL® is a registered trademark of MySQL AB in the United States, the European Union and other countries.
Node.js® is an official trademark of Joyent. Red Hat is not formally related to or endorsed by the official Joyent Node.js open source or commercial project.
The OpenStack® Word Mark and OpenStack logo are either registered trademarks/service marks or trademarks/service marks of the OpenStack Foundation, in the United States and other countries and are used with the OpenStack Foundation's permission. We are not affiliated with, endorsed or sponsored by the OpenStack Foundation, or the OpenStack community.
All other trademarks are the property of their respective owners.