This document describes how to collect Google Cloud SQL logs with Google Cloud Logging, send them to a Cloud Pub/Sub topic with an HTTP push forwarder, and route them to a Cloud Storage bucket for ingestion into Chronicle.
Google Cloud SQL
Google Cloud SQL is a fully managed database service that makes it easy to set up, maintain, manage, and administer your SQL databases in the cloud.
Get metrics from Google Cloud SQL to:
- Visualize the performance of your Cloud SQL databases.
- Correlate the performance of your Cloud SQL databases with your applications.
Installation
Metric Collection
If you haven’t already, set up the Google Cloud Platform integration first. There are no other installation steps.
Note: Use the following reference link to set up the GCP integration.
Link: https://docs.datadoghq.com/integrations/google_cloud_platform/
Configuration
To collect custom Cloud SQL labels as tags, enable the cloud asset inventory permission.
Log Collection
Google Cloud SQL logs are collected with Google Cloud Logging and sent to a Cloud Pub/Sub topic with an HTTP push forwarder. If you haven't already, set up a Cloud Pub/Sub topic with an HTTP push forwarder.
Note: Use the following reference link to set up the Cloud Pub/Sub.
Link: https://docs.datadoghq.com/integrations/google_cloud_platform/#log-collection
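If you prefer to script this step, the following is a minimal sketch using the google-cloud-pubsub Python client. The project ID, topic and subscription names, and the push endpoint URL are placeholder assumptions, not values prescribed by this guide; use the forwarder endpoint from the reference linked above.

    # Minimal sketch: create a topic and a push subscription that forwards
    # each message to an HTTP push endpoint. All names are placeholders.
    from google.cloud import pubsub_v1

    project_id = "my-project"                      # placeholder
    topic_id = "export-cloudsql-logs"              # placeholder
    subscription_id = "export-cloudsql-logs-push"  # placeholder
    push_endpoint = "https://example.com/intake"   # placeholder forwarder endpoint

    publisher = pubsub_v1.PublisherClient()
    subscriber = pubsub_v1.SubscriberClient()

    topic_path = publisher.topic_path(project_id, topic_id)
    subscription_path = subscriber.subscription_path(project_id, subscription_id)

    # Create the topic that the log sink will publish to.
    publisher.create_topic(request={"name": topic_path})

    # Create a push subscription that forwards messages to the HTTP endpoint.
    subscriber.create_subscription(
        request={
            "name": subscription_path,
            "topic": topic_path,
            "push_config": {"push_endpoint": push_endpoint},
        }
    )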
Once this is done, export your Google Cloud SQL logs from Google Cloud Logging to the pub/sub:
- Go to the Google Cloud Logging page and filter Google Cloud SQL logs.
- Click Create Sink and name the sink accordingly.
- Choose Cloud Pub/Sub as the destination and select the pub/sub that was created for that purpose. Note: The pub/sub can be located in a different project.
- Click Create and wait for the confirmation message to show up.
To route these logs to a Cloud Storage bucket for Chronicle ingestion, also create a logs routing sink. In the Create logs routing sink window, fill in the following details:
- Under Sink details, enter a name and description.
- Click Next
- Under Sink destination, in Select sink service, select Cloud Storage bucket, then select an existing bucket or create a new one
- Click Next
- Under Choose logs to include in sink, a default inclusion filter is populated once you select the Cloud Storage bucket
- Click Next
- (Optional) Under Choose logs to filter out of sink, add exclusion filters for any logs you do not want to sink
- Click Create Sink.
All matching logs are now routed to and stored in the Cloud Storage bucket.
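As an alternative to the console steps above, the routing sink can be created programmatically. The following is a minimal sketch using the google-cloud-logging Python client, assuming a placeholder project ID and sink name, an existing bucket named my-cloudsql-log-bucket, and the standard Cloud SQL resource filter:

    from google.cloud import logging

    # Placeholder assumptions: project ID, sink name, and destination bucket.
    project_id = "my-project"
    sink_name = "cloudsql-logs-to-gcs"
    destination = "storage.googleapis.com/my-cloudsql-log-bucket"

    # Filter that matches Cloud SQL database logs.
    log_filter = 'resource.type="cloudsql_database"'

    client = logging.Client(project=project_id)
    sink = client.sink(sink_name, filter_=log_filter, destination=destination)

    # unique_writer_identity returns a dedicated service account; grant that
    # account write access (for example, Storage Object Creator) on the bucket
    # so the sink can deliver logs.
    sink.create(unique_writer_identity=True)
    print("Grant bucket write access to:", sink.writer_identity)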
Viewing logs in Cloud Storage Bucket
To view the GCP Cloud SQL logs stored in the Cloud Storage bucket, you must first grant Chronicle access. Add the email address 8911409095528497-0 account@partnercontent.gserviceaccount.com to the permissions of the relevant Google Cloud Storage object(s). Perform the following actions from the Cloud Storage section in the Google Cloud Console.
- To grant read permission to a specific file, select Edit access on that file and grant the email address Reader access. This is only possible if you have not enabled uniform bucket-level access.
- If you configure the feed to delete source files (see below for how to do this), you must add the email address as a principal on your bucket and grant it the IAM role of Storage Object Admin.
- To grant read permission to multiple files, grant access at the bucket level: add the above email address as a principal on your storage bucket and grant it the IAM role of Storage Object Viewer.
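The bucket-level grant in the last item can also be done programmatically. A minimal sketch using the google-cloud-storage Python client; the bucket name is a placeholder, and the member string must use the Chronicle service account address quoted above:

    from google.cloud import storage

    # Placeholder assumptions: bucket name; substitute the Chronicle service
    # account email listed above for the <...> value.
    bucket_name = "my-cloudsql-log-bucket"
    member = "serviceAccount:<chronicle-service-account-email>"

    client = storage.Client()
    bucket = client.bucket(bucket_name)

    # Append a binding granting Storage Object Viewer to the account.
    # Use roles/storage.objectAdmin instead if the feed is configured to
    # delete source files.
    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append(
        {"role": "roles/storage.objectViewer", "members": {member}}
    )
    bucket.set_iam_policy(policy)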
To grant permissions for multiple files in a single bucket at once, follow these steps:
- Click the bucket for which you want to enable permissions
- Click sqladmin.googleapis.com > GCP Cloud SQL
- A bucket details window appears. In the Permissions tab, click ADD
- In the New principals field, add the email address 8911409095528497-0 account@partnercontent.gserviceaccount.com
- In Role, select "Storage Object Viewer" from the dropdown menu
- Click Save
A gsutil URL is generated for the storage bucket on which you have enabled permissions.
- To ingest GCP Cloud SQL logs into Chronicle, copy the gsutil URL from the Configuration tab of the storage bucket and paste it into the input parameters of a Chronicle feed, as described below.
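To double-check access and the URI before configuring the feed, a minimal sketch (the bucket name is a placeholder) that lists a few objects and prints the gsutil-style URI to paste into the feed's input parameters:

    from google.cloud import storage

    bucket_name = "my-cloudsql-log-bucket"  # placeholder

    client = storage.Client()

    # List a few objects to confirm the logs are visible with the current credentials.
    for blob in client.list_blobs(bucket_name, max_results=5):
        print(f"gs://{bucket_name}/{blob.name}")

    # The bucket-level gsutil URI used as the feed's source.
    print(f"gs://{bucket_name}/")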
Configuring a feed in Chronicle
To configure a feed in Chronicle:
- From your Chronicle instance page, select Settings from the main menu at the top left of your screen
- Click Feeds to see the data feeds that you have configured as well as the default feeds that Google provides
- From the Feeds page, click ADD NEW at the top of the screen
- The ADD FEED window appears. In the Set Properties tab, select Google Cloud Storage as the SOURCE TYPE from the dropdown menu
- Select GCP Cloud SQL as the Log Type from the dropdown menu
- Click Next
- In the Input Parameters tab, paste the gsutil URL that you copied from the Configuration tab of the storage bucket
- Click Next
- In the Finalize tab, click Submit