VPC Flow Logs records a sample of network flows sent from and received by VM instances. These logs can be used for network monitoring, forensics, real-time security analysis, and expense optimization.
You can view flow logs in Cloud Logging, and you can export logs to any destination that Cloud Logging export supports.
Flow logs are aggregated by connection from Compute Engine VMs and exported in real time.
Ingesting VPC network logs into Chronicle
To import VPC flow logs into Chronicle, complete the following steps:
1. Log in to your Google Cloud account using your credentials.
2. On the Welcome page, click the VPC Networks icon.
3. On the VPC network page, click the default network. A subnet page appears.
4. Select all subnets and click Flow Logs --> Configure from the dropdown menu. A small window appears.
5. Select an Aggregation Interval from the following options:
   - 5 Sec
   - 30 Sec
   - 1 Min
   - 5 Min
   - 10 Min
   - 15 Min
6. Enter a Sample Rate (for example, 50%) and click Save.
After saving, VPC flow logs are enabled and begin appearing in Cloud Logging.
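The console settings above correspond to fields on the subnet's logConfig in the Compute Engine REST API. The following is a minimal sketch of that mapping under the 5 Sec / 50% example, illustrative only, not a complete API request:

```python
import json

# Minimal sketch of the subnet logConfig fields that the console settings
# above map to in the Compute Engine REST API (subnetworks.patch).
# Illustrative payload only, not a complete API request.
log_config = {
    "enable": True,                           # turn flow logs on
    "aggregationInterval": "INTERVAL_5_SEC",  # "5 Sec" in the console
    "flowSampling": 0.5,                      # a 50% sample rate
}

print(json.dumps(log_config, indent=2))
```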
7. Next, type "Logging" in the search bar at the top and press Enter. By default, you are taken to the Logs Explorer.
8. In the Logs Explorer, you can see logs from multiple sources. Filter the logs by choosing VPC_flows under Log Name at the top right corner of the screen, and click Apply.
All VPC flow logs are now displayed on the page.
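Selecting VPC_flows under Log Name is equivalent to applying a logName filter in the Logs Explorer query box. A sketch of that filter string; the project ID here is a hypothetical placeholder:

```python
# Build the Logs Explorer filter that the VPC_flows selection applies.
# "my-project" is a hypothetical project ID.
project_id = "my-project"
log_filter = (
    f'logName="projects/{project_id}/logs/'
    'compute.googleapis.com%2Fvpc_flows"'
)
print(log_filter)
```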
9. Click More Actions and select Create Sink from the dropdown menu. You are taken to the Log Router screen.
10. In the Create logs routing sink window, fill in the following details:
   - Under Sink details, enter a Name and Description (for example, test_gcp_vcp_flows and GCP Flows).
   - Click Next.
   - Under Sink destination, in Select sink service, select Cloud Storage bucket, and in Cloud Storage bucket, select test-vpc-flow-logs or create a new bucket.
   - Click Next.
   - Under Choose logs to include in sink, a default log filter is populated once you select a Cloud Storage bucket.
   - Click Next.
   - (Optional) Under Choose logs to filter out of sink, choose the logs that you do not want to route.
   - Click Create Sink. All matching logs are routed to and stored in the Cloud Storage bucket.
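The sink created above can be pictured as a LogSink resource in Cloud Logging v2 API terms (name, destination, filter). A sketch; the filter shown is an assumption, since the console populates its own default filter when you pick the bucket:

```python
# Sketch of the LogSink resource created by the steps above, using Cloud
# Logging v2 API field names. The filter value is an assumption; the
# console generates its own default filter.
sink = {
    "name": "test_gcp_vcp_flows",
    "description": "GCP Flows",
    # Cloud Storage destinations use the storage.googleapis.com/<bucket> form.
    "destination": "storage.googleapis.com/test-vpc-flow-logs",
    "filter": 'logName:"compute.googleapis.com%2Fvpc_flows"',
}
print(sink["destination"])
```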
Viewing VPC logs in a Cloud Storage bucket
To view the VPC logs that are routed to the Cloud Storage bucket, you must first grant Chronicle access. Add the email address 8911409095528497-0-account@partnercontent.gserviceaccount.com to the permissions of the relevant Cloud Storage object(s). You must also perform the following actions from the Cloud Storage section of the Google Cloud console (console.cloud.google.com):
- To grant read permission to a specific file, use Edit access on that file and grant the above email Reader access. This can only be done if you have not enabled uniform bucket-level access.
- If you configure the feed to delete source files (see below for how to do this), you must add the above email as a principal on your bucket and grant it the IAM role of Storage Object Admin.
- To grant read permission to multiple files, you must grant access at the bucket level. Specifically, you must add the above email as a principal on your storage bucket and grant it the IAM role of Storage Object Viewer.
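The bucket-level grant in the last bullet amounts to a single IAM policy binding. A sketch of that binding:

```python
# Sketch of the bucket-level IAM binding that grants the Chronicle service
# account read access to every object in the bucket.
binding = {
    "role": "roles/storage.objectViewer",
    "members": [
        "serviceAccount:8911409095528497-0-account"
        "@partnercontent.gserviceaccount.com",
    ],
}
print(binding["role"])
```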
To grant permissions for multiple files in a single bucket at once, complete the following steps:
1. Click the bucket for which you want to enable permissions (for example, test-vpc-flow-logs).
2. Click compute.googleapis.com --> vpc-flow-logs. A bucket details window appears.
3. In the Permissions tab, click ADD.
4. In Role, select Storage Object Viewer from the dropdown menu.
5. Click Save. A gsutil URL is generated for the storage bucket for which you enabled permissions.
6. To ingest VPC logs into Chronicle, copy the gsutil URL from the Configuration tab of the storage bucket (for example, test-vpc-flow-logs) and paste it into the input parameters of your Chronicle feed.
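Before pasting the copied value into the feed, it can help to confirm it has the expected gs://<bucket> shape. A small illustrative helper (not part of any Google API):

```python
# Small illustrative helper that checks a copied gsutil URL has the
# expected gs://<bucket> shape before it is pasted into the Chronicle feed.
def looks_like_gsutil_url(url: str) -> bool:
    return url.startswith("gs://") and len(url) > len("gs://")

print(looks_like_gsutil_url("gs://test-vpc-flow-logs"))  # True
```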
Configuring a feed in a Chronicle instance
To configure a feed in Chronicle, complete the following steps:
1. From your Chronicle instance page, select Settings from the main menu at the top left of your screen.
2. Click Feeds to see the data feeds that you have configured, as well as the default feeds that Google provides.
3. On the Feeds page, click ADD NEW at the top of the screen. An ADD FEED window appears.
4. In the Set Properties tab, select Google Cloud Storage as the SOURCE TYPE from the dropdown menu.
5. Select GCP VPC Flow as the Log Type from the dropdown menu.
6. In Chronicle Service Account, enter the account address 994980096-0-account@partnercontent.gserviceaccount.com.
7. Click Next.
8. In the Input Parameters tab, paste the gsutil URL (gs://test-vpc-flow-logs) that you copied from the Configuration tab of the storage bucket.
9. Click Next.
10. In the Finalize tab, click Submit.
VPC logs have been ingested successfully.
Sample Logs
The following is a sample log that GCP VPC Flow Logs sends to Chronicle:
{
  "insertId": "3mf4ecfnmz6s0",
  "jsonPayload": {
    "bytes_sent": "8",
    "connection": {
      "dest_ip": "0.0.0.0",
      "dest_port": 443,
      "protocol": 6,
      "src_ip": "0.0.0.0",
      "src_port": 63435
    },
    "dest_location": {
      "asn": 2222,
      "continent": "America",
      "country": "usa",
      "region": "California"
    },
    "end_time": "2023-07-17T11:00:01.271117209Z",
    "packets_sent": "8",
    "reporter": "SRC",
    "src_instance": {
      "project_id": "conto-test-v01",
      "region": "ai-clet2",
      "vm_name": "win2022-srv-maincenter",
      "zone": "ai-clet2-a"
    },
    "src_vpc": {
      "project_id": "conto-test-v01",
      "subnetwork_name": "e1-mgt-private",
      "vpc_name": "conto-test-v01"
    },
    "start_time": "2023-07-17T11:00:01.271117209Z"
  },
  "logName": "projects/conto-test-v01/logs/compute.googleapis.com%2Fvpc_flows",
  "receiveTimestamp": "2023-07-17T11:00:10.609220794Z",
  "resource": {
    "labels": {
      "location": "us-west4-a",
      "project_id": "conto-test-v01",
      "subnetwork_id": "657339034369585128",
      "subnetwork_name": "e2-mnt-private"
    },
    "type": "gce_subnetwork"
  },
  "timestamp": "2023-07-17T11:00:10.609220794Z"
}
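A record like the one above can be parsed to pull out the connection 5-tuple, which is what most detection rules key on. A sketch (only the fields needed here are reproduced):

```python
import json

# Extract the connection 5-tuple from a VPC flow record like the sample
# above (only the fields needed here are reproduced).
record = json.loads("""
{"jsonPayload": {"connection": {"dest_ip": "0.0.0.0", "dest_port": 443,
 "protocol": 6, "src_ip": "0.0.0.0", "src_port": 63435},
 "reporter": "SRC"}}
""")
conn = record["jsonPayload"]["connection"]
five_tuple = (conn["src_ip"], conn["src_port"],
              conn["dest_ip"], conn["dest_port"], conn["protocol"])
print(five_tuple)  # protocol 6 is TCP
```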