This article covers how to push Cloudflare logs to Amazon S3 through the Cloudflare dashboard or the Cloudflare API.
Cloudflare produces data related to all traffic seen across zones, as well as audit logs of actions taken in the Cloudflare console.
Chronicle Data Types
- Cloudflare
Configuration
Note:
- This integration requires a Cloudflare Enterprise subscription.
- It is recommended that either an AWS S3 or a Google Cloud Storage bucket be set up for use with Cloudflare's Logpush. Depending on which is chosen, follow either the AWS S3 Bucket or the GCP GCS Bucket instructions for ingesting Cloudflare logs.
Enable Log Push to Amazon S3
Cloudflare Logpush supports pushing logs directly to Amazon S3 via the Cloudflare dashboard or via the API. Customers that use AWS GovCloud regions should use the S3-compatible endpoint, not the Amazon S3 endpoint.
Manage via Cloudflare Dashboard
To enable the Cloudflare Logpush service to Amazon S3 via the dashboard:
- Log in to the Cloudflare dashboard.
- Select the Enterprise account or domain you want to use with Logpush.
- Go to Analytics & Logs > Logs.
- Click Connect a service. A modal window opens where you will need to complete several steps.
- Select the dataset you want to push to a storage service.
- Select the data fields to include in your logs. Add or remove fields later by modifying your settings in Logs > Logpush.
- Select Amazon S3.
- Enter or select the following destination information:
- Bucket path
- Daily subfolders
- Bucket region
- Encryption constraint in bucket policy
- Under Grant Cloudflare access to upload files to your bucket, make sure your bucket has the required policy (if you have not already added it):
- Copy the JSON policy, then go to your bucket in the Amazon S3 console and paste the policy in Permissions > Bucket Policy and click Save.
- Click Validate access.
- Enter the Ownership token (included in a file or log Cloudflare sends to your provider) and click Prove ownership. To find the ownership token, click the Open button in the Overview tab of the ownership challenge file.
- Click Save and Start Pushing to finish enabling Logpush.
Once connected, Cloudflare lists Amazon S3 as a connected service under Logs > Logpush. Edit or remove connected services from here.
Manage via API
Cloudflare uses AWS Identity and Access Management (IAM) to gain access to your S3 bucket. The Cloudflare IAM user needs the PutObject permission for the bucket.
Logs are written into that bucket as gzipped objects using the S3 Access Control List (ACL) permission Bucket-owner-full-control.
Only roles with Cloudflare Log Share edit permissions can read and configure Logpush jobs, because job configurations may contain sensitive information. Ensure Log Share permissions are enabled before attempting to read or configure a Logpush job.
For illustrative purposes, imagine that you want to store logs in the bucket burritobot, in the logs directory. The S3 URL would then be s3://burritobot/logs.
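When this destination is configured through the Logpush API, the S3 URL becomes the job's destination_conf value, with the bucket region appended as a query parameter. The following is a minimal sketch of that mapping; the bucket name, path, and us-east-1 region are illustrative assumptions.

# Minimal sketch: how the example S3 URL maps to a Logpush destination_conf.
# The bucket name, path, and region below are illustrative assumptions.
bucket = "burritobot"
path = "logs"
region = "us-east-1"  # assumed bucket region
destination_conf = f"s3://{bucket}/{path}?region={region}"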
To enable Logpush to Amazon S3:
- Create an S3 bucket.
Note: Buckets in China regions (cn-north-1, cn-northwest-1) are currently not supported.
- Edit and paste the policy below into S3 > Bucket > Permissions > Bucket Policy, replacing the Resource value with your own bucket path. The AWS Principal is owned by Cloudflare and shouldn't be changed.
{
  "Id": "Policy1506627184792",
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1506627150918",
      "Action": ["s3:PutObject"],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::burritobot/logs/*",
      "Principal": {
        "AWS": ["arn:aws:iam::391854517948:user/cloudflare-logpush"]
      }
    }
  ]
}
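If you prefer to apply this policy programmatically rather than pasting it in the S3 console, a sketch along the following lines can be used. It assumes boto3 is installed and AWS credentials with permission to modify the bucket policy are available; the bucket name and path follow the illustrative burritobot example above.

import json
import boto3

# Assumption: AWS credentials are available via the default credential chain.
s3 = boto3.client("s3")

bucket_policy = {
    "Id": "Policy1506627184792",
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1506627150918",
            "Action": ["s3:PutObject"],
            "Effect": "Allow",
            "Resource": "arn:aws:s3:::burritobot/logs/*",  # replace with your bucket path
            "Principal": {
                "AWS": ["arn:aws:iam::391854517948:user/cloudflare-logpush"]
            },
        }
    ],
}

# Attach the policy to the bucket (equivalent to pasting it under
# Permissions > Bucket Policy in the S3 console).
s3.put_bucket_policy(Bucket="burritobot", Policy=json.dumps(bucket_policy))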
Note: Logpush uses multipart upload for S3. Aborted uploads will result in incomplete files remaining in your bucket. To minimize your storage costs, Amazon recommends configuring a lifecycle rule using the AbortIncompleteMultipartUpload action.
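As one way to follow that recommendation, the boto3 sketch below configures a lifecycle rule that aborts incomplete multipart uploads after a chosen number of days. The rule ID, the logs/ prefix, and the seven-day window are assumptions, not Cloudflare or AWS requirements.

import boto3

s3 = boto3.client("s3")

# Abort incomplete multipart uploads left behind by interrupted Logpush
# transfers. The rule ID, "logs/" prefix, and 7-day window are assumptions.
s3.put_bucket_lifecycle_configuration(
    Bucket="burritobot",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "abort-incomplete-logpush-uploads",
                "Filter": {"Prefix": "logs/"},
                "Status": "Enabled",
                "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7},
            }
        ]
    },
)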
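Once the bucket and policy are in place, the Logpush job itself is created through the Cloudflare API. The sketch below outlines the two calls involved, requesting an ownership challenge and then creating a job for the HTTP requests dataset, assuming an API token with Logs Edit permission. The zone ID, token, job name, field list, and region are placeholder assumptions; adjust them to your environment.

import requests

API = "https://api.cloudflare.com/client/v4"
ZONE_ID = "<your-zone-id>"                               # assumption: your zone ID
HEADERS = {"Authorization": "Bearer <your-api-token>"}   # assumption: token with Logs Edit

destination_conf = "s3://burritobot/logs?region=us-east-1"  # assumed region

# Ask Cloudflare to write an ownership-challenge file into the bucket.
ownership = requests.post(
    f"{API}/zones/{ZONE_ID}/logpush/ownership",
    headers=HEADERS,
    json={"destination_conf": destination_conf},
)
ownership.raise_for_status()
print(ownership.json()["result"])  # includes the name of the challenge file written to the bucket

# Read the token out of that file in S3, then create the job.
ownership_challenge = "<token read from the challenge file>"  # assumption

job = requests.post(
    f"{API}/zones/{ZONE_ID}/logpush/jobs",
    headers=HEADERS,
    json={
        "name": "s3-http-requests",          # assumed job name
        "dataset": "http_requests",
        "destination_conf": destination_conf,
        "ownership_challenge": ownership_challenge,
        "logpull_options": "fields=ClientIP,ClientRequestHost,EdgeResponseStatus,RayID&timestamps=rfc3339",
        "enabled": True,
    },
)
job.raise_for_status()
print(job.json()["result"]["id"])  # ID of the new Logpush job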
Sample Logs
The following is a sample of the logs that Cloudflare sends to Chronicle.
{"ClientIP":"0.0.0.0","ClientRequestHost":"assets-service-proding.contoso.net",
"ClientRequestMethod":"POST","ClientRequestURI":"/integrations/assets/process",
"EdgeEndTimestamp":"2023-07-14T13:12:39Z","EdgeResponseBytes":7003,"EdgeResponseStatus":502,
"EdgeStartTimestamp":"2023-07-14T13:12:39Z","RayID":"7e6a04c05a3f03f9",
"CacheCacheStatus":"unknown","CacheReserveUsed":false,"CacheTieredFill":false,
"CacheResponseBytes":1665,"CacheResponseStatus":502,"FirewallMatchesActions":[],
"FirewallMatchesRuleIDs":[],"FirewallMatchesSources":[],"OriginDNSResponseTimeMs":0,
"OriginIP":"0.0.0.0","OriginRequestHeaderSendDurationMs":0,"OriginSSLProtocol":"TLSv1.2",
"OriginTCPHandshakeDurationMs":0,"OriginTLSHandshakeDurationMs":0,"OriginResponseBytes":0,
"OriginResponseDurationMs":2,"OriginResponseHTTPExpires":"","OriginResponseHTTPLastModified":"",
"OriginResponseHeaderReceiveDurationMs":2,"OriginResponseStatus":502,
"OriginResponseTime":2000000,"WAFAction":"unknown","WAFAttackScore":0,"WAFFlags":"0",
"WAFMatchedVar":"","WAFProfile":"unknown","WAFRCEAttackScore":0,"WAFRuleID":"",
"WAFRuleMessage":"","WAFSQLiAttackScore":0,"WAFXSSAttackScore":0,"WorkerCPUTime":0,
"WorkerStatus":"unknown","WorkerSubrequest":false,"WorkerSubrequestCount":0,
"WorkerWallTimeUs":0}