Tenable.io is a cloud service for vulnerability management. Ingesting Tenable.io logs into Google Chronicle requires an AWS EC2 instance that extracts logs from the Tenable.io API and copies them into an AWS S3 bucket. A Chronicle feed is then configured to pull the Tenable.io logs from the bucket, completing the ingestion process.
Overview
This topic describes the steps to configure log collection from Tenable.io.
Prerequisites
- Tenable.io administrator console access
- An Ubuntu server running one of the following versions: 24.04, 22.04, or 18.04
- The Ubuntu server must have at least the following virtual hardware:
  - CPU: 2 cores
  - Memory: 4 GB
  - Storage: 50 GB or above
Configuration
Generate API Keys in Tenable.io
Access your Tenable.io dashboard.
- Click on the user icon in the top right.
- Click on "My Account".
- Click on "API Keys" and click "Generate".
- Save both the Access Key and the Secret Key.
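These two keys authenticate Tenable.io API requests through a single X-ApiKeys header, which is how the script later in this topic uses them. A minimal sketch of the header construction (the key values below are placeholders):

```python
def build_tenable_headers(access_key, secret_key):
    # Tenable.io expects both keys in one X-ApiKeys header
    return {'X-ApiKeys': f'accessKey={access_key}; secretKey={secret_key};'}

headers = build_tenable_headers('MY_ACCESS_KEY', 'MY_SECRET_KEY')
print(headers['X-ApiKeys'])  # accessKey=MY_ACCESS_KEY; secretKey=MY_SECRET_KEY;
```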
Steps to Install boto3 on Ubuntu
To install boto3 on Ubuntu, you can use pip, the package installer for Python. Here's how:
1. First, ensure that you have Python and pip installed on your Ubuntu system. Most Ubuntu installations come with Python pre-installed. You can check the Python version by running:
python3 --version
If Python is installed, it will print the version number.
2. Install pip if it's not already installed. You can do this by running:
sudo apt update
sudo apt install python3-pip
This will install pip for Python 3. Verify the installation by running:
pip3 --version
3. Once you have pip installed, you can install boto3 by running:
pip3 install boto3
4. After the installation is complete, you can verify that boto3 is installed by running:
pip3 show boto3
This command will display information about the installed boto3 package.
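You can also check for boto3 from within Python itself. The helper below is illustrative (not part of the ingestion script) and works for any package name:

```python
import importlib.util

def is_installed(package_name):
    # find_spec returns None when the module cannot be found
    return importlib.util.find_spec(package_name) is not None

print(is_installed('boto3'))
```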
Implementing the Python script with the boto3 library
Before implementing the Python script with the boto3 library, you need to create an S3 bucket in AWS. See the following AWS reference for creating an S3 bucket:
https://docs.aws.amazon.com/AmazonS3/latest/userguide/creating-bucket.html
If the script is accessing AWS services, ensure that the appropriate permissions are set for the AWS account you are using. This typically involves configuring AWS IAM (Identity and Access Management) roles and policies.
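For example, the identity running the script needs at least s3:PutObject on the target bucket. A minimal IAM policy sketch (YOUR_BUCKET_NAME is a placeholder for your actual bucket name):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    }
  ]
}
```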
To create and run a Python script that utilizes the boto3 library for interacting with AWS services, you can follow these steps:
Open a text editor such as Nano or Vim in your terminal:
nano my_script.py
Then, paste your Python script into the editor.
import boto3
import json
import requests

def fetch_tenable_logs():
    # Make a request to the Tenable.io API to fetch logs.
    # Replace 'TENABLE_ACCESS_KEY' and 'TENABLE_SECRET_KEY' with your Tenable API credentials.
    access_key = 'TENABLE_ACCESS_KEY'
    secret_key = 'TENABLE_SECRET_KEY'
    headers = {
        'X-ApiKeys': f'accessKey={access_key}; secretKey={secret_key};'
    }
    # Replace 'LOG_FETCH_URL' with the URL to fetch logs from Tenable.io.
    log_fetch_url = 'LOG_FETCH_URL'
    response = requests.get(log_fetch_url, headers=headers, timeout=30)
    if response.status_code == 200:
        # Note: the key holding the log entries depends on the endpoint;
        # adjust 'logs' to match the response body of the URL you use.
        return response.json()['logs']
    else:
        print(f"Failed to fetch logs. Status code: {response.status_code}")
        return None

def upload_logs_to_s3(logs):
    # Initialize the Boto3 S3 client
    s3 = boto3.client('s3')
    # Replace 'YOUR_BUCKET_NAME' with the name of your S3 bucket
    bucket_name = 'YOUR_BUCKET_NAME'
    for log in logs:
        # Convert the log entry to a JSON string
        log_json = json.dumps(log)
        # Replace 'YOUR_LOG_KEY_PREFIX' with the desired prefix for log objects in S3
        log_key = 'YOUR_LOG_KEY_PREFIX/' + log['timestamp'] + '.json'
        # Upload the log to the S3 bucket
        s3.put_object(Bucket=bucket_name, Key=log_key, Body=log_json)
        print(f"Uploaded log {log_key} to S3")

def main():
    # Fetch logs from Tenable.io
    tenable_logs = fetch_tenable_logs()
    if tenable_logs:
        # Upload the logs to the S3 bucket
        upload_logs_to_s3(tenable_logs)
    else:
        print("No logs fetched from Tenable.io")

if __name__ == "__main__":
    main()
Save the script by pressing Ctrl+O, then confirm the file name by pressing Enter. Exit the editor by pressing Ctrl+X.
The fetch_tenable_logs() function makes a request to the Tenable.io API to fetch logs. You need to replace the following details:
TENABLE_ACCESS_KEY - paste the Access Key you generated in the Generate API Keys section.
TENABLE_SECRET_KEY - paste the Secret Key you generated in the Generate API Keys section.
LOG_FETCH_URL - the URL to fetch the logs from Tenable.io.
Example: https://cloud.tenable.com/audit-log/v1/events
The upload_logs_to_s3(logs) function takes the fetched logs as input and uploads them to an S3 bucket using the boto3 library.
bucket_name = 'YOUR_BUCKET_NAME' - the name of your S3 bucket.
log_key = 'YOUR_LOG_KEY_PREFIX/' - the desired prefix for log objects in S3.
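The resulting object key has the form <prefix>/<timestamp>.json. The sketch below isolates that key construction; the log entry and prefix values are hypothetical examples, not output of the Tenable.io API:

```python
def build_log_key(prefix, log_entry):
    # Mirror the key construction used in upload_logs_to_s3():
    # <prefix>/<timestamp>.json
    return prefix + '/' + log_entry['timestamp'] + '.json'

# Hypothetical log entry for illustration
log_entry = {'timestamp': '2024-01-15T10:30:00Z', 'action': 'user.login'}
print(build_log_key('tenable-logs', log_entry))  # tenable-logs/2024-01-15T10:30:00Z.json
```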
The fetch_tenable_logs() function returns the logs if the request is successful (status code 200); otherwise, it prints an error message and returns None. Finally, the if __name__ == "__main__": block ensures that the main() function is executed when the script is run as the main program.
Make sure you are in the same directory as your script, or provide the full path to the script.
Make sure to replace the placeholder values with your actual credentials, URLs, bucket names, and log key prefixes.
Configure the Feed in Chronicle
1. In Chronicle, go to Settings > Feeds, where you can find the data feeds that you have configured.
2. In the Input Parameters tab:
- Region - Select the region.
- S3 URI - The URI of the S3 bucket.
- URI IS A - Choose 'Directory which includes subdirectories'.
- SOURCE DELETION OPTION - Choose 'Never delete files'.
- Access Key ID - Provide the Access Key ID for the S3 bucket.
- Secret Access Key - Provide the Secret Access Key for the S3 bucket.
3. Click Next.
Note: Hard-coding credentials is not recommended for production implementations; the example above is for testing only. Use OS environment variables (or a dedicated secrets manager) to manage secrets.
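As a sketch, the script's hard-coded keys could instead be read from environment variables at startup. The variable names below are assumptions, not ones the script above defines:

```python
import os

def get_tenable_credentials():
    # Raises KeyError if either variable is missing, so the script
    # fails fast instead of sending requests with bad credentials.
    return os.environ['TENABLE_ACCESS_KEY'], os.environ['TENABLE_SECRET_KEY']
```

Export the variables (for example, export TENABLE_ACCESS_KEY=... in your shell) before running the script.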