Integrating Datadog Log Data

🛠 Botify LogAnalyzer requires information from your web server's daily logs and your website's crawl report. This document describes the integration process when Datadog hosts the logs.

Overview

Datadog customers configure an automated log archive export to one of the following storage services that they own, and Botify fetches the logs from that storage location:

AWS S3

  1. Create an S3 bucket from your AWS console and a role for Datadog to integrate with AWS CloudWatch.

  2. Add the following two permission statements to the IAM policies of your Datadog role. Edit the bucket names and, optionally, specify the paths that contain your log archives. The PutObject permission is sufficient for uploading archives; GetObject and ListBucket can be used by other Datadog features.

    { "Version": "2012-10-17", "Statement": [ { "Sid": "DatadogUploadLogs", "Effect": "Allow", "Action": ["s3:PutObject", "s3:GetObject"], "Resource": [ "arn:aws:s3:::<MY_BUCKET_NAME_1_/_MY_OPTIONAL_BUCKET_PATH_1>/*" ] }, { "Sid": "DatadogListBucket", "Effect": "Allow", "Action": "s3:ListBucket", "Resource": [ "arn:aws:s3:::<MY_BUCKET_NAME_1>" ] } ] }
  3. On your Datadog Archives page, select the Add a new archive option at the bottom. Only Admin Datadog users can complete this and the following step.

  4. Select the appropriate AWS account + role combination for your S3 bucket.

  5. Add your bucket name and, optionally, a prefix directory for the entire content of your log archives.

  6. Click the Save button to save your archive.

  7. Create a new role in AWS that Botify will use to download the logs from the S3 bucket.

  8. Send an email to Support to request our ARN, which you will add to the role's trust policy so that Botify can assume the role (see the sketch after this list).

  9. Send us the ARN of this newly created role.
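
As an illustration of steps 7 through 9, the sketch below uses the boto3 Python library to create the role that Botify will assume and to restrict it to read-only access on the archive bucket. The role name, bucket name, and the ARN placeholder are hypothetical; replace the placeholder with the ARN that Support sends you. The same role can just as easily be created by hand in the IAM console.

import json
import boto3

iam = boto3.client("iam")

# Hypothetical values -- substitute your own bucket and role names.
BUCKET_NAME = "my-datadog-log-archives"
ROLE_NAME = "botify-log-download"
BOTIFY_ARN = "<ARN_PROVIDED_BY_BOTIFY_SUPPORT>"  # requested from Support in step 8

# Trust policy: lets the Botify principal assume this role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": BOTIFY_ARN},
            "Action": "sts:AssumeRole",
        }
    ],
}

# Permissions policy: read-only access to the archive bucket.
permissions_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{BUCKET_NAME}",
                f"arn:aws:s3:::{BUCKET_NAME}/*",
            ],
        }
    ],
}

role = iam.create_role(
    RoleName=ROLE_NAME,
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)
iam.put_role_policy(
    RoleName=ROLE_NAME,
    PolicyName="botify-archive-read",
    PolicyDocument=json.dumps(permissions_policy),
)

# This is the ARN to send back to Botify in step 9.
print(role["Role"]["Arn"])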

For more information, please refer to the Datadog logs management for AWS S3 documentation.


Azure Storage

  1. In your Azure Portal, create a storage account where your logs will be sent. Give your storage account a name, choose any account kind, and select the hot access tier.

  2. Set up the Azure integration within the subscription that holds your new storage account. This includes creating an App Registration with which Datadog can integrate.

  3. To grant your Datadog App sufficient permission to write to and rehydrate from your storage account:

    • Select your storage account from the Storage Accounts page.

    • Navigate to Access Control (IAM) and select Add > Add Role Assignment.

    • In the Role field, select Storage Blob Data Contributor.

    • Select the Datadog App that you created for integrating with Azure, and then save.

  4. Send your account name and the related account key to Botify using the appropriate regional Support email address below (a quick verification sketch follows this list).
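
Before sending the credentials, you may want to confirm that the account key grants access to the archived logs, since that is the credential Botify will use to fetch them. The sketch below assumes the azure-storage-blob Python package and hypothetical account and container names; adapt it to your own setup.

from azure.storage.blob import BlobServiceClient

# Hypothetical values -- use your own storage account name, key, and container.
ACCOUNT_NAME = "mydatadogarchives"
ACCOUNT_KEY = "<storage account access key>"
CONTAINER = "datadog-log-archives"

# Authenticate with the account name and key, the same pair you will send to Botify.
service = BlobServiceClient(
    account_url=f"https://{ACCOUNT_NAME}.blob.core.windows.net",
    credential=ACCOUNT_KEY,
)

# List a handful of archived log blobs to confirm the key works.
container = service.get_container_client(CONTAINER)
for i, blob in enumerate(container.list_blobs()):
    print(blob.name)
    if i >= 9:
        break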

Please refer to the Datadog logs management for Azure documentation for more information.


Google Cloud Storage Bucket

  1. In your GCP account, create a GCS bucket where your logs will be sent. Under “Choose how to control access to objects”, select Set object-level and bucket-level permissions.

  2. Set up the GCP integration for the project that holds your GCS storage bucket. Datadog will integrate with the GCS Service Account you create in this process.

  3. Grant your Datadog GCP Service Account sufficient permissions to write your archives to your bucket. If you are creating a new Service Account, do this on the GCP Credentials page; if you are updating an existing Service Account, do this on the GCP IAM Admin page.


  4. Add the Storage > Storage Object Creator role (for generating archives) to your Datadog GCP Service Account.

  5. On the Datadog Archives page, select the Add a new archive option at the bottom. Only Admin Datadog users can complete this and the following step.

  6. Select the GCS archive type and the GCS Service Account with permission to write to your storage bucket.

  7. Add your bucket name and, optionally, a prefix directory for the entire content of your log archives.

  8. Click the Save button to save your archive.


  9. Send the bucket name and related key file to Botify using the appropriate regional Support email address below.

The following is an example of the correct key format:

{ "type": "service_account", "project_id": "some-project-id", "private_key_id": "bd6c6d1e4ba345958e324410f9cdf1def3731a9a", "private_key": "-----BEGIN PRIVATE KEY-----\nVerYBiGPrivAtESsHKeY\n-----END PRIVATE KEY-----\n", "client_email": "botify@some-project-id.iam.gserviceaccount.com", "client_id": "10000872129394565452", "auth_uri": "https://accounts.google.com/o/oauth2/auth", "token_uri": "https://oauth2.googleapis.com/token", "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs", "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/some-project-id.iam.gserviceaccount.com" }

For more details, refer to the Datadog logs management for Google Cloud Storage documentation.


Contact Support

If you need any assistance, please contact Support using the email address for your region:
