The Data Export feature lets Single Organization and MSP users export data at regular intervals to Amazon S3 or Splunk. Once configured, the data available in the CSV version of the Query Log is transmitted in near real time.
This allows users to combine their Query Log data with other data sources for monitoring, alerting, and action. The feature is available only to users with the Owner, Admin, or Super User role; users with the Edit Policies or Read-Only role cannot configure Data Export.
DNSFilter's Data Export can integrate directly with most SIEMs, including Datadog, Humio, LogRhythm, QRadar, Splunk, and Sumo Logic.
Amazon S3 Data Export Configuration
This can be configured at the Single Organization level or, as an MSP, within each sub-organization you're responsible for.
- In your DNSFilter dashboard, navigate to Tools > Data Export > Configure Data Export
- Select the Amazon S3 service and click Continue
- Next, enter your Amazon S3 bucket name. This is the unique name of the bucket in your account where the exported data will be stored. For more information on where to locate this, please refer to Amazon's guide here.
- The Key Prefix field is optional. It allows users to organize the data they store in Amazon's S3 buckets.
- Next, enter the appropriate region for your S3 bucket (e.g., us-east-1). For more information on this, please refer to Amazon's bucket location guide here.
- The Access Key ID and Secret Access Key values can be found on your Amazon S3 bucket details page. Please see Amazon's help guide here for more details on how to generate an Access Key.
- Click on Verify & Test Account
- You will then see a message confirming that your account has been successfully configured. Click Finalize to complete the process.
- Your configuration is now complete.
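If you want to sanity-check the same values outside the dashboard before entering them, a minimal sketch is below. The bucket name, region, and key values are placeholders, and the `boto3` call is an assumption about how you might verify independently; DNSFilter performs its own verification when you click Verify & Test Account.

```python
import re

# Simplified check against AWS S3 bucket naming rules: 3-63 characters,
# lowercase letters, digits, dots, and hyphens, starting and ending with
# a letter or digit. (AWS applies a few additional rules, e.g. no names
# formatted as IP addresses.)
_BUCKET_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def is_valid_bucket_name(name: str) -> bool:
    """Quick client-side check of an S3 bucket name before configuring the export."""
    return bool(_BUCKET_RE.match(name)) and ".." not in name

def check_s3_export_target(bucket, region, access_key_id, secret_access_key):
    """Try a HeadBucket call using the same credentials the dashboard will use.

    Raises botocore's ClientError if the bucket, region, or keys are wrong.
    """
    import boto3  # assumption: boto3 is installed
    s3 = boto3.client(
        "s3",
        region_name=region,
        aws_access_key_id=access_key_id,
        aws_secret_access_key=secret_access_key,
    )
    s3.head_bucket(Bucket=bucket)

if __name__ == "__main__":
    assert is_valid_bucket_name("my-dnsfilter-logs")
    # check_s3_export_target("my-dnsfilter-logs", "us-east-1", "AKIA...", "...")
```

A failure here generally points at the same root causes the dashboard reports: a mistyped bucket name, the wrong region, or stale access keys.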
Splunk Data Export Configuration
We utilize Splunk's `HTTP Event Collector` API, which uses a well-recognized protocol for transferring data. It is scalable, secure, token-based for convenience, and easy to maintain.
The protocol is also implemented by SIEMs and data tools beyond Splunk, and may work out of the box with your preferred data tool. For example, Humio implements a compatible `HEC` API that is already confirmed to work with this Data Export feature.
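As a sketch of the protocol itself: HEC accepts JSON events over HTTPS, authenticated with a `Splunk <token>` Authorization header posted to the collector endpoint. The host, port, and token below are placeholders; the endpoint path and header format follow Splunk's published HEC API.

```python
import json
import urllib.request

def build_hec_request(collector_url: str, token: str, event: dict) -> urllib.request.Request:
    """Build an HTTP Event Collector request in Splunk's standard HEC shape."""
    body = json.dumps({"event": event}).encode("utf-8")
    return urllib.request.Request(
        collector_url,  # e.g. https://splunk.example.com:8088/services/collector/event
        data=body,
        headers={
            "Authorization": f"Splunk {token}",  # token-based auth
            "Content-Type": "application/json",
        },
        method="POST",
    )

if __name__ == "__main__":
    req = build_hec_request(
        "https://splunk.example.com:8088/services/collector/event",
        "00000000-0000-0000-0000-000000000000",  # placeholder token
        {"fqdn": "example.com", "allowed": True},
    )
    # urllib.request.urlopen(req)  # would send the test event to your collector
```

Because the shape is this simple, HEC-compatible tools such as Humio can accept the same payload unchanged.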
- In your dashboard, navigate to Tools > Data Export > Configure Data Export
- Next, enter your HTTP Event Collector URL and your Active Event Collector Token. For more information on how to generate these, please refer to Splunk's help guide here
- Then select Verify & Test Account
- Select Finalize once the configuration has been verified
Enabling Data Export as an MSP or Single Org
When the Data Export option is activated, it applies to all sub-organizations on the same plan tier. Users at the sub-organization level may see the following notice when attempting to configure Data Export under Tools > Data Export:
If so, an Owner, Admin, or Super User must enable the feature from the billing section of the DNSFilter account using the steps below.
Please note: If you are an MSP, start with Step #1. If you are a single organization, skip ahead to Step #4:
- Login to your DNSFilter account
- Click on MSP > Billing > Upgrade your plan
- From here, scroll down to Add-ons, activate the Data Export option for your current plan (e.g., Basic, Pro, or Enterprise), and click Save
💡 Info Tip: The Data Export feature cannot be turned on for a single sub-organization, because Data Export is applied at the plan level. If you enable it for the Enterprise plan, it will be applied across all organizations with a plan type of Enterprise.
- If you have a single organization, navigate to Organization > Billing in your dashboard, activate the Data Export add-on, and save the changes.
Troubleshooting configuration errors for Amazon S3 and Splunk
You may see the error message below when attempting to verify and test the account configuration, or when an export fails; you will receive an email after 20 failures have occurred. This usually indicates that the access credentials or the region setting have changed, which causes the export setup to fail.
If that is the case, click Edit and confirm that the current settings match what is configured in your AWS bucket.
Then update the data export configuration and click Verify & Test Account again to confirm the error is resolved.
The following is another error you may run into when verifying and testing your account:
This error may appear when the configuration settings are incorrect, missing, or have been updated during or after the configuration. We recommend cross-checking your Splunk details to fix any missing or incorrect values in the Splunk configuration. Once confirmed/fixed, click Verify & Test Account again to ensure the error clears.
A sample of our Standard Export file can be found here. We have also included a sample export of our roaming client data here. Below is a data table with a brief description of each field:
| Sequence | DNSF Name | DNSF Description |
|---|---|---|
| 1 | Time | When this request was made, in UTC |
| 2 | FQDN | Fully qualified domain that was requested |
| 3 | Domain | Domain that was requested |
| 4 | Protocol | Internet protocol used to make the request |
| 5 | Username | Username of the dashboard user who made the request |
| 6 | UserID | ID of the dashboard user who made the request |
| 7 | QuestionType | Type of the DNS request that was made |
| 8 | Code | DNS return code for the request |
| 9 | OriginalCode | DNS original return code for the request |
| 10 | RequestAddress | The external IP that made the request |
| 11 | Client | Roaming client name that made the request |
| 12 | ClientID | Roaming client ID that made the request |
| 13 | ClientType | Roaming client type that made the request |
| 14 | ClientMac | Roaming client MAC address that made the request; may vary based on the roaming client type |
| 15 | IP4 | Internal IPv4 address that made the request |
| 16 | IP6 | Internal IPv6 address that made the request |
| 17 | Region | The geographical region where the request was made |
| 18 | Network | Network or site name where the request was made |
| 19 | NetworkID | Network or site ID where the request was made |
| 20 | Collection | Collection name where the request was made |
| 21 | CollectionID | Collection ID where the request was made |
| 22 | Policy | Policy name that processed the request |
| 23 | PolicyID | Policy ID that processed the request |
| 24 | ScheduledPolicy | Scheduled policy name that processed the request |
| 25 | ScheduledPolicyID | Scheduled policy ID that processed the request |
| 26 | Seccats | One or more security categories associated with the blocked request |
| 27 | Secallowcats | One or more security categories associated with the allowed request |
| 28 | Blockcats | One or more categories associated with the blocked request |
| 29 | Blockallowcats | One or more categories associated with the allowed request |
| 30 | Allowed | Boolean value indicating whether the request was allowed |
| 31 | Threat | Boolean value indicating whether the request is categorized as a threat |
| 32 | Method | Method used to process the request |
| 33 | Organization | Organization name associated with the request |
| 34 | OrganizationID | Organization ID associated with the request |
| 35 | ApplicationID | Application ID associated with the request |
| 36 | ApplicationName | Application name associated with the request |
| 37 | ApplicationCategoryID | Application category ID associated with the request |
| 38 | ApplicationCategoryName | Application category name associated with the request |
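Given the field order above, a hedged sketch of reading the CSV export with Python's standard library follows. The field names mirror the table; whether the export includes a header row is an assumption to check against an actual export file (the sketch assumes it does not).

```python
import csv
import io

# Field names in the documented sequence (1-38).
FIELDS = [
    "Time", "FQDN", "Domain", "Protocol", "Username", "UserID",
    "QuestionType", "Code", "OriginalCode", "RequestAddress", "Client",
    "ClientID", "ClientType", "ClientMac", "IP4", "IP6", "Region",
    "Network", "NetworkID", "Collection", "CollectionID", "Policy",
    "PolicyID", "ScheduledPolicy", "ScheduledPolicyID", "Seccats",
    "Secallowcats", "Blockcats", "Blockallowcats", "Allowed", "Threat",
    "Method", "Organization", "OrganizationID", "ApplicationID",
    "ApplicationName", "ApplicationCategoryID", "ApplicationCategoryName",
]

def read_export(fp):
    """Yield each export row as a dict keyed by the documented field names."""
    for row in csv.reader(fp):
        yield dict(zip(FIELDS, row))

def blocked_only(rows):
    """Keep only requests that were not allowed (Allowed is a boolean column)."""
    return [r for r in rows if r["Allowed"].lower() == "false"]
```

For example, `blocked_only(read_export(open("export.csv")))` would collect the denied queries for alerting in a downstream tool.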