Boost Your Security Operations: Cribl Stream and MS Sentinel Integration Guide

Cribl Stream is a cutting-edge log stream management tool designed to enhance the efficiency and visibility of log data processing. Its advanced visualization capabilities allow users to see incoming and outgoing data in real time, providing a clear and intuitive understanding of their log streams. With Cribl Stream, users can apply filters, replay logs, and utilize a range of advanced features to streamline log management.

Log parsing and normalization are simplified and highly effective with Cribl Stream, ensuring that data is correctly formatted and ready for analysis. Additionally, Cribl Stream supports multi-destination log routing, allowing users to send logs to various endpoints, such as a SIEM platform for security monitoring and a low-cost storage solution for archiving. This flexibility makes Cribl Stream a versatile tool in modern data management and security operations.

Benefits of Cribl Stream

  1. Intuitive Visualization: Cribl Stream’s visual interface makes it easy to understand data flows, with real-time visualization of incoming and outgoing logs. This feature significantly improves the ability to monitor and troubleshoot data streams.
  2. Advanced Filtering and Transformation: Users can apply filters in real time, allowing for dynamic data processing. The ability to replay logs ensures that data can be re-examined and reprocessed as needed, enhancing the flexibility of log management.
  3. Simplified Log Parsing and Normalization: The tool simplifies the process of parsing and normalizing logs, ensuring data consistency and reliability. This is crucial for accurate data analysis and compliance.
  4. Multi-Destination Log Routing: Cribl Stream can route logs to multiple destinations simultaneously. For instance, logs can be sent to a SIEM platform for immediate analysis and to a cost-effective storage solution for long-term archiving. This multi-destination capability enhances data utilization across different platforms.
  5. High Performance: Optimized for high efficiency, Cribl Stream handles large volumes of log data without compromising on performance. Its design ensures that data processing is both fast and resource-efficient.
  6. Strong Community and Support: With robust enterprise support and an active user community, Cribl Stream users have access to a wealth of resources and assistance. This ensures that any challenges can be quickly addressed and resolved. 


Integrating Cribl Stream with MS Sentinel

Add a Data Source in Cribl Stream:

For this exercise, we’ll enable a Cribl Stream built-in datagen (i.e., data generator) that produces a stream of sample ‘syslog’ log data. Log in to the Cribl Stream GUI: http://x.x.x.x:9000

·       From Cribl Stream’s Manage submenu, select Data > Sources.

·       From the Data Sources page’s tiles or left menu, select Datagen.

(You can use the search box to jump to the Datagen tile.)

·       Click Add Source to open the New Source modal.

·       In the Input ID field, name this Source “criblstream”.


·       In the Data Generator File drop-down, select syslog. This generates a stream of sample syslog events.

·       Click Save.

Create a Pipeline in Cribl Stream:

·       From the top menu, select Processing > Pipelines.

·       You now have a two-pane view, with a Pipelines list on the left and Sample Data controls on the right. (We’ll capture some sample data momentarily.)

·       At the Pipelines pane’s upper right, click Add Pipeline, then select Create Pipeline.

·       In the new Pipeline’s ID field, enter a unique identifier.

·       Optionally, enter a Description of this Pipeline’s purpose.

·       Click Save.

·       In the right pane, click Capture Data.

·       Click Capture, then accept the drop-down’s defaults and click Start.

·       When the modal finishes populating with events, click Save as Sample File and enter a File Name.

·       Click Save. This saves to the File Name you entered above, and closes the modal. You’re now previewing the captured events in the right pane. (Note that this pane’s Simple Preview tab now has focus.)

·       Click the Show more link to expand one or more events.

We will shape the logs before sending them to MS Sentinel by performing two actions:

  1. Add a function in the pipeline to include a field named "TimeGenerated" containing the current date and time.
  2. Remove the field "_time" from the logs, since field names beginning with an underscore are not valid custom columns in Log Analytics.

Although these transformations could instead be performed later in the DCR’s ingestion-time transformation, it is more efficient to handle them in Cribl. A core purpose of Cribl Stream is to ensure that only data in the format the destination expects is sent.
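Conceptually, the Eval function we are about to add applies the following per-event transformation (a minimal Python sketch for illustration only; in Cribl Stream the same logic is configured through the UI rather than coded):

from datetime import datetime, timezone

def transform(event: dict) -> dict:
    # Add the TimeGenerated field that Log Analytics / Sentinel expects,
    # in ISO 8601 format.
    event["TimeGenerated"] = datetime.now(timezone.utc).isoformat()
    # Drop Cribl's internal _time field; names beginning with an underscore
    # are not valid custom columns in Log Analytics.
    event.pop("_time", None)
    return event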

·       In the left Pipelines pane, click Add Function.

·       Search for Eval, then click it.

·       Under Evaluate Fields, add a field named “TimeGenerated” with the value expression new Date().

·       Under Remove Fields, enter “_time”.

·       Click Save.

·       Check the output to ensure that the transformation is applied successfully.

·       Download the output file as JSON; you’ll upload it later to define the custom table’s schema.

Register an App in MS Entra ID:

·       In the Microsoft Entra ID pane, select App registrations, then click New registration.

·       Enter "Criblstream" for the Name.

·       Under Supported account types, select Accounts in this organizational directory only (single tenant).

·       Click Register.

·       After the registration is complete, note the Application (client) ID and Directory (tenant) ID on the app’s Overview page.

·       Click on “Add a certificate or secret”.

·       Under Client secrets, click on New client secret.

·       Enter a description for the client secret.

·       Select an expiry period for the client secret (e.g., 6 months).

·       Click Add.

·       After the client secret is created, copy and save the Value immediately as you won’t be able to retrieve it again once you leave the page.
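Cribl Stream will use this app registration to authenticate to Azure via the OAuth 2.0 client credentials flow. If you want to verify the three values before wiring them into Cribl, you can request a token yourself; here is a minimal Python sketch using the requests library (the angle-bracket placeholders are yours to substitute):

import requests

TENANT_ID = "<Directory (tenant) ID>"
CLIENT_ID = "<Application (client) ID>"
CLIENT_SECRET = "<client secret Value>"

resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        # Scope covering the Azure Monitor Logs Ingestion API.
        "scope": "https://monitor.azure.com/.default",
    },
)
resp.raise_for_status()
print("Token acquired:", resp.json()["access_token"][:20] + "...")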

Create Data Collection Endpoint (DCE):

·       In the Azure portal, navigate to Monitor > Data Collection Endpoints.

·       Click Create.

·       Enter a Name and choose a Region.

·       Copy the Logs ingestion URI for later use.

Create a DCR-Based Custom Table:

·       Access your Log Analytics workspace.

·       Within the workspace, navigate to Tables.

·       Click on the Create option.

·       Be sure to select "DCR-based" to launch the appropriate wizard.

·       In the wizard, provide a valid table name.

·       Opt to create a new Data Collection Rule (DCR) directly within this wizard and give it a valid name.

·       Select the Data Collection Endpoint (DCE) created in the earlier step.

·       Click Next.

·       Upload the sample JSON file generated earlier.

·       Click Finish to complete the table configuration process.

Authorize the application to the DCR:

·       Access the Data Collection Rule (DCR) in the Azure portal.

·       Navigate to the Access Control (IAM) blade within the DCR.

·       Click on "Add role assignment."

·       Grant the recently created application the built-in ‘Monitoring Metrics Publisher’ role to allow log ingestion requests on the Data Collection Rule. 

Extract the URI from Azure Resource Graph Explorer:

·       Go to Azure Resource Graph Explorer.

·       Run the following query:

Resources
| where type =~ 'microsoft.insights/datacollectionrules'
| mv-expand Streams = properties['dataFlows']
| project name, id, DCE = tostring(properties['dataCollectionEndpointId']), ImmutableId = properties['immutableId'], StreamName = Streams['streams'][0]
| join kind=leftouter (Resources
    | where type =~ 'microsoft.insights/datacollectionendpoints'
    | project name, DCE = tostring(id), endpoint = properties['logsIngestion']['endpoint']) on DCE
| project name, StreamName, Endpoint = strcat(endpoint, '/dataCollectionRules/', ImmutableId, '/streams/', StreamName, '?api-version=2021-11-01-preview')

·       Download the results as a CSV file.

·       Open the CSV file and note the Endpoint URL composed from your Data Collection Endpoint (DCE) and Data Collection Rule (DCR), e.g. https://criblstream-wsgs.uksouth-1.ingest.monitor.azure.com/dataCollectionRules/dcr-a7xxxxxx2c4bbcbd3cxxxxxxd8f7/streams/Custom-criblstream_CL?api-version=2021-11-01-preview
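Before configuring Cribl, you can optionally sanity-check the endpoint end to end by posting a test event with the app registration’s credentials. A minimal Python sketch follows (the placeholders and the sample event fields are illustrative; they must match your own values and your custom table’s schema):

import requests

TENANT_ID = "<Directory (tenant) ID>"
CLIENT_ID = "<Application (client) ID>"
CLIENT_SECRET = "<client secret Value>"
INGEST_URL = "<Endpoint value from the CSV>"  # DCE URL + /dataCollectionRules/<immutableId>/streams/<stream>?api-version=...

# Acquire a bearer token via the client credentials flow.
token = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "https://monitor.azure.com/.default",
    },
).json()["access_token"]

# The Logs Ingestion API expects a JSON array of events matching the table schema.
events = [{"TimeGenerated": "2024-01-01T00:00:00Z", "message": "cribl test event"}]
resp = requests.post(INGEST_URL, json=events, headers={"Authorization": f"Bearer {token}"})
print(resp.status_code)  # 204 means the batch was accepted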

Add a Destination & Route in Cribl Stream:

·       Log in to the Cribl Stream GUI: http://x.x.x.x:9000

·       From Cribl Stream’s Manage submenu, select Data > Destinations.

·       From the Data Destinations page, search for "Azure Sentinel".

·       Enter the URI collected via Azure Resource Graph Explorer.

·       Go to “Authentication” and enter the token URL, replacing the GUID with your own Directory (tenant) ID: https://login.microsoftonline.com/b6f90cc2-c8ad-479f-89e4-fd4f599db2d8/oauth2/v2.0/token

·       Input the Client ID and Secret as created in the previous steps.

·       Click Save.

·       From Cribl Stream’s Manage submenu, select Routing > Data Routes.

·       Enter the Route Name, and select the Pipeline and Output from the drop-down menus.

·       From Cribl Stream’s Monitoring submenu, select Flows. You should be able to see data flowing from the source to MS Sentinel via the pipeline.


·       Query the Logs section of the MS Log Analytics workspace to confirm the logs are arriving; for example, run a simple query against your custom table (custom table names end in _CL, e.g. criblstream_CL | take 10).


Note: If you need all the images for the integration process, please comment or message me directly to receive the complete document. Including numerous images in this article was challenging.


