Microsoft Azure Monitor Plugin for Log Shipper
This document explains how to configure the Microsoft Azure Monitor integration with the Cloud Log Shipper module of the Netskope Cloud Exchange platform. This plugin is designed for ingesting Alerts and Events data into the Microsoft Azure Monitor Log Analytics Workspace.
Prerequisites
- A Netskope tenant (or multiple, for example, production and development/test instances) that is already configured in Cloud Exchange.
- A Netskope Cloud Exchange tenant with the Log Shipper module already configured.
- The Microsoft Azure Application’s Tenant ID, Client ID and Client Secret.
- A Microsoft Azure Log Analytics Workspace.
- A Microsoft Azure Monitor Data Collection Endpoint.
- A Microsoft Azure Monitor Data Collection Rule.
- Connectivity to the following host: https://portal.azure.com/.
CE Version Compatibility
This plugin is compatible with Netskope CE v4.2.0 and v5.0.0.
Microsoft Azure Monitor Plugin Support
The Microsoft Azure Monitor plugin is used to ingest Netskope Events and Alerts data into Azure Monitor.
Event Support | Yes |
Alert Support | Yes |
WebTx Support | No |
Permissions
Requires an Azure Account with Monitor access.
API Details
List of APIs Used
API Endpoint | Method | Use case |
---|---|---|
/oauth2/v2.0/token | POST | To generate the token. |
{Data Collection Endpoint URI}/dataCollectionRules/{DCR Immutable ID}/streams/{Stream Name}?api-version=2023-01-01 | POST | To ingest logs in Azure Monitor. |
Generate Token
API Endpoint: /oauth2/v2.0/token
Method: POST
Parameters:
client_id: client_id
client_secret: client_secret
scope: https://monitor.azure.com/.default
grant_type: client_credentials
Headers:
Content-Type: application/x-www-form-urlencoded
API Request Endpoint
https://login.microsoftonline.com/<tenant_id>/oauth2/v2.0/token
Sample API Response
{ "token_type": "Bearer", "expires_in": 3599, "ext_expires_in": 3599, "access_token": "eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIsIng1dCI6IlQxU3QtZExUdnlXUmd4Ql82NzZ1OGtyWFMtSSIsImtpZCI6IlQxU3QtZExUdnlXUmd4Ql82NzZ1OGtyWFMtSSJ9.eyJhdWQiOiIwNWE2R1QVRFcTkxSzk5QUFBLiIsInJvbGVzIjpbImRpc2NvdmVyeS5tYW5hZ2UiXSwic3ViIjoiNzljMEHIywny8JmtEONTPUcOahramZDIYLL8JBGvUH5V-ebPIrAOnCZGvwcbYbVZy7joFwmjeIK22Er_4eCVDXDAzAWuF5uD-KFZp7DkZNSR06i7OD-Yo6YiGEzAP5fMW8anHREJDwh0OtkMn5GRf15ccuhBhNlGiT17uPNzAct*************************************5_DsDgVK109p1yVTrGTw" }
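The token call above can be sketched with the Python standard library. This is a minimal illustration of the documented endpoint, parameters, and headers; the tenant and client values are placeholders for your own Azure AD application credentials.

```python
# Sketch of the client-credentials token request described above.
# Standard library only; tenant/client values are placeholders.
import json
import urllib.parse
import urllib.request

def build_token_request(tenant_id, client_id, client_secret):
    """Build the URL and form body for the client-credentials token call."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = {
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://monitor.azure.com/.default",
        "grant_type": "client_credentials",
    }
    return url, body

def fetch_token(tenant_id, client_id, client_secret):
    """POST the form-encoded body and return the access_token string."""
    url, body = build_token_request(tenant_id, client_id, client_secret)
    req = urllib.request.Request(
        url,
        data=urllib.parse.urlencode(body).encode(),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["access_token"]
```

The returned `access_token` is then passed as the Bearer token in the ingestion call below.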
Logs Ingestion API in Azure Monitor
API Endpoint: {Data Collection Endpoint URI}/dataCollectionRules/{DCR Immutable ID}/streams/{Stream Name}?api-version=2023-01-01
Method: POST
Parameters:
api-version=2023-01-01
Headers:
Content-Type: application/json
Authorization: Bearer {token}
Request Body:
[ { "TimeGenerated": "2023-11-14 15:10:02", "Column01": "Value01", "Column02": "Value02" } ]
API Request Endpoint:
https://my-dce-5kyl.eastus-1.ingest.monitor.azure.com/dataCollectionRules/dcr-000a00a000a00000a000000aa000a0aa/streams/Custom-MyTable?api-version=2023-01-01
Sample API Response:
204
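The ingestion call can be sketched the same way. The URL composition mirrors the endpoint format above; the record fields are illustrative, and the DCE URI, DCR Immutable ID, and stream name are the values you collect in the workflow steps in this document.

```python
# Sketch of the Logs Ingestion API call described above, stdlib only.
# dce_uri, dcr_immutable_id, stream_name, and token come from the setup
# steps in this document; the record content is illustrative.
import json
import urllib.request

def build_ingest_url(dce_uri, dcr_immutable_id, stream_name):
    """Compose the Logs Ingestion API endpoint for a DCR stream."""
    return (f"{dce_uri}/dataCollectionRules/{dcr_immutable_id}"
            f"/streams/{stream_name}?api-version=2023-01-01")

def ingest_logs(dce_uri, dcr_immutable_id, stream_name, token, records):
    """POST a JSON array of records; Azure Monitor returns HTTP 204 on success."""
    req = urllib.request.Request(
        build_ingest_url(dce_uri, dcr_immutable_id, stream_name),
        data=json.dumps(records).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 204 indicates the records were accepted
```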
Performance Matrix
These performance readings are for a Large Stack CE tested on the VM specifications mentioned below. The readings assume that the plugin ingests a file of around 10K in 11 seconds for Events and Alerts.
Stack details | Size: Large, RAM: 32 GB, CPU: 16 Cores |
Events, Alerts ingested to third-party SIEM | 200K EPM |
Webtx ingested to third-party SIEM | NA |
User Agent
The user-agent added in this plugin is in the following format:
netskope-ce-<ce_version>-<module>-<plugin_name>-v<plugin_version>
For example:
netskope-ce-5.0.0-cls-microsoft-azure-monitor-v1.1.0
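The format can be expressed as a small helper; this is a sketch for clarity, and the plugin's actual implementation may differ.

```python
def build_user_agent(ce_version, module, plugin_name, plugin_version):
    """Compose a user-agent string in the documented format:
    netskope-ce-<ce_version>-<module>-<plugin_name>-v<plugin_version>"""
    return f"netskope-ce-{ce_version}-{module}-{plugin_name}-v{plugin_version}"

# Reproduces the example from this document:
print(build_user_agent("5.0.0", "cls", "microsoft-azure-monitor", "1.1.0"))
# → netskope-ce-5.0.0-cls-microsoft-azure-monitor-v1.1.0
```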
Workflow
- Configure a Log Analytics Workspace.
- Configure an Application and get your Tenant ID, Application ID and Client Secret.
- Configure a Data Collection Endpoint and get your DCE URI.
- Configure a Basic Table in Log Analytics Workspace and get your Data Collection Rule Immutable ID.
- Assign a Permission to DCR and DCE.
- Configure the Microsoft Azure Monitor plugin.
- Configure a Log Shipper Business Rule for Microsoft Azure Monitor.
- Configure Log Shipper SIEM mappings for Microsoft Azure Monitor.
- Validate the Microsoft Azure Monitor plugin.
Configure a Log Analytics Workspace
- Log in to Azure and select Log Analytics Workspace.
- Click the Create Tab on the top.
- Select Subscription, and then select an existing Resource Group (or create a new one).
- Enter a name for your Log Analytics Workspace, select a region, and then select Next > Next > Create.
Configure an Application and Get Your Tenant ID, Application ID, and Client Secret
- Log in to Azure with an account that has a Global Administrator role.
- Go to Azure App Registration > New Registration.
- In the registration form, enter a name for your application, and then click Register.
- Make a copy of the Tenant ID and Application (client) ID on the application page.
- Click Add a Certificate or Secret, and then click New client secret to generate a Client secret. Add a description and Expire time, and then click Add.
- Copy the value of Secret, as it will only be displayed once.
Configure a Data Collection Endpoint and Get Your DCE URI
- Go to Azure Home and select Monitor from the Azure services.
- Select Data Collection Endpoints on the left panel, and then click Create.
- Enter a name for the Data collection Endpoint, select a Subscription and Resource Group, select a region (make sure that this region is the region of your Log Analytics Workspace), and then click Review + create.
- From the Overview tab, copy the Logs Ingestion URI; this will be your Data Collection Endpoint (DCE) URI.
Configure a Basic Table in Log Analytics Workspace and Get the DCR Immutable ID
- A Custom Log Analytics Table requires sample data to be uploaded. Create a JSON file on your system with the following content:
[ { "RawData": { }, "Application": "", "DataType": "", "SubType": "", "TimeGenerated": "2022-11-01 12:00:00.576165" } ]
- On the Azure home tab, go to Log Analytics Workspace, select the workspace created previously, and select Tables. Click Create and select New Custom log (DCR based).
- Enter a name for the table.
- For Data Collection Rule, click Create a new data collection rule and select a Subscription and Resource Group from the dropdown lists. Enter the region for your Log Analytics Workspace, and then click Done.
- The new Data Collection Rule will be selected in the Data collection rule field. Click Next.
- On the Schema and Transformation tab, click Browse for Files and select the sample data JSON file you created previously.
- Click Next and then click Create.
- A Custom Log Table will be created with the suffix _CL. By default, the table plan will be Analytics. To convert it to a Basic table, search for your table and click the three dots on the right.
- Select Manage Table, and in the table plan field, select Basic.
- Select the retention period as per your requirement and click Save.
Note
- Here we change the table plan from Analytics to Basic because the Basic log data plan saves on the cost of ingesting and storing high-volume verbose logs in your Log Analytics Workspace for debugging, troubleshooting, and auditing. If the table plan is kept as Analytics, logs will still be ingested into the table without any issue. The Analytics table has a configurable retention period from 30 days to 730 days. Basic Logs tables retain data for eight days. When you change an existing table's plan to Basic Logs, Azure archives data that's more than eight days old but still within the table's original retention period.
- To get the Data Collection Rule Immutable ID, go to Home, select Monitor from the Azure Services > Data Collection Rules, and then select the DCR you created while creating the Custom Table.
- In the Overview tab, click JSON View from the top right corner, and copy the immutableId.
Assign Permissions to the DCR and DCE
- On the Azure Home page, go to Monitor > Data Collection Endpoint and select the Endpoint created previously.
- Select Access control (IAM) and click Add role assignment.
- From the list of roles, select Monitoring Metrics Publisher and click Next.
- Select User, group, or service principal for which to assign access.
- Click Select Members and search for the Application you created in the search box, and then select it.
- Click Review + assign.
- Repeat these same steps to assign permissions to the DCR (Data Collection Rule).
Configure the Microsoft Azure Monitor Plugin
- In Cloud Exchange, go to Settings > Plugins.
- Search for and select the Microsoft Azure Monitor box to open the plugin configuration page.
- Enter a Configuration Name and select a valid Mapping. (Default mappings for all plugins are available. If you want to create a new mapping, go to Settings > Log Shipper > Mapping.)
- Transform the raw logs is enabled by default; it transforms the raw data based on the mapping file. Turn it off if you want to send raw data directly to Azure Monitor.
- Click Next.
- Enter these Configuration Parameters:
- Directory (tenant) ID: Directory (tenant) ID of your AzureAD Application.
- Application (client) ID: Application (client) ID of your AzureAD Application.
- Client Secret: Client Secret of your AzureAD Application.
- DCE URI: URI of your Data Collection Endpoint.
- DCR Immutable ID: Immutable ID of Data Collection Rule.
- Custom Log Table Name: Custom Log Table name for ingesting data. Make sure that the Table exists in your Log Analytics Workspace.
- Click Save.
Configure a Log Shipper Business Rule for Microsoft Azure Monitor
- Go to Log Shipper > Business Rules.
- By default, there's a business rule that filters all alerts and events. If you want to filter out any specific type of alert or event, click Create New Rule and configure a new business rule by adding the rule name and filter(s).
- Enter a Folder Name, if any.
- Click Save.
Configure Log Shipper SIEM Mappings for Microsoft Azure Monitor
- Go to Log Shipper > SIEM Mappings and click Add SIEM Mapping.
- Select a Source Configuration, Destination Configuration, and Business Rule.
- Click Save.
After the SIEM mapping is added, the data will start to be pulled from the Netskope tenant and ingested into the Azure Monitor platform.
Validate the Pull
To validate the pull of logs from the Netskope tenant, go to Logging in Netskope CE and search for the pulled logs with the filter message contains pulled.
Validate the Push
To validate the plugin workflow in Netskope Cloud Exchange:
- Go to Logging and search for ingested events with the filter message contains ingested.
- The ingested logs will be filtered.
To validate in the Log Analytics Workspace:
- In the Azure portal, go to Log Analytics Workspace, select the Log Analytics Workspace that you created, and select Logs under the General Category on the left panel.
- Write the Custom Log Table Name in the query editor and click Run. You can select the Time Range from the top to filter out logs.
Troubleshooting
If you receive error code 403 in a toast or log message while configuring the plugin:
Ensure that your application has the correct permissions on the DCR, and check that you have assigned permissions to the correct Data Collection Endpoint as described above. It may take up to 30 minutes for assigned permissions to take effect.
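The troubleshooting guidance above can be summarized in a small, hypothetical diagnostic helper; it is not part of the plugin, just a sketch mapping common Logs Ingestion API status codes to next steps.

```python
# Hypothetical helper based on the troubleshooting guidance above;
# not part of the plugin itself.
def diagnose_ingest_status(status_code):
    """Map an HTTP status from the Logs Ingestion API to a next step."""
    if status_code == 204:
        return "Success: records were accepted by Azure Monitor."
    if status_code == 403:
        return ("Forbidden: verify the application holds the Monitoring "
                "Metrics Publisher role on both the DCE and the DCR; role "
                "assignments can take up to 30 minutes to propagate.")
    if status_code == 401:
        return "Unauthorized: the bearer token is missing, expired, or invalid."
    return (f"Unexpected status {status_code}: check the DCE URI, "
            "DCR Immutable ID, and stream (table) name.")
```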