Elastic Plugin for Log Shipper

This document explains how to configure the Elastic plugin in the Log Shipper module of the Netskope Cloud Exchange platform. This plugin supports ingestion of Alerts (Compromised Credential, Policy, Malsite, Malware, DLP, Security Assessment, Watchlist, Quarantine, Remediation, UBA, CTEP) and Events (Page, Application, Audit, Infrastructure, Network, Incident) in ECS and JSON formats.

Prerequisites

To complete this configuration, you need:

  • A Netskope tenant (or multiple, for example, production and development/test instances) that is already configured in Cloud Exchange.
  • A Netskope Cloud Exchange tenant with the Log Shipper module already configured.
  • Your Elastic server (Filebeat) TCP Server address and port.
  • Connectivity to the following host: Elastic Server.

Note

Verify your Elastic instance permissions are secure and not set up for open public access. Only allow access to your cloud storage instance from your Cloud Exchange Host and any other addresses that need access.

CE Version Compatibility
  • Netskope CE v4.2.0, v5.0.1
  • Elastic – Netskope Integration (version 1.17.2)
Elastic Plugin Support

The Elastic plugin is used to ingest all supported Alerts and Events, in ECS or JSON format, so that they appear on the Discover tab in Elastic.

  • Alerts Support: Yes (Compromised Credential, Policy, Malsite, Malware, DLP, Security Assessment, Watchlist, Quarantine, Remediation, UBA, CTEP)
  • Event Support: Yes (Page, Application, Audit, Infrastructure, Network, Incident)
  • WebTx Support: Not Supported
  • CE Logs: Not Supported
Permissions

The ports used by the Netskope integration on Elastic must be accessible from Cloud Exchange.

API Details

The plugin uses Python's socket library to establish a connection with the Elastic server.

Specifically, the plugin calls the socket.connect method to initiate the connection, creating the socket with the socket.AF_INET address family and the socket.SOCK_STREAM socket type. This ensures that the connection to the server is reliable and stream-oriented (TCP).

In addition to this, the plugin leverages the socket.sendall method to transmit logs to the Elastic server. This method ensures that all data is sent successfully before returning.
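The connection and transmission steps described above can be sketched in Python. This is a simplified illustration, not the plugin's actual source; the send_logs function name and the newline-delimited JSON framing are assumptions for the example:

```python
import json
import socket

def send_logs(server, port, logs):
    """Open a TCP stream to the Elastic/Filebeat TCP listener and send
    newline-delimited JSON records (hypothetical helper for illustration)."""
    # AF_INET + SOCK_STREAM yields a reliable, stream-oriented TCP connection.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.connect((server, port))
        payload = "".join(json.dumps(log) + "\n" for log in logs)
        # sendall keeps transmitting until every byte has been sent (or raises).
        sock.sendall(payload.encode("utf-8"))
```

Because sendall either transmits the whole payload or raises an exception, a returned call can be treated as a successful transmission.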

Performance Matrix

This performance reading was conducted on a Large Stack CE with the VM specifications below. These readings assume that around 10K alerts and events are ingested into the Elastic platform in ~20 seconds.

  • Stack details: Size: Large, RAM: 32 GB, CPU: 16 Cores
  • Alerts/Events ingested to third-party SIEM: ~200K EPM

Workflow

  1. Configure a Netskope integration in Elastic.
  2. Configure the Elastic plugin.
  3. Configure a Log Shipper Business Rule for Elastic.
  4. Configure Log Shipper SIEM Mappings for Elastic.
  5. Validate the Elastic plugin.


Configure a Netskope Integration in Elastic

  1. Log in to Elastic.
  2. Search for and select Integrations.
  3. Search for Netskope and click on the Netskope box.
  4. Click Add Netskope.
  5. Expand the dropdown menu.
  6. If the Elastic server and Cloud Exchange are deployed on the same machine, keep the Listen Address set to localhost. Otherwise, set the Listen Address to 0.0.0.0. Change the Listen Port based on your requirements, and make sure the configured port is accessible to Cloud Exchange.
  7. Enable the Preserve original event toggle for Netskope Alerts and Netskope Events.
  8. If you want to add a custom tag, then click Advanced options and add the tag.
  9. Click Save and continue.
  10. Click Save and deploy changes.
  11. The Integration policy that was just created will appear under the Integration Policies.
If you want to ingest the data in JSON format, then follow these steps to deploy the integration.
  1. Search for and select Integrations.
  2. Search for TCP and click on the Custom TCP Logs box.
  3. Click Add Custom TCP Logs.
  4. Expand the dropdown menu.
  5. If the Elastic server and Cloud Exchange are deployed on the same machine, keep the Listen Address set to localhost. Otherwise, set the Listen Address to 0.0.0.0.
  6. Click Save and continue.
  7. Click Save and deploy changes.
  8. The Integration policy that has been newly created will appear under the Integration Policies tab.

Configure the Elastic Plugin

  1. In Cloud Exchange, go to Settings > Plugins. Search for and select the CLS Elastic box.
  2. Add the plugin configuration name, and make sure the Elastic Default Mappings (recommended) file is selected. To ingest the data in JSON format, disable the log transformation toggle; keep it enabled to ingest the data in ECS format. Click Next.
  3. Enter values for these parameters:
    • Server Address: IP address of the Elastic server to which data will be ingested.
    • Server Port: The TCP port used while creating the integration policy on Elastic.
  4. Click Save. This new plugin will be available on the Log Shipper > Plugins page.

Configure a Log Shipper Business Rule for Elastic

  1. In Cloud Exchange, go to Log Shipper > Business Rules.
  2. By default, there is a business rule that filters all alerts and events. If you want to filter out any specific types of alerts or events, click Create New Rule and configure a new business rule by adding the rule name and filter(s).
  3. Click Save.

Configure Log Shipper SIEM Mappings for Elastic

  1. In Cloud Exchange, go to Log Shipper > SIEM Mappings and click Add SIEM Mappings.
  2. Select the Source plugin (Netskope CLS), Destination plugin (Elastic), and your business rule, and then click Save.
  3. After the SIEM mapping is added, the data will start to be pulled from the Netskope tenant, transformed, and ingested into the Elastic platform.

Validate the Elastic Plugin

Validate the Pull

Go to Logging in Netskope CE and search for the pulled logs.

Validate the Push

To validate the plugin workflow in Cloud Exchange, go to Logging and search for ingested Events and Alerts with the filter message contains ingested. The ingested logs will be filtered. To validate the push in Elastic:
  1. Log in to Elastic.
  2. Search for and select Discover.
  3. Search for data_stream.dataset : "netskope.alerts" or data_stream.dataset : "netskope.events" and click Update. Alternatively, search for tags : "<the tag that was added while configuring the Netskope integration>".
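The Discover searches above, written as KQL filters (the tag value is a placeholder for whatever you configured in the integration):

```
data_stream.dataset : "netskope.alerts" or data_stream.dataset : "netskope.events"
tags : "<your-custom-tag>"
```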

Troubleshooting

If you encounter difficulties saving the Elastic plugin
If you entered all parameters and clicked Save but still see an error, the likely cause is that the server address/port configured in Cloud Exchange does not match the integration policy on Elastic. In Elastic, search for Integrations, go to Installed integrations, click the Netskope card > Integration policies, and open the integration policy you used. Verify that the address and port match the plugin configuration.
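A quick way to rule out a connectivity mismatch is to test the configured address and port from the Cloud Exchange host. A minimal sketch; the can_reach helper is illustrative and not part of Cloud Exchange:

```python
import socket

def can_reach(server, port, timeout=5.0):
    """Return True if a TCP connection to server:port succeeds within timeout."""
    try:
        with socket.create_connection((server, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns False for the values saved in the plugin, fix the network path or the integration policy's Listen Address/Port before retrying.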

Known Behavior

If you encounter any of the following errors during ingestion, it may be due to socket-related issues. We have identified an unresolved problem on the Elastic side. For more information, please refer to: JSON Parse Exception – Illegal Character (Ctrl Char)
  • Processor 'json' with tag 'json_message' failed with message 'Unexpected character ('*' (code 42)): expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or 'false')\n at [Source: (org.elasticsearch.common.io.stream.ByteBufferStreamInput); line: 1, column: 2]'
  • Processor 'json' with tag 'json_message' failed with message 'Illegal character ((CTRL-CHAR, code 3)): only regular white space (\\\\r, \\\\n, \\\\t) is allowed between tokens\\n at [Source: (org.elasticsearch.common.io.stream.ByteBufferStreamInput); line: 1, column: 2]'
  • Processor 'json' with tag 'json_message' failed with message 'Invalid UTF-8 start byte 0xbf\\n at [Source: (org.elasticsearch.common.io.stream.ByteBufferStreamInput); line: 1, column: 3]'

Known Limitation

The existing CLS Elastic (ELK) plugin does not currently support WebTx data ingestion. This is because the existing Elastic Netskope integration, to which we send logs, only supports Alerts and Events. Therefore, we cannot add support for WebTx in the current plugin. For WebTx support, the Elastic team would need to update their Netskope integration.

As a workaround to this issue, the Syslog plugin can be used to send WebTx data to Elastic using the Custom TCP Logs integration. This method will send raw logs, not ECS-transformed logs. Follow these steps to configure the Custom TCP logs integration in Elastic.

  1. In Elastic, go to Integrations and search for Custom TCP Logs integration.
  2. Click Add Custom TCP Logs located in the top right corner.
  3. Provide the name and description for the integration.
  4. Make sure Custom TCP Logs is enabled. Expand it and enter the Listen Address, Listen Port and Dataset Name. Also provide the Tags for filtering the ingested logs, and then click Save and continue.
  5. After this is done, configure the CLS Syslog plugin from Netskope CE. Note that if you have an On-Premises setup for Elastic, make sure to provide the Listen Address as 0.0.0.0. When adding the port make sure the port is exposed.
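Since the Custom TCP Logs integration accepts raw lines, the workaround amounts to writing newline-terminated text to the listener. The raw-log path can be sketched as below; ship_raw_lines is a hypothetical helper for illustration, not part of the Syslog plugin:

```python
import socket

def ship_raw_lines(server, port, lines):
    """Send raw, newline-terminated log lines to a Custom TCP Logs listener
    (illustrative only; no ECS transformation is applied)."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.connect((server, port))
        for line in lines:
            # One log record per line; the integration splits records on newlines.
            sock.sendall(line.rstrip("\n").encode("utf-8") + b"\n")
```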