Configure a Custom Log Parser

You can configure a custom log parser if the predefined parsers do not extract events from your uploaded logs. You can customize your parser based on what you know about your logs. After you create a custom parser, it appears on the Custom tab.

These are the primary steps for configuring a custom log parser:

  1. Upload
  2. Preparation
  3. Extraction
  4. Transformation
  5. Name

To configure a custom parser using a log file:

  1. On the Test/Create page, click Create Custom Parser.
  2. On the Upload page, click Select File. Locate and select a log file, and then click Upload.
  3. Click Test to process your log file. The results display on the Extracted Fields tab at the bottom of the page. You can use these bottom tabs to ensure that your log file data is being mapped correctly:
    • Extracted Fields: This tab shows the columns that are extracted from your log file.
    • Extracted Events: This tab shows the extracted events from your log file.
    • Rejections: This tab shows the line number, the line itself, and the reason for each rejected log line.
    • Summary: This tab provides an overview of the total number of extracted, rejected, and uploaded lines.

      Tip

      The Test button and the bottom tabs display on all pages.

  4. On the Preparation page, select and enter the needed information on these tabs:
    • Format: This tab opens with CSV as the default format. To change the format, specify the following:
      • Log file type: Choose CSV, Delimited, Key-Value, or Hybrid.
      • Header in log: Specify whether the log has a header.
      • Header starts with: Enter the text string that starts the header.
      • Line number: Enter the number of the starting line in the log.
      • Header delimiter: Choose the type of delimiter from the dropdown.
    • Pre-Filters: This tab allows you to select data line patterns for your log file. Currently, the only supported Data Line Pattern is 80http443https. In addition, you can define Line/Header Patterns to discard, a regular expression that identifies log lines to drop before parsing (see the sketch after this procedure).
    • Multi-Line Merge: Use this tab only if you have complex log files in which a single log entry is split across two lines. You can specify the following:
      • Multi-line delimiters: Delimiter between lines.
      • Log-line delimiter size: Delimiter size between log lines.
      • Log-line delimiter match: Delimiter match between log lines.
  5. When finished, click Next.
  6. On the Extraction page, identify the file structure, timestamp, and fields for your log file.
    • Structure: This tab allows you to specify the Field Delimiter (like comma, space, tab, colon, and so on), Field Enclosure (like double quotes, single quotes, or pipe character), and Line Enclosed Within (like square brackets, parenthesis, curly brackets, pipe character, or double quotes).
    • Timestamp: This tab contains several fields that are important for mapping data correctly:
      • Timestamp Maps To: Identify the location (column header) of the timestamp in your log file and map it to this field.

        Note

        This is a required field, and best practice is to map it to column 1.

      • Timestamp Format: In most cases you do not need to change the format. The exception is when the timestamp is in a unique epoch format, in which case you should select a format code, like %H, for the format field. Refer to the Timestamp Format Codes to see all the specific format codes.
      • Enclosed Within: Select whether the timestamp is enclosed within square brackets, parenthesis, curly brackets, pipe character, or double quotes.
      • Default Time Zone: Select the default GMT time zone for the log file.
    • Fields: For field mapping, verify the log header of each specific field, like Source IP (required), Source (Src) Port, Destination IP or Destination Host or URL (required), Destination Port (required only if Destination IP or Destination Host is present), and Action, and map each one to the required field from Skope IT. For example, if the Source IP is in column 6, select column 6 from the dropdown list for the Source IP. Do this for all fields you want to map (see the sketch after this procedure).
  7. When finished, click Next.
  8. Click Test to process your log file. The results display on the tabs at the bottom of the page.
  9. Click Finish, enter a name for your custom parser, and then click Save.

Tip

Keep this Create Custom Parser window open if you want to create a Transformation Rule, which is explained in Transformation.
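
The Pre-Filters, Structure, and Fields settings described above correspond to three simple parsing operations: discard lines that match a pattern, split the remaining lines on the field delimiter, and map column positions to Skope IT fields. The Python sketch below only illustrates that flow; the function and variable names, discard pattern, delimiter, column numbers, and sample log line are hypothetical and are not product defaults.

    import csv
    import re

    # Hypothetical "Line/Header Patterns to discard" regular expression:
    # drop comment lines and repeated header rows before parsing.
    DISCARD_PATTERN = re.compile(r"^(#|timestamp,)")

    # Hypothetical Fields mapping chosen on the Extraction page.
    # Column numbers are 1-based, as in the parser UI dropdowns.
    FIELD_MAP = {
        "Source IP": 6,
        "Source Port": 7,
        "Destination Host": 3,
        "Destination Port": 4,
        "Action": 5,
    }

    def parse_line(line, delimiter=","):
        """Pre-filter one log line, split it, and map columns to named fields."""
        if DISCARD_PATTERN.search(line):
            return None  # rejected by the pre-filter
        columns = next(csv.reader([line], delimiter=delimiter))
        return {field: columns[index - 1] for field, index in FIELD_MAP.items()}

    # Example with a made-up CSV log line:
    # timestamp, device, destination host, destination port, action, source IP, source port
    print(parse_line("2015-12-25 18:15:30,fw01,example.com,443,allow,10.0.0.5,52311"))
    # {'Source IP': '10.0.0.5', 'Source Port': '52311', 'Destination Host': 'example.com',
    #  'Destination Port': '443', 'Action': 'allow'}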

Timestamp Format Codes

Format Code | Description | Example
%a | Day of the week as locale's abbreviated name | Wed
%A | Day of the week as locale's full name | Wednesday
%w | Weekday as a decimal number, where 0 is Sunday and 6 is Saturday | 3 for Wednesday
%d | Day of the month as a zero-padded decimal number | 25
%b | Month as locale's abbreviated name | Dec
%B | Month as locale's full name | December
%m | Month as a zero-padded decimal number | 12
%y | Year without the century as a zero-padded decimal number | 15 (for 2015)
%Y | Year with the century as a decimal number | 2015
%H | Hours on a 24-hour clock as a zero-padded decimal number | 18
%I | Hours on a 12-hour clock as a zero-padded decimal number | 06
%p | Locale's equivalent of AM or PM | PM
%M | Minutes as a zero-padded decimal number | 15
%S | Seconds as a zero-padded decimal number | 30
%f | Microsecond as a decimal number, zero-padded on the left | 000000
%z | UTC offset in the form +HHMM or -HHMM (empty string if the object is naive) | -0800
%Z | Time zone name, like GMT (empty string if naive) | PST
%j | Day of the year as a zero-padded decimal number | 359
%U | Week number of the year (Sunday as the first day of the week) as a zero-padded decimal number; all days in a new year preceding the first Sunday are considered to be in week 0 | 51
%W | Week number of the year (Monday as the first day of the week) as a zero-padded decimal number; all days in a new year preceding the first Monday are considered to be in week 0 | 02 or 36
%c | Locale's appropriate date and time representation | Wed Dec 25 18:15:30 2015
%x | Locale's appropriate date representation | 12/25/15
%X | Locale's appropriate time representation | 18:15:30
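
These codes follow the common strftime/strptime conventions, so one quick way to check a format string before entering it in the Timestamp Format field is to try it against a sample value, for example in Python. The timestamps and format strings below are illustrative assumptions, not product defaults.

    from datetime import datetime, timezone

    # A bracketed, web-access-style timestamp such as [25/Dec/2015:18:15:30 -0800].
    # The brackets are handled by the Enclosed Within setting, so only the inner
    # value needs a format string.
    ts = datetime.strptime("25/Dec/2015:18:15:30 -0800", "%d/%b/%Y:%H:%M:%S %z")
    print(ts.isoformat())     # 2015-12-25T18:15:30-08:00

    # A timestamp in epoch seconds needs no format codes at all;
    # it converts directly to a datetime.
    epoch = datetime.fromtimestamp(1451067330, tz=timezone.utc)
    print(epoch.isoformat())  # 2015-12-25T18:15:30+00:00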