Log Analytics, now part of Azure Monitor, is a log collection, search, and reporting service hosted in Microsoft Azure. Log Analytics processes data from various sources, including Azure resources, applications, and operating systems. Windows and Linux clients use the Log Analytics agent to gather performance metrics, event logs, syslog data, and custom log data. The agent can run on computers in Azure, on-premises, or in other clouds.

The Log Analytics agent can log data that applications and scripts create, called custom logs. Collecting custom logs allows for searching, reporting, and alerting beyond the default event log and syslog data.

Additionally, you can create custom fields in Log Analytics based on the data collected, including custom logs. Custom fields enable indexing and searching on discrete subsets of data.

There are a few requirements for custom logs: entries must be delimited by a new line or a timestamp, circular log files are not supported, and the log file must use ASCII or UTF-8 encoding.

Add a custom log

The first step for a custom log is to provide a sample of the log file. The sample is used for the initial configuration and must meet the requirements above. This example uses a simple script to create a log file for demonstration. The script loops through a process to generate sample error messages and log them to a file. Notice the -Encoding parameter, which changes the output from Out-File's default of UTF-16 to UTF-8.

# Create a sample log file with random error entries
$loopCount = 15
$count = 1
$type = 'MyCustomLog'
while ($count -le $loopCount) {
    $errorCode = Get-Random -Minimum 1000 -Maximum 3999
    if ($errorCode -le 1999) {
        $errorMessage = 'Working as expected'
    }
    elseif ($errorCode -le 2999) {
        $errorMessage = 'System warning'
    }
    else {
        $errorMessage = 'System Error'
    }
    # Each entry starts with a timestamp and is written on its own line
    $logEntry = "$(Get-Date -Format 'dd-MM-yyyy HH:mm:ss') Source: $type ErrorCode: $errorCode ErrorMessage: $errorMessage"
    # Out-File defaults to UTF-16; custom logs require ASCII or UTF-8 encoding
    Out-File -InputObject $logEntry -FilePath .\MyCustomLog.log -Append -Encoding utf8
    Start-Sleep -Seconds 1
    $count += 1
}
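
Each run appends entries to MyCustomLog.log similar to the line below (the timestamp and error code shown here are only illustrative):

20-05-2019 13:05:31 Source: MyCustomLog ErrorCode: 2245 ErrorMessage: System warning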

To create the custom log, go to the Log Analytics workspace, select Advanced settings, and open the Data blade. From there, select Custom Logs.

Log Analytics advanced settings

Under Custom Logs, click Add + to add a custom log.

Add a custom log

The Add Custom Log wizard opens. Browse to upload the sample file; this example uses the file created in a previous step as the sample. You can do the same or use a file created by your application.

Upload a sample log file

Click Next to get to the Delimiter screen. From here, select either New line or Timestamp as the record delimiter. If using a timestamp, it must match one of the listed timestamp formats.

Select a record delimiter

Add the log collection path. This is the Windows or Linux path that the Log Analytics agent will watch for the log data. Notice it supports wildcards for the file name.
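
For example, hypothetical collection paths (adjust them to wherever your application writes its log file) might look like this:

C:\CustomLogs\MyCustomLog*.log
/var/log/mycustomlog/*.log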

Add the log collection path

The custom log setting applies to the entire workspace. Any client connected to the workspace will send log data matching that path and file name, if the file exists on that computer.

Next, add a name and a description for the custom log. Log Analytics appends _CL to the end of each custom log name to identify it as a custom log. Click Done to finish the setup.

Finish custom log collection

The custom log now appears in the Log Analytics workspace. Modify or remove the custom log by navigating back to Advanced settings > Custom Logs in the workspace.

Edit custom logs

Log Analytics creates a dedicated container for each custom log. It can take 30 minutes or longer to provision the container, and new data won't show up in Log Analytics until provisioning completes. Also, the agent only collects new data; entries written to the log file before the custom log was created are not collected.

Search a custom log

To search the custom log, first go into the workspace Logs window to verify the custom log setup. The new log appears under Schema > Custom Logs as shown below.

Schema > Custom Logs

Enter the name of the log into the search window to search the new custom log. This will return all records from the last 24 hours.
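
For example, entering just the table name returns recent records, and a time filter narrows the window. This is a minimal sketch that assumes the custom log was named CustomLog, so its table is CustomLog_CL (matching the query shown later):

CustomLog_CL
CustomLog_CL | where TimeGenerated > ago(1h)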

Custom log search

Click on the ">" sign to expand the search results for one of the records. Notice the content of a single log record is in the "RawData" field.

RawData field

One way to find specific data in the custom log is to search the RawData field with a regular expression. For example, the query below returns all entries with "System warning" in the RawData field.

CustomLog_CL | where RawData matches regex "System warning"

Custom fields

A regular expression search works against the RawData field, but it's not efficient, especially against a large number of entries. It would be better to extract parts of the RawData field to index and search against them properly. This is what custom fields do.

At the time of this writing, it's not possible to create a custom field from the Logs search panel. You have to do it from the Logs (classic) search shown below.

Logs (classic) search

Go to Logs (classic) and search for the custom log. Notice the three dots next to the data fields. Click the dots next to RawData and then select Extract fields from 'CustomLog_CL'.

Extract fields

This brings up the Custom Fields interface. The feature was in preview at the time of writing, and errors occurred occasionally during testing; they usually cleared after waiting an hour or two and trying again.

Under Main Example, highlight the data from the RawData field to base the new custom field on. This will bring up the extraction wizard. Type in the name of the new custom field and select the data type. Click Extract when finished.

Custom Fields extraction wizard

Log Analytics uses a process called FlashExtract to predict the data to include in the new custom field. In the example below, Log Analytics correctly identified "System Error" and "System warning" but missed the "Working" in "Working as expected".

Custom field search results

To correct the "Working as expected" entry, click on the edit circle above the log entry in the search results and select Modify this highlight. This opens the Additional Examples editor under Main Example. Highlight the expected value and select Extract.

Edit custom fields

Once the summary panel shows the correct values, save the extraction to finish the new custom field. Custom fields only apply to new data; existing data will not have the custom field applied.

Search custom fields

The new custom field is now available for searching. The example below shows a search of the custom log that returns all rows with "System Error" in the new ErrorMessage_CF field.
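
That search looks similar to the following sketch, which assumes the custom field was named ErrorMessage (Log Analytics appends _CF to custom field names) and filters on the indexed field instead of running a regular expression against RawData:

CustomLog_CL | where ErrorMessage_CF == "System Error"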

Search custom fields

This example used custom fields created from the RawData field of the custom log. Custom fields are not limited to custom logs. You can create custom fields from other data fields in Log Analytics.

Conclusion

Custom logs extend log collection beyond event log and syslog data to applications and scripts that log data locally. After data collection, you can search and report on it using regular expressions. Custom fields provide for the creation of discrete fields based on larger data fields for more granular indexing, searching, and reporting.

1 Comment

    milo 5 years ago

    I wonder if anyone out there has configured the OMS Linux agent to send catalina.out logs to Log Analytics in Azure? I have a config file, but for some reason it's not working; it could be that the file format changed after the Tomcat upgrade from 8.5.32 -> 8.5.35.

    Here is my config which is failing to send info:

    <source>
      @type sudo_tail
      format none
      tag oms.api.tomcat.rota.out
      path /var/lib/tomcat/logs/catalina.out
      pos_file /var/opt/microsoft/omsagent/state/tomcat.log.rota.out.pos
      read_from_head true
      run_interval 30
    </source>

    The error I see in omsagent.log is:

    2019-05-20 13:05:31 +0000 [info]: INFO Received paths from sudo tail plugin : /var/lib/tomcat/logs/catalina.out
    2019-05-20 13:05:31 +0000 [info]: INFO Following tail of /var/lib/tomcat/logs/catalina.out
    2019-05-20 13:05:39 +0000 [warn]: Missing DataType or IPName field in record from tag 'oms.api.tomcat.rota.out'
    2019-05-20 13:05:58 +0000 [info]: Sending OMS Heartbeat succeeded at 2019-05-20T13:05:58.645Z

    Please show me how this config should look to successfully send logs to the Log Analytics workspace.

    Regards
