How to Create and Monitor Custom Logs with Azure Monitor
In this guide, I’ll show you how to set up custom text logs using an Azure Log Analytics workspace to streamline monitoring and analysis.
By the end of this guide, you’ll have a functional setup for collecting and utilizing custom logs, making it easier to troubleshoot, monitor performance, and gain valuable insights into your systems.
Why would you want this in the first place, you may ask? Here are some reasons:
- Control how your logs look, so they fit your needs
- Troubleshoot faster by having the right info when things go wrong
- Store logs securely
- Analyze data easily to find patterns or issues
- Track important events for audits and compliance
- Save money by only logging what you really need
One downside of using Azure Monitor is the potential cost, which can increase significantly based on the volume of logs ingested. This is because you are charged for both log storage and any alerts you configure. In an enterprise environment, Azure Monitor is often the preferred choice due to its robust features and seamless integration with other Azure services. However, for homelab or development purposes, it may be more cost-effective to explore alternatives like Grafana.
Nonetheless, this is still a great way to securely analyse your logs and build custom log patterns, both for testing and for use in production.
Let's begin.
Scenario
For my use case, I wanted to test whether I could get alerts for Chocolatey logs. The Chocolatey log file is located at "C:\ProgramData\chocolatey\logs\chocolatey.log". For the purpose of testing, I've onboarded my Windows 10 machine through Azure Arc and will be using the resource in Azure to collect the logs.
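Before wiring anything up in Azure, it's worth confirming the log file actually exists on the machine. A quick sketch:

# Confirm the Chocolatey log is where we expect it and peek at the latest entries
$logPath = "C:\ProgramData\chocolatey\logs\chocolatey.log"
if (Test-Path $logPath) {
    Get-Content $logPath -Tail 5
} else {
    Write-Warning "Chocolatey log not found at $logPath"
}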
Prerequisites
- Azure Account
- Azure CLI Installed
- Chocolatey
- Log Analytics Workspace
- A server or workstation onboarded with Azure Monitor Agent
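One note on tooling: the table-creation script in the next section uses Invoke-AzRestMethod from the Az PowerShell module rather than the Azure CLI, so make sure you're signed in with Az as well. A minimal sketch:

# One-time setup: install the Az.Accounts module, then sign in
Install-Module Az.Accounts -Scope CurrentUser
Connect-AzAccount
Set-AzContext -Subscription "<your-subscription-id>"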
Create Table in Log Analytics using PowerShell
First, let's create a PowerShell script that builds the table and its schema. The script below comes from Microsoft's documentation; I removed a couple of columns, but you can leave it as the default. There are two things you need to change: the table name "ChocoLogs_CL" and the resource path in the Invoke-AzRestMethod call (your subscription ID, resource group, and workspace name). Do not remove the appended "_CL", as this suffix is required in custom table names.
Example:
$tableParams = @'
{
    "properties": {
        "schema": {
            "name": "ChocoLogs_CL",
            "columns": [
                {
                    "name": "TimeGenerated",
                    "type": "DateTime"
                },
                {
                    "name": "RawData",
                    "type": "String"
                }
            ]
        }
    }
}
'@

Invoke-AzRestMethod -Path "/subscriptions/a5ca5e86-3b6f-44b8-a115-d4061ec25089/resourceGroups/arg-aue-plat-mgmt-logging/providers/Microsoft.OperationalInsights/workspaces/law-aue-plat-mgmt-ph3eirsh2ixc6/tables/ChocoLogs_CL?api-version=2021-12-01-preview" -Method PUT -Payload $tableParams
Run PowerShell Script
Now save the PowerShell script and run it. It should look like the below; a status code of 200 means the request succeeded.
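If you'd rather check the result programmatically than read the console output, you can capture the response object. A small sketch reusing $tableParams from above ($tablePath is a hypothetical variable holding the same table resource path used in the script):

# Capture the response and inspect the status code
$response = Invoke-AzRestMethod -Path $tablePath -Method PUT -Payload $tableParams
if ($response.StatusCode -eq 200) {
    Write-Host "Table created or updated successfully"
} else {
    Write-Warning "Request failed with status $($response.StatusCode): $($response.Content)"
}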
Validate in the Azure Portal
You can further validate this by navigating to your Log Analytics workspace, then Tables. I've created a couple for testing, as you can see below.
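If you prefer the shell over the portal, you can also list the workspace's tables with a GET against the same REST API. A minimal sketch, with the placeholders standing in for the same workspace path used earlier:

# List all custom (_CL) tables in the workspace
$workspacePath = "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.OperationalInsights/workspaces/<workspace>"
$resp = Invoke-AzRestMethod -Path "$workspacePath/tables?api-version=2021-12-01-preview" -Method GET
($resp.Content | ConvertFrom-Json).value.name | Where-Object { $_ -like "*_CL" }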
Create Data Collection Rule
Next, you'll need to create a data collection rule. What is a data collection rule? It's essentially a set of instructions that tells Azure where to collect the logs from and where to send them. I won't cover the portal steps in detail here, but you'll need a Log Analytics workspace and a resource (VM) to collect the logs from. Go ahead and create this.
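If you'd rather script this than click through the portal, the Azure CLI (which the prerequisites assume) can create a DCR from a JSON definition. A rough sketch; dcr.json is a hypothetical file you'd author with your sources, destinations, and data flows, and the command requires the monitor-control-service CLI extension:

# Add the CLI extension that provides data-collection commands (one-time)
az extension add --name monitor-control-service

# Create the DCR from a JSON definition file
az monitor data-collection rule create `
    --resource-group "<your-resource-group>" `
    --location "<your-region>" `
    --name "dcr-chocolatey-logs" `
    --rule-file "dcr.json"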
Create Custom Text Logs
Once you've created the data collection rule, you'll want to add a data source to it. Navigate to the data collection rule you've just created, then Data sources, and press Add.
Select Custom Text Logs as the source type, then modify the required fields:
- File Pattern: C:\ProgramData\chocolatey\logs\chocolatey.log
- Table Name: ChocoLogs_CL
- Transform: source
Source means the raw data is passed through as-is. This can be modified with a KQL transformation if you need the data customized; we won't be doing that here, but there's a small sketch below of what a transform can look like.
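If you did want to shape the data at ingestion time, the transform is just a KQL statement that runs over a virtual table named source. As a hedged example, this would ingest only lines that mention ERROR (assuming that string appears in Chocolatey's error entries):

source | where RawData has "ERROR"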
Destination - Ensure you fill in the Destination tab as well. Select the Log Analytics workspace you want to send the data to.
Initiate data collection
To ensure the logs are ingested into the Log Analytics workspace, trigger the process by generating logs, for example by installing something with Chocolatey, which writes to the log automatically. In my experience, ingestion can take anywhere from 3 to 15 minutes.
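For example, installing any small package will append fresh entries to chocolatey.log (7zip is just an arbitrary choice here):

# Generate new log entries by installing a package with Chocolatey
choco install 7zip -y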
Create Kusto Query
Finally, after the logs have been ingested, you can run queries against them. Navigate to the Log Analytics workspace --> Logs.
Default - query the table by calling it directly, "ChocoLogs_CL", or use a custom query like the one I've put together further below.
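For example, a quick default query to pull the ten most recent entries:

ChocoLogs_CL
| sort by TimeGenerated desc
| take 10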
Custom Query:
ChocoLogs_CL
| extend d = split(RawData, ",")
| project RawData, TimeGenerated = todatetime(d[0]), Message = substring(RawData, strlen(tostring(d[0])) + 1)
And there we have it: you've successfully configured custom logs. You can take this a step further by creating alerts on top of your queries.
Found this article useful? Why not buy Phi a coffee to show your appreciation?