This article explains how to use the Azure portal to configure and view logs for Azure Database for PostgreSQL.
Azure Database for PostgreSQL allows you to configure and access PostgreSQL server logs. The logs are useful in various scenarios, such as identifying errors, troubleshooting suboptimal performance, and repairing configuration issues. Azure PostgreSQL logs are available on every node of a Flexible Server. You can route these logs using diagnostic settings to Log Analytics, a storage account, or Event Hubs. This post discusses how you can configure Azure Database for PostgreSQL logs and shows how to access these log files using the Log Analytics service.
PostgreSQL generates event logs that contain useful information. With the default settings, SQL query failures, failed login attempts, and deadlocks are captured in the logs. These error messages can help identify various application issues. For example, if you converted a legacy application from Oracle to PostgreSQL, some queries may not convert correctly to PostgreSQL syntax. These incorrectly formatted queries generate error messages in the logs, which can help identify the problematic application code.
In addition to the default logging, you can modify PostgreSQL logging parameters to capture information that helps you identify and solve issues such as poor performance, and that supports security auditing. These logging parameters let you capture connections and disconnections, schema changes, slow queries (with their duration), queries delayed by lock waits, queries that consume temporary disk storage, and autovacuum activity.
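As an illustration, the following parameter values map to those scenarios. This is a sketch rather than a recommendation: the names are standard PostgreSQL parameters, but check which ones are exposed on your Azure server and tune the values for your workload.
log_connections = on                  # log each successful connection
log_disconnections = on               # log session end, with session duration
log_statement = 'ddl'                 # log schema (DDL) changes
log_min_duration_statement = 1000     # log statements slower than 1000 ms, with their duration
log_lock_waits = on                   # log queries delayed by lock waits
log_temp_files = 0                    # log every query that spills to temporary disk storage
log_autovacuum_min_duration = 0       # log all autovacuum activity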
Steps to configure logging in Azure PostgreSQL
Step 1. Enable logging server parameters
Azure PostgreSQL provides several server parameters that control PostgreSQL logging behavior. You can configure these parameters per your application requirements. For more information about the server parameters, refer to the When To Log and What To Log sections of the PostgreSQL documentation.
Step 2. Configure log diagnostic settings
In the diagnostic settings for your server, select the PostgreSQLLogs category and choose a destination: the logs can be shipped to a Log Analytics workspace, a storage account, or an event hub. To follow along with the next step, make sure the logs are sent to the intended Log Analytics workspace.
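Once the diagnostic setting is saved, it can take a few minutes for data to start flowing. A quick check such as the sketch below, which assumes the resource logs land in the AzureDiagnostics table (covered in the next step), confirms that the PostgreSQLLogs category is arriving in the workspace:
// Sketch: confirm PostgreSQL resource logs are arriving in the workspace
AzureDiagnostics
| where TimeGenerated > ago(1h)
| where ResourceProvider == "MICROSOFT.DBFORPOSTGRESQL"
| summarize LogCount = count() by Category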
Step 3. View logs via Azure Log Analytics (LA)
The way you access the logs depends on which destination you choose. For this article, we'll use Log Analytics.
With Log Analytics, logs are sent to the workspace you selected in the diagnostic settings. Once those settings are enabled, PostgreSQL logs become available in your Log Analytics workspace, typically in the AzureDiagnostics table under the PostgreSQLLogs category.
Use the following KQL query to get started:
// PostgreSQL server logs (last 6 hours)
// Tip: the ResourceId filter scopes results to Flexible Server resources.
AzureDiagnostics
| where TimeGenerated > ago(6h)
| where ResourceProvider == "MICROSOFT.DBFORPOSTGRESQL"
| where Category == "PostgreSQLLogs"
| where ResourceId contains "/providers/Microsoft.DBforPostgreSQL/flexibleServers/"
| project TimeGenerated, Resource, Message
| order by TimeGenerated desc
Common filters you can add (these can be combined, as in the sketch below)
- To narrow to one server, add: | where Resource =~ "myservername"
- To narrow the time window, tweak the ago() value: ago(30m), ago(24h), etc.
- To narrow to a resource group, add: | where ResourceGroup == "my-rg"
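As a sketch, here is how those filters combine into one query. The server name, resource group, and the text match on the Message column are placeholders to adapt to whatever you are investigating:
// Sketch: error messages for one server in one resource group over the last 24 hours
AzureDiagnostics
| where TimeGenerated > ago(24h)
| where ResourceProvider == "MICROSOFT.DBFORPOSTGRESQL"
| where Category == "PostgreSQLLogs"
| where ResourceGroup == "my-rg"
| where Resource =~ "myservername"
| where Message contains "error"      // placeholder text match; adjust to your scenario
| project TimeGenerated, Resource, Message
| order by TimeGenerated desc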
If you want to send logs to a storage account, see the logs storage account article. For Event Hubs, see the stream Azure logs article.
Logging Considerations
Verbose logging can cause performance issues, especially if you log all statements, because the server does additional work to write out every SQL statement it runs. Our recommendation is to be careful when enabling the statement-logging options.
log_statement = 'none' | 'ddl' | 'mod' | 'all'
It may be tempting to set this to “all” to capture every SQL statement running on the server, but in practice that is not always a good idea. On a busy production instance running thousands of SELECT statements per minute, this can generate a huge amount of logging information. If you do need to set these parameters to verbose values, do so only for a short period of time for troubleshooting purposes, and closely monitor storage space and throughput. For a complete list of statement-level logging parameters, refer to the PostgreSQL documentation.
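One way to keep an eye on that is to chart log volume over time in Log Analytics. The sketch below counts PostgreSQLLogs rows per hour and uses the standard _BilledSize column to approximate ingested size; treat it as a starting point rather than an exact billing figure:
// Sketch: hourly PostgreSQL log volume for the last day
AzureDiagnostics
| where TimeGenerated > ago(1d)
| where ResourceProvider == "MICROSOFT.DBFORPOSTGRESQL"
| where Category == "PostgreSQLLogs"
| summarize Records = count(), ApproxMB = sum(_BilledSize) / (1024.0 * 1024.0) by bin(TimeGenerated, 1h)
| order by TimeGenerated asc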
Summary
Azure Database for PostgreSQL logs provide useful information about database activity that can help with performance tuning and troubleshooting. While the ideal logging configuration is a matter of trial and error, what I have explained here is how you can configure logs, set up a logging destination, and query the logs via Azure Log Analytics.