Azure Database for PostgreSQL Blog

How to check logs for Azure Database for PostgreSQL

varun-dhawan
Sep 07, 2022


This article explains how to use the Azure portal to configure and view logs for your Azure Database for PostgreSQL server.

Note: This blog post applies to both Single Server and Flexible Server.

 

Azure Database for PostgreSQL allows you to configure and access PostgreSQL server logs. The logs are useful in various scenarios, such as identifying errors, troubleshooting suboptimal performance, and repairing configuration issues. Azure PostgreSQL logs are available on every node of a flexible server. You can ship logs to a storage account or to the Log Analytics service. This post discusses how you can configure Azure Database for PostgreSQL logs and shares methods to access these log files using Log Analytics.


 

PostgreSQL generates event logs that contain useful information. SQL query failures, failed login attempts, and deadlocks are captured in the logs by default. These error messages can help identify various application issues. For example, if you converted a legacy application from Oracle to PostgreSQL, some queries may not convert correctly to PostgreSQL syntax. These incorrectly formatted queries generate error messages in the logs, which can help identify the problematic application code.

 

In addition to the default logging, you can modify PostgreSQL logging parameters to capture information that is beneficial in identifying and solving issues such as deficient performance, and in security audits. These logging parameters help capture information such as connections and disconnections, schema modification queries, slow queries and their duration, queries taking time because they are waiting for locks, queries consuming temporary disk storage, and the backend autovacuum process consuming resources.


This log information can help troubleshoot potential performance and auditing issues while using the database. This post provides more details about enabling this logging and its benefits.


Steps to configure logging


Step 1. Enable logging server parameters

The Azure PostgreSQL server parameters page includes several logging server parameters that control the PostgreSQL logging behavior. You can configure these parameters as per your application requirements. For more information about server parameters, please refer to the When To Log and What To Log sections of the Postgres documentation.
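As a quick sketch, once connected to the server (for example with psql), you can check the current values of common logging parameters. These are standard PostgreSQL settings queried via the `pg_settings` catalog view; the exact set you care about will depend on your scenario:

```sql
-- Inspect the current values of common logging-related server parameters
SELECT name, setting
FROM pg_settings
WHERE name IN ('log_connections',
               'log_disconnections',
               'log_lock_waits',
               'log_min_duration_statement',
               'log_statement',
               'log_temp_files',
               'log_autovacuum_min_duration')
ORDER BY name;
```

On Azure Database for PostgreSQL these values are changed through the server parameters page (or the Azure CLI), not with ALTER SYSTEM.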

 

Step 2. Configure log diagnostic settings

 

Once logging is configured, the logs can be shipped to a storage account or to the Log Analytics service. This is configured in the server's diagnostic settings.
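As a sketch, the same diagnostic setting can also be created from the Azure CLI. The setting name, subscription, resource group, server, and workspace IDs below are placeholders you would substitute with your own values:

```
az monitor diagnostic-settings create \
  --name "send-postgres-logs" \
  --resource "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.DBforPostgreSQL/flexibleServers/<server>" \
  --workspace "<log-analytics-workspace-resource-id>" \
  --logs '[{"category":"PostgreSQLLogs","enabled":true}]'
```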


Step 3. View logs via Azure Log Analytics (LA)

 

The way you access the logs depends on which endpoint you choose. For this article, we will use Log Analytics.

For Log Analytics, logs are sent to the workspace you selected. The Postgres logs use the AzureDiagnostics collection mode, so they can be queried from the AzureDiagnostics table. Use the following KQL query to get started.


// Find Errors
// Search for errors in the last 6 hours.
// To create an alert for this query, click '+ New alert rule'
AzureDiagnostics
| where Resource =~ "myservername"
| where Category == "PostgreSQLLogs"
| where TimeGenerated > ago(6h)
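Building on the starter query above, you can narrow the results to error-level entries. Note that the errorLevel_s and Message column names are assumptions based on the AzureDiagnostics collection mode; verify the exact schema in your own workspace (for example with AzureDiagnostics | getschema) before relying on them:

```kusto
// Filter to error-level log entries in the last 6 hours
// (errorLevel_s / Message column names are assumptions; check your schema)
AzureDiagnostics
| where Resource =~ "myservername"
| where Category == "PostgreSQLLogs"
| where TimeGenerated > ago(6h)
| where errorLevel_s in ("ERROR", "FATAL")
| project TimeGenerated, errorLevel_s, Message
| order by TimeGenerated desc
```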


If you are interested in sending logs to Azure Storage, see the logs storage account article; for Event Hubs, see the stream Azure logs article.

Logging Considerations

Verbose logging tends to cause performance issues, especially if you log all statements, because the server does additional work to log each SQL statement. Our recommendation is to be very careful when enabling statement-logging options.


log_statement = 'off' | 'ddl' | 'mod' | 'all'


It may be tempting to set this to "all" to capture every SQL statement running on the server, but this is not always a good idea in practice. On a busy production instance logging thousands of SELECT statements per minute, this can generate a huge amount of logging information. If you need to set the parameter to one of these values, make sure you only do so for a short period of time for troubleshooting purposes, and closely monitor storage space and throughput. For a complete list of statement-level logging parameters, please refer to the Postgres documentation.
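A common, lighter-weight alternative to log_statement = 'all' is to log only statements that exceed a duration threshold via log_min_duration_statement. As a rough sketch, you can check what is currently in effect from a SQL session:

```sql
-- Current statement logging mode ('off' | 'ddl' | 'mod' | 'all')
SHOW log_statement;

-- Slow-query threshold in milliseconds (-1 means disabled); setting this to,
-- say, 1000 logs only statements slower than one second, which is usually
-- far cheaper than logging every statement
SHOW log_min_duration_statement;
```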


Summary

Azure Database for PostgreSQL logs provide useful information about database activity that can help in performance tuning and troubleshooting. While a perfect logging configuration is a matter of trial and error, what I have explained here is how you can configure logs, set up a logging destination, and query logs via Azure Log Analytics.

Your feedback and questions are welcome! You can always reach out to our team of Postgres experts at Ask Azure DB for PostgreSQL.

Updated Feb 10, 2024
Version 17.0
  • UdhayV
    Copper Contributor

    I am not able to access the below Query Store views in the Log Analytics workspace for flexible server.

    query_store.qs_view
    query_store.query_texts_view
    query_store.pgms_wait_sampling_view

     

    I already set the below parameters 

     

    pg_qs.query_capture_mode
    pgms_wait_sampling.query_capture_mode

     

    However, I am not able to access these views in a Kusto query.

     

    Appreciate your help.

     

    Regards

    UV

  • Hi UV,
    thanks for taking the time to read this post!

     

    Please note that currently, in Azure Postgres Flexible Server, Query Store is not integrated with Azure Monitor Log Analytics. This is something the team is currently working on.

    Meanwhile, you can still write SQL queries to access Query Store logs.

    SELECT * FROM  query_store.qs_view;
    SELECT * FROM  query_store.pgms_wait_sampling_view;

    Ref - https://learn.microsoft.com/en-us/azure/postgresql/flexible-server/concepts-query-store#access-query-store-information

     

    Thank you,

    Varun

  • uv
    Copper Contributor

    Thanks Varun. That means the AzureDiagnostics table categories "QueryStoreRuntimeStatistics" and "QueryStoreWaitStatistics" are not available now, so we can't use KQL. OK.

     

    Using the Query Store views that you mentioned here, how do we display the long-running queries from Query Store on an Azure dashboard? Is there a standard method that we can suggest to our customers?

     

    Appreciate your reply.

     

    Thanks

     

    Regards