Introducing: Log Parser Studio
Published Mar 07 2012 01:57 PM

To download the Log Parser Studio, please see the attachment on this blog post.

Anyone who regularly uses Log Parser 2.2 knows just how useful and powerful it can be for obtaining valuable information from IIS (Internet Information Services) and other logs. Adding the power of SQL allows explicit searching of gigabytes of logs, returning only the data that is needed while filtering out the noise. The only thing missing is a great graphical user interface (GUI) to function as a front end to Log Parser, plus a ‘Query Library’ to manage all those great queries and scripts that one builds up over time.

Log Parser Studio was created to fill this need by allowing those who use Log Parser 2.2 (and even those who don’t, put off by the lack of an interface) to work faster and more efficiently, getting to the data they need with less fiddling with scripts and folders full of queries.

With Log Parser Studio (LPS for short) we can house all of our queries in a central location. We can edit and create new queries in the ‘Query Editor’ and save them for later. We can search for queries using free-text search, and we can export and import both libraries and individual queries in different formats, which allows for easy collaboration and makes it possible to maintain separate libraries for different protocols.

Processing Logs for Exchange Protocols

We all know this very well: processing logs for different Exchange protocols is a time-consuming task. In the absence of special-purpose tools, it becomes tedious for an Exchange administrator to sift through those logs and process them with Log Parser (or some other tool), especially when the output format matters, and it requires expertise in writing SQL queries. You can also use special-purpose scripts found on the web and then analyze the output to make sense of those lengthy logs. Log Parser Studio is designed primarily for quick and easy processing of logs for different Exchange protocols. Once you launch it, you’ll notice tabs for different Exchange protocols, such as Microsoft Exchange ActiveSync (MAS), Exchange Web Services (EWS), Outlook Web App (OWA/HTTP) and others. Under those tabs are dozens of SQL queries written for specific purposes (the description and other particulars of each query are also available in the main UI), which can be run with just one click!
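To give a sense of what these protocol queries look like, here is a simple illustrative sketch in the spirit of the built-in MAS queries (it is not one of the shipped queries). It counts hits per ActiveSync user agent in IIS logs; ‘[LOGFILEPATH]’ is the token LPS replaces with your chosen log files at run time:

    SELECT TOP 20 cs(User-Agent) AS DeviceType, COUNT(*) AS Hits
    FROM '[LOGFILEPATH]'
    WHERE cs-uri-stem LIKE '%Microsoft-Server-ActiveSync%'
    GROUP BY DeviceType
    ORDER BY Hits DESC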

Let’s get into the specifics of some of the cool features of Log Parser Studio.

Query Library and Management

Upon launching LPS, the first thing you will see is the Query Library, preloaded with queries. This is where we manage all of our queries. The library is always available by clicking on the Library tab. You can load a query for review or execution using several methods; the easiest is to simply select the query in the list and double-click it, and the query will open in its own Query tab. The Query Library is home base for queries: all queries maintained by LPS are stored in this library, and there are easy controls to quickly locate desired queries and mark them as favorites for quick access later.

Library Recovery

The initial library that ships with LPS is embedded in the application and created upon install. If you ever delete, corrupt or lose the library, you can easily reset back to the original by using the recover library feature (Options | Recover Library). When the library is recovered, all existing queries are deleted, so if you have custom or modified queries that you do not want to lose, export those first; after recovering the default set of queries, you can merge them back into LPS.

Import/Export

Depending on your need, the entire library or subsets of the library can be imported and exported either as the default LPS XML format or as SQL queries. For example, if you have a folder full of Log Parser SQL queries, you can import some or all of them into LPS’s library. Usually, the only thing you will need to do after the import is make a few adjustments. All LPS needs is the base SQL query and to swap out the filename references with ‘[LOGFILEPATH]’ and/or ‘[OUTFILEPATH]’ as discussed in detail in the PDF manual included with the tool (you can access it via LPS | Help | Documentation).
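As a simple illustration of that adjustment (the file name below is made up), a stand-alone Log Parser query such as:

    SELECT cs-uri-stem, COUNT(*) AS Total
    FROM 'C:\Logs\u_ex120307.log'
    GROUP BY cs-uri-stem
    ORDER BY Total DESC

becomes library-ready once the hard-coded path is swapped for the token:

    SELECT cs-uri-stem, COUNT(*) AS Total
    FROM '[LOGFILEPATH]'
    GROUP BY cs-uri-stem
    ORDER BY Total DESC

LPS then substitutes whatever files or folders you have selected in the Log File Manager when the query runs.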

Queries

Remember that a well-written, structured query makes all the difference between a successful query that returns the concise information you need and a subpar query that taxes your system, returns far more information than you actually need, and in some cases crashes the application.

The art of creating great SQL/Log Parser queries is outside the scope of this post; however, all of the queries included with LPS have been written to achieve the most concise results while returning the fewest records. Knowing what you want and how to get it with the least number of rows returned is the key!
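As a made-up example of that principle (not one of the shipped queries): instead of a SELECT * that drags every row of every log into the grid, a focused query filters and aggregates so that only the answer comes back. Assuming standard IIS W3C fields, the following returns just the ten client IPs generating the most HTTP 500 errors:

    SELECT TOP 10 c-ip, COUNT(*) AS ServerErrors
    FROM '[LOGFILEPATH]'
    WHERE sc-status = 500
    GROUP BY c-ip
    ORDER BY ServerErrors DESC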

Batch Jobs and Multithreading

You’ll find that LPS in combination with Log Parser 2.2 is a very powerful tool. However, if all you could do was run a single query at a time and wait for the results, you probably wouldn’t be making nearly as much progress as you could be. For this reason, LPS supports both batch jobs and multithreaded queries.

A batch job is simply a collection of predefined queries that can all be executed with the press of a single button. From within the Batch Manager you can remove any single query (or all of them) and execute the batch via the Execute button, or execute it by clicking the Run Multiple Queries button. Upon execution, LPS prepares and executes each query in the batch. By default, LPS sends ALL queries to Log Parser 2.2 as soon as each is prepared. This is where multithreading works in our favor. For example, if we have 50 queries set up as a batch job and execute the job, we’ll have 50 threads in the background all working with Log Parser simultaneously, leaving the user free to work with other queries. As each job finishes, the results are passed back to the grid or to CSV output, based on the query type. Even in this scenario you can continue to work with other queries: search, modify and execute. As each query completes, its thread is retired and its resources freed. These threads are managed very efficiently in the background, so there should be no issue running multiple queries at once.

Now what if we do not want the queries in the batch to run concurrently, whether for performance or other reasons? This functionality is already built into LPS’s options. Just make the change under LPS | Options | Preferences by checking the ‘Process Batch Queries in Sequence’ checkbox. When checked, the first query in the batch is executed and the next query will not begin until the first one is complete. This process continues until the last query in the batch has been executed.

Automation

In conjunction with batch jobs, automation allows batch jobs to run unattended on a schedule. For example, we can create a scheduled task that automatically runs a chosen batch job against a separate set of custom folders. This process requires two components, a folder list file (.FLD) and a batch list file (.XML), both of which are created ahead of time from within LPS. For more details on how to do that, please refer to the manual.

Charts

Many queries that return data to the Result Grid can be charted using the built-in charting feature. The basic requirements for charts are the same as for Log Parser 2.2:

  1. The first column in the grid may be any data type (string, number, etc.).
  2. The second column must be some type of number (Integer, Double, Decimal); strings are not allowed.

Keep the above requirements in mind when creating your own queries so that you consciously write the query to include a number in column two. To generate a chart, click the chart button after a query has completed. As for requirement #2, even if you forgot to structure the query that way, you can drag any numeric column and drop it into the second column position after the fact. This way, if you have multiple numeric columns, you can simply drag the one you’re interested in into the second column and generate different charts from the same data. Again, for more details on the charting feature, please refer to the manual.
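For example, a query shaped like the following (an illustrative sketch using standard Log Parser functions, not a specific library query) meets both requirements, putting a timestamp in column one and a number in column two, so the results chart directly as requests per hour:

    SELECT QUANTIZE(TO_TIMESTAMP(date, time), 3600) AS Hour, COUNT(*) AS Requests
    FROM '[LOGFILEPATH]'
    GROUP BY Hour
    ORDER BY Hour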

Keyboard Shortcuts/Commands

There are multiple keyboard shortcuts built into LPS. You can view the list anytime while using LPS by clicking LPS | Help | Keyboard Shortcuts. The currently included shortcuts are as follows:

Shortcut      What it does
CTRL+N        Start a new query.
CTRL+S        Save the active query in the library or query tab, depending on which has focus.
CTRL+Q        Open the library window.
CTRL+B        Add the selected query in the library to the batch.
ALT+B         Open the Batch Manager.
CTRL+D        Duplicate the currently active query to a new tab.
CTRL+ALT+E    Open the error log if one exists.
CTRL+E        Export the currently selected query results to CSV.
ALT+F         Add the selected query in the library to the favorites list.
CTRL+ALT+L    Open the raw library in the first available text editor.
CTRL+F5       Reload the library from disk.
F5            Execute the active query.
F2            Edit the name/description of the currently selected query in the library.
F3            Display the list of IIS fields.

Supported Input and Output types

Log Parser 2.2 has the ability to query multiple types of logs. Since LPS is a work in progress, only the most commonly used types are currently available. Additional input and output types will be added in upcoming versions or updates.

Supported Input Types

LPS fully supports W3SVC/IIS, CSV and HTTP Error logs, with basic support for all built-in Log Parser 2.2 input formats. In addition, LPS includes some custom-written input formats, such as Microsoft Exchange-specific formats, that are not available with the default Log Parser 2.2 install.

Supported Output Types

CSV and TXT are the currently supported output file types.
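For reference, a query directs its results to a file by placing an INTO clause just above the FROM line, as in this illustrative fragment (‘[OUTFILEPATH]’ is the default output token discussed earlier):

    SELECT cs-username, COUNT(*) AS Hits
    INTO '[OUTFILEPATH]HitsPerUser.CSV'
    FROM '[LOGFILEPATH]'
    GROUP BY cs-username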

Log Parser Studio - Quick Start Guide

Want to skip all the details and just run some queries right now? Start here…

The first things Log Parser Studio needs to know are where your log files are located, and where you would like queries that export their results as CSV files to save them by default.

1. Setup your default CSV output path:

a. Go to LPS | Options | Preferences | Default Output Path.

b. Browse to and select the folder you would like to use for exported results.

c. Click Apply.

d. Any queries that export CSV files will now be saved in this folder.
NOTE: If you forget to set this path before you start, the CSV files will be saved in %AppData%\Microsoft\Log Parser Studio by default, but it is recommended that you choose another location.

2. Tell LPS where the log files are by opening the Log File Manager. If you try to run a query before completing this step, LPS will prompt you to set the log path; upon clicking OK on that prompt, you are presented with the Log File Manager. Click Add Folder to add a folder, or Add File to add one or more files. When adding a folder you must still select at least one file so that LPS knows which type of log you are working with; LPS will automatically turn the selection into a wildcard (*.xxx), indicating that all matching logs in the folder will be searched.

You can easily tell which folder or files are currently being searched by examining the status bar at the bottom-right of Log Parser Studio. To see the full path, roll your mouse over the status bar.

NOTE: LPS and Log Parser handle multiple types of logs and objects that can be queried. It is important to remember that the type of log you are querying must match the query you are running. In other words, when running a query that expects IIS logs, only IIS logs should be selected in the File Manager. Failure to do this (it’s easy to forget) will result in errors or unexpected behavior when the query runs.

3. Choose a query from the library and run it:

a. Click the Library tab if it isn’t already selected.

b. Choose a query in the list and double-click it. This will open the query in its own tab.

c. Click the Run Single Query button to execute the query.

The query execution will begin in the background. Once the query has completed, there are two possible output targets: the result grid in the top half of the query tab, or a CSV file. Some queries return results to the grid, while other, more memory-intensive queries are saved to CSV.

As a general rule, queries that may return very large result sets are best sent to a CSV file for further processing in Excel. Once you have the results, there are many features for working with them. For more details, please refer to the manual.

Have fun with Log Parser Studio! And always remember: there’s a query for that!

Kary Wall
Escalation Engineer
Microsoft Exchange Support

34 Comments
Not applicable

Thanks Kary. Getting it installed asap.

Not applicable

This looks brilliant! Trying it out today :D

Not applicable

Very useful and versatile tool, thanks Kary! This will make it easier for me to go through the logs proactively and identify trends in advance for my org, so if things later sway in the other direction I'll be able to catch that. I love that there are hundreds of built-in queries; that's a massive amount of work! And there is cool charting for my CIO/CEO... hey, don't tell them. :)

Not applicable

Great .. Thanks

Not applicable

I am getting an exception when analyzing large log files:

System.ArgumentOutOfRangeException: Length cannot be less than zero.

Parameter name: length

  at System.String.InternalSubStringWithChecks(Int32 startIndex, Int32 length, Boolean fAlwaysCopy)

  at ExLPT.MainForm.queryTimer_Tick(Object sender, EventArgs e)

  at System.Windows.Forms.Timer.OnTick(EventArgs e)

  at System.Windows.Forms.Timer.TimerNativeWindow.WndProc(Message& m)

  at System.Windows.Forms.NativeWindow.Callback(IntPtr hWnd, Int32 msg, IntPtr wparam, IntPtr lparam)

Not applicable

Hi Michael,

What query are you running, and when do you see the error? Immediately, or after the query has been running for some time? Lastly, what type of logs are you querying, and what is the size of the logs?

Not applicable

Congrats to Microsoft for finally releasing a GUI for Logparser. I think MS should also release MS Logparser 3.0 with new powerful features. I also think that Log Parser Lizard (search on Google) is much better GUI for MS logparser.

Not applicable

Hi Kary,

I think my post got lost, so I'll try again.

I've tried basically every IIS query with IIS logs. The exception message box appears every time the Elapsed time is updated. When I close it, it reappears the next time Elapsed is updated. However, Elapsed does not stay at 0:00:00; it is still updated. It's as if the timer event handler is called twice and one of the calls fails. The size of the log files does not matter much. Only with very small logs (< 100 MB) do I not get the exception, but I guess that's because the query finishes before the elapsed time is updated.

Best regards,

Michael

Not applicable

Hi Michael,

Thank you very much for the information. I believe I have already identified the issue based on the call stack you provided and I am testing the fix at this very moment. Once the testing is complete, I will update the download and post a new comment here when it is ready.

Not applicable

The richness of this tool and the feature set makes logparser just so much easier. I love this tool.

Thanks Kary.

Not applicable

The issue found by Michael should be corrected and is now reflected in the download. Thanks Michael!

Not applicable

Hi Kary,

thanks for fixing this so quickly! Great tool!

I have discovered another small issue. I used the [IIS: Request per Hour] query to analyze a fairly large log file (> 1.3 GB) and noticed that for some rows the TotBytesSent column value overflows, resulting in a negative number of bytes sent. Is there a way to enable Int64 values for certain or all columns?

Best regards,

Michael

Not applicable

I get an error: "Unhandled exception has occurred in your application."

What should I do?

Not applicable

This is an awesome tool... this is going to be one of the most downloaded tools from EHLO.

It was super in the testing phase and it looks super awesome now.

Thank you

Praveen

Not applicable

Hi Herbert,

CTRL+ALT+E should bring up the error log. You can copy the last entry in the log if you like and post the details. Also make sure both Log Parser 2.2 (complete) and .NET 4.0 are installed. :)

Not applicable

Hi Michael,

I think that field is a Double by default, but I will look into it. In the meantime you can possibly work around the issue using either of the two methods below.

-> Send the output to CSV instead (which bypasses the datatype completely) by adding an INTO statement just above the FROM line:

    INTO '[OUTFILEPATH]ReqsPerHour.CSV'

The default '[OUTFILEPATH]' token can be set in Options | Preferences | Default Output Path.

-> Wrap the bytes field in a DIV() function and convert it to kilobytes:

    DIV(SUM(sc-bytes), 1024) AS TotalKBSent

You can also save any changes to the query with CTRL+S as desired.

Not applicable

Are you saying all the logs are imported into a SQL database on the back end and the queries are run against that, or you are storing the queries in a SQL database and running them live against the logs as they exist in their native format and location?

Thanks either way for inventing this tool.

Not applicable

Hi HotFix,

SQL Server and SQL databases are not involved; only SQL-style queries are passed to Log Parser 2.2. The queries in the LPS library are stored locally in XML format.

Not applicable

Looks great!  Any plans for handling SharePoint log files?

Not applicable

Awesome tool Kary!

Not applicable

Does one of the existing log file formats work with Exchange 2010 message tracking logs?

Not applicable

Hi Mike,

2010 Message Tracking Log Parsing should be in the next update.

Thanks,

Kary

Not applicable

Wow, this is a great utility. One issue I have found (or at least I can't find a way to make it work otherwise) is that there doesn't appear to be a way to have the tool use only the active event log without first adding and selecting a file in the Log File Manager. Even if I don't want to use the file, there has to be one selected. Could there be an option to run against "live" event logs without requiring a file selection?

Not applicable

@Anonymous

You got me! I found this as well. The next update will address it. For now, as long as there is a log in the Log File Manager (even an empty 123.log file) it should work. The event queries bypass the log file anyway; I just failed to skip the check when querying live event logs. :)

Not applicable

Great. Do you know when the next update will be available?

Not applicable

Awesome tool; I wasted a lot of time before stumbling upon this.

Not applicable

When I query an EXRPCLOG file, the rpc-status column returns -1 and not the real string value.

Microsoft

Thank you for this detailed article - I refer my customers to it regularly when they need assistance in parsing various sets of log file data. Kary Wall has been a tremendous help in this area. 

-Nic

Iron Contributor

Will this gem get some love as well from the Exchange Team please? :)

 

Copper Contributor

Is there a digitally signed version of this application? Our IT Security group is requiring applications be signed by the creators and I'd hate to lose access to this utility!

Copper Contributor

Amazing work, thanks so much team!

Copper Contributor

The tool is fairly useful. I only use the IIS scripts, export to PowerShell and swap the SQL for my own requirements, but there are a few bugs: divide-by-zero is one of them, a null exception is another. One says the double quotes are obsolete for the output COM object.

As exported, the script uses *.log and filemode:0 (overwrite), which works, but if you want to use a for/foreach loop, filemode needs to be changed to filemode:1 (append) and *.log changed to each log file name, plus use ([math]::ceiling($i / $fileCount) / 100) in a for loop for the progress bar, where $i is the current iteration and $fileCount is the total number of files.

Also, exporting the scripts to PowerShell and running them is incredibly slow when you use REVERSEDNS(c-ip). The queries I run can take 3 minutes, but open Visual Studio 2022, create a Windows app with a button using STARTUPINFO, and the same SQL query takes 1 second or less (including ReverseDns). So a SQL query swapped over from an LPS script into PowerShell runs very slowly; we are talking 25 seconds or more per log.

Also, if you create custom fields in IIS (X-Forwarded-For etc.) for logging, the scripts from LPS fail. If I leave an extended log in the directory and use LPS for a simple select/count, it fails; remove that file and it works again. It seems LPS needs some work.
