Log Parser Studio

Log Parser Studio is a utility that allows you to search through and create reports from your IIS, Event, EXADB and other types of logs. It builds on top of Log Parser 2.2 and has a full user interface for easy creation and management of related SQL queries.

  • Hi Kary, I just downloaded the latest LPS and expected to see the advertised 170 queries?
    3 Posts | Last post June 20, 2013
    • it looks as though the version I have is dated 3/8/2012 and is version
    • Hi Greg,
      That is version 1.0 and I'm not sure how that one would be showing up. The new download should be listed as LPSV2.C.ZIP above. Here is the link below. Please let me know if you are able to download the latest version:
    • Thanks very much Kary, I did download from this page yesterday but today I got the new version:-)
      Thanks again
  • exchange 2003 support
    3 Posts | Last post June 20, 2013
    • Hi, is this tool suitable for exchange 2003 log analysis? thank you
    • Hi gian77,
You should be able to analyze most 2003 logs. If there is a particular log, query, or result you are interested in for Exchange 2003, please post here and/or on the blog below and I will get you pointed in the right direction. 
      More LPS articles here: http://blogs.technet.com/b/karywa/default.aspx?PostSortBy=MostRecent&PageIndex=1
    • To follow up on this, I posted instructions along with a sample query on how to query Exchange 2003 Message tracking logs here:
      For Exchange 2007 & Exchange 2010 message tracking logs, the EEL log format should suffice. There are some built-in 2010 message tracking queries in LPS 2.0. Just search the library for "Message Tracking" without the quotes.
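      For reference, here is a minimal sketch of a message tracking query in the EEL format. The field names (date-time, sender-address, recipient-address, message-subject, event-id) come from the standard message tracking log header and should be verified against your own log files:

      ```sql
      -- Sketch: find failed deliveries in an Exchange message tracking log.
      -- Field names follow the standard tracking-log header and may need
      -- adjusting for your environment.
      SELECT TOP 100
          date-time, sender-address, recipient-address, message-subject
      FROM '[LOGFILEPATH]'
      WHERE event-id = 'FAIL'
      ```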
  • EAS Throttling Budget Query returns no results
    3 Posts | Last post June 12, 2013
    • I think it has something to do with the fact that EAS Budget info in the iis log is URIEncoded.
If I remove the >75% conditionals, the results do not have any of the fields correctly populated.
      It looks like the entire budget string is stuffed into each field.
    • Answering it myself...the query needs to be URLUNESCAPED.  
Devs: Please contact me and I can get you the correct syntax. It is a bit too complicated to paste here.
    • Hi Steve,
      I just saw this but you and I have already communicated and corrected which is good. I just wanted to follow up officially here as well. Thanks for pointing this out as well!
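      For anyone hitting the same issue, a hedged sketch of the fix Steve describes: URL-decode the field with URLUNESCAPE() before extracting the budget data. The field name and the token position below are illustrative only, not the exact syntax Steve sent:

      ```sql
      -- Sketch: decode the URL-encoded query string first, then parse the
      -- budget data out of the decoded text; the token position is
      -- illustrative and will depend on your log contents.
      SELECT TOP 100
          EXTRACT_TOKEN(URLUNESCAPE(cs-uri-query), 1, '&') AS BudgetInfo
      FROM '[LOGFILEPATH]'
      ```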
  • Possible Bug?
    3 Posts | Last post June 12, 2013
If I try to do a simple query (SELECT TOP 500 * FROM '[LOGFILEPATH]') on Exchange SMTP Receive Protocol Logs and select EXSMTPRECLOG, it returns the error:
      Log Parser Studio
      Format Exception: one of the records returned was a different format than the rest. Some records were not returned. Run the query to CSV if you see this error again.
      Query: New Query
      If I change it to EXSMTPSENDLOG the command works.  The log formats are the same, so it isn't a blocker for me, but I just wanted you to know.
Also, is it possible to get Log Parser Studio to: 1) display the Log Parser command it's running, and 2) allow us to modify it? Or is it strictly calling the DLL, so that isn't possible?
    • Hi Mark,
That error is the same one I was explaining to Sigurd. What likely happened is that the first few rows of a column contained numbers, then further down it was text or similar. LPS attempts to calculate the data type for grid sorting; when a differing data type is encountered, the error is thrown because it isn't the same data type the affected column was originally set up as. 
The easiest workaround in the current version is to send the output to a CSV file instead, for example (the output path is illustrative):
SELECT * INTO 'C:\Output.csv' FROM '[LOGFILEPATH]'
The above bypasses all data type calculations, including that error if I remember correctly.
I just handed off the final bits a couple of days ago, so I expect the updated EHLO article and the download file (on this page) to be updated with the latest version between today and Monday. I don't know the exact day, so it could technically be later, but that's my best guess at the moment. I will update this post if anything changes.
      There were a lot of bugs fixed and features expanded but all of you who really know how to push the limits will be key in helping me iron out anything I may have missed or regressed. :)
  • Override automatic time format?
    2 Posts | Last post May 22, 2013
The Log Parser Studio GUI is nice, but unfortunately it tends to mask the seconds part by automatically formatting timestamps to what appears to be "DD.MM.yyyy HH:mm" (my local culture datetime format).
      Using TO_STRING() doesn't do a lot of good since the GUI apparently recognizes the output as a datetime and formats it. To wit:
    SELECT TOP 1
        TO_STRING(TO_TIMESTAMP(date, time), 'dd.MM.yyyy hh:mm:ss') AS Created1,
        TO_STRING(TO_TIMESTAMP(date, time), ': dd:MM.yyyy hh:mm:ss :') AS Created2
    FROM '[LOGFILEPATH]'
      Created1 = "03.05.2013 21:00"
      Created2 = ": 03:05.2013 21:00:00 :"
      Is this a bug or a feature?
    • Hi Sigurd,
      In the current version it always attempts to guess the data type of the first few records returned in order for the grid in the GUI to be sortable which is a requirement of the grid itself; that probably results in the auto formatting you are seeing by the underlying .NET grid view.
      The upcoming version allows this to be turned on/off in preferences. There are a couple of advantages/caveats:
      1. With sort enabled you can always properly sort by any column in the results. However, there is always the risk of an error if the column contains different data types. It may see integers in the first few records, then 1000 records down a string is found. 
      2. With sort disabled the above errors are avoided but the columns may not sort as expected when dealing with numbers for example. In order to build the grid for sorting it is required to "know" the data type before creating it which is where the challenge is because we can only look at the first few results to efficiently guess at what the data type will be.
What I typically do, since I tend to use LPS much of the day every day, is to keep sort disabled and just enable it as needed (or vice versa). As I said, there are a lot more features in the soon-to-be-available version, which gives the user some control over this. Just checking your test query above in the new version I get the following result:
      	Created1	          Created2
      	06.06.2011 00:00:00	: 06:06.2011 00:00:00 :
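      Note that in the output above, Created2 keeps its seconds: the surrounding colons prevent the grid from recognizing the string as a datetime and reformatting it. A sketch of that workaround for the current version (the delimiters are arbitrary and can be stripped after export):

      ```sql
      -- Sketch: wrap the formatted timestamp in markers so the grid cannot
      -- re-parse it as a datetime; the seconds then survive display.
      SELECT TOP 10
          TO_STRING(TO_TIMESTAMP(date, time), ': dd.MM.yyyy hh:mm:ss :') AS Created
      FROM '[LOGFILEPATH]'
      ```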
  • Exchange RPC Client Access Logs (EXRPCLOG)
    2 Posts | Last post May 22, 2013
    • Hello,
There is a problem in the data displayed/returned by the "EXRPC: Find all requests where rpc-stats > 0 or Failures > 0" query. Three of the fields show a value of -1 (negative one), but that is not the real data in the log files. 
The three fields that are not displaying the correct data are: rpc-status, processing-time, failures
I've tried to run Log Parser 2.2 manually but get errors about the data type of rpc-status. If I disable parsing (-dtLines:0) I can get rpc-status data with TO_STRING, but then my reformatting of date-time does not work.
      Any suggestions or fixes for working with the RPC Client Access Logs?
    • Hi Mike,
      The issue lies in the fact that the rpc-status field can sometimes return disparate data types contained within the results. TO_STRING() is typically the workaround but I'm not sure why using that on the rpc-status field would cause issues with reformatting the date-time. If you can help me understand that piece, I'll get you an answer.
      The latest version should be ready for download by the end of this month and the Exchange logs support is more robust. However, the underlying issue remains that the rpc-status field may not return expected data types which based on the contents of the log may need to be accounted for.
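      A hedged sketch of the TO_STRING() workaround: cast the fields with inconsistent types to strings up front and leave date-time unformatted, so nothing in the query depends on the guessed data type. The field names are taken from the built-in EXRPC query and may need adjusting for your logs:

      ```sql
      -- Sketch: force the mixed-type RPC Client Access fields to strings so
      -- parsing never fails; field names are assumptions to verify.
      SELECT TOP 100
          date-time,
          TO_STRING(rpc-status)      AS RpcStatus,
          TO_STRING(processing-time) AS ProcessingTime,
          TO_STRING(failures)        AS Failures
      FROM '[LOGFILEPATH]'
      ```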
  • Automation
    5 Posts | Last post April 26, 2013
    • I'm having problems with CSV generation in automated batch jobs.
      The batch jobs work fine if I run them from the GUI but if I run them from the command line (LPS.exe batch.xml files.fld) the CSV files are always opened using the associated program.
      Is the "Open CSV file when query completes" option being ignored when batch jobs run from command line or am I doing something wrong?
I also noticed the bug that Mike Celone mentioned. It's not possible to specify log files using wildcards. That would be very good to have since IIS logs change names every day.
    • "Open CSV file when query completes" means the same as double-clicking it which will open in the default CSV application. The biggest reason for running a query to CSV is to avoid very large recordsets being loaded in LPS. Since Log Parser 2.2 is 32bit, LPS must run 32bit which leaves the possiblity of running out of memory when loading very large result sets.
      However, you could achieve similar behavior by not running it to CSV (assuming the result isn't too large) then manually export it as CSV from within LPS.
    • Fixing the batch jobs using wildcards with folders as we speak. :)
    • The reason I wanted to generate a CSV was that I wanted to have a report generated and emailed automatically to me on a regular basis.
      It doesn't seem possible because LPS.exe launches the default CSV application and waits for it to exit. That kind of blocks further execution of my script.
      It would be nice if it didn't launch the associated program after generating the CSV. After all I have selected the option not to do so...
    • Forgive me for not seeing that you acknowledged this problem in another thread last week. Never mind, I'll wait for the new version then.
  • New Version Release Date
    4 Posts | Last post April 18, 2013
Was wondering when the new version will be released to correct the batch problem. I'm also seeing an issue in the current version where the "do not open CSV after running a query" option is not honored. I have cleared the check box but it still prompts for each query being run.
    • Hi Wayne,
      The new version is essentially done (including a fix for the batch problem) and I'm working hard to get it public as soon as we can. I'm not sure I understand the CSV export issue. What is the prompt you are seeing? That option should either open the CSV in the default viewer or do nothing assuming the query contains an INTO statement to a CSV file. 
      I can't think of a prompt that would occur unless there isn't a default viewer for CSV, however, I would expect unchecking the auto-open CSV option in preferences would work in the current version either way. I'll test today and confirm it works as intended.
    • Thanks for the update on the new version.  
The CSV issue is that I have unchecked the option to open the CSV after the query finishes, but that option is ignored if a new query tab is created. So when I go back into Preferences it is unchecked, yet LPS still prompts to open a CSV file. I get this on default MAS queries as well as custom queries, as if it is not reading the preference settings.
    • Thanks for pointing this out Wayne; I just confirmed the issue. I'll make sure this isn't the case in the new release.
  • MS APP .evt file EventIDs output to .csv wrong format
    3 Posts | Last post April 11, 2013
    • If I use LogParser to convert a MS APP .evt file to .csv format, the EventIDs are numbers, e.g. 17177, 1, 58063.  If I run the same file through Log Parser Studio, the corresponding EventIDs to the three I listed are:
1.07E+09, 1.07E+09 and 2.68E+09. Is there some way to convert them to usable numbers?
    • Hi Ellen,
In some cases the underlying event ID is different than what Event Viewer displays, and this is what is returned to LPS from LP 2.2. When this is the case, I think you can use a bitmask function in the query to translate to the correct event ID. There is an existing built-in query, if I remember, that does this conversion as part of the query; without that conversion you may see what you are experiencing. I'll see if I can find that conversion for you and post back how you might use it.
    • Hi Ellen,
      Following up as promised. See the following example query that finds the top 1000 Errors/Warnings in the application event log. Pay special attention to the third line that demonstrates how to convert the underlying event ID into the expected event ID. This should work for most events though I have seen the occasional corner case where it doesn't:
SELECT TOP 1000
TimeGenerated,
BIT_AND(EventID, 0x3fffffff) AS EventID,
EventTypeName AS Name,
SourceName AS Source,
Strings, Message, Data
FROM '[LOGFILEPATH]'
WHERE EventType = 1 OR EventType = 2
ORDER BY TimeGenerated DESC
LPS doesn't recognize Log Parser 2.2 in local path
    2 Posts | Last post April 11, 2013
I'm running LPS on a server to which I do not have admin access. I have downloaded Log Parser 2.2 and put all the files in the folder C:\logs, the same folder where I store the LPS files.
But when I try to start LPS I get a message that I need to download Log Parser 2.2... Why can't LPS find the Log Parser exe/dll files in the same directory it was started from?
    • Hi Joakim,
If I understand the question, you need to either run the installer for Log Parser 2.2 and choose "Complete Install", or place logparser.dll in the directory of your choice and manually register the DLL with regsvr32 (for example, regsvr32 logparser.dll from an elevated command prompt in that directory). logparser.dll is a COM DLL and needs to be registered on the system Log Parser Studio is running on.