Tuesday, May 31, 2005

Clustering Q+A from Russ

Russ Kaufmann has posted a useful set of common Exchange clustering questions/answers over at his blog.

IT's Showtime: Exchange Cluster Best Practices

Scott Schnoll, longtime MVP and now a Technical Writer with Microsoft, gave an “Exchange Server 2003 Cluster Deployments: Best Practices” talk at ITForum 2004. The presentation is available online in full! Here’s the link: http://www.microsoft.com/uk/technet/itsshowtime/sessionh.aspx?videoid=10

(credit where it’s due: I ran across this link at Mark’s blog)

Tuesday, May 24, 2005

Public folders and more public folders

Public folders are the monthly theme for May on the Microsoft Exchange Server TechCenter. Some good reading there!

If you can't get enough of public folder material, you might also want to mark May 23 in your calendar, as Nino Bilic will be delivering an Exchange Public Folder Internals webcast that day.

New Public Folder Administration coming in Exchange 2003 Service Pack 2

From a post by Dave Whitney:

Each week, we examine the top Support Services (used to be called PSS) calls and how much it’s costing the company to service these calls. While public folder administration does not by itself make up a high percentage of these costs, a high percentage of the public folder administration issues stem from one single dialog and a complete misunderstanding of what it does.
I’m speaking of the right-click menu item off of a public folder in ESM called "Propagate Settings". What this dialog allows an admin to do is copy particular properties of the selected folder to all of the subfolders beneath it. Seems fairly self-evident, except people are led to believe that the delta changes they just made to the selected folder (such as adding a user to the ACL, or making some change to the folder’s replica list) are what will be propagated. Not so! The dialog only copies the current settings to all the subfolders. This usually results in spectacular replication storms if the admin is propping down changes to the replica list.
As a result, we’ve replaced the Propagate Settings dialog with the Manage Settings wizard. The wizard has three gross paths: 1) do things the old way (when you really do want to copy properties to all the subfolders), 2) make a delta change to the folders’ client permissions, and 3) make a delta change to the folders’ replica list. You can add, remove or replace replicas and users, and you can also modify permissions of a single user. So, say you’re interested in moving a hierarchy of folders off of one server onto some other server. You don’t know offhand which folders in that hierarchy are presently on your source server. No problem! Just use the Replace a Replica path through the wizard to replace server A with server B. It’ll only affect folders where server A is presently a replica, so the rest of the folders in that portion of the hierarchy will remain unaffected.
Another high percentage of the public folder-related calls results from admins retiring public folder stores that still have content replicas. They end up losing data either because users had posted data to that store and it never got a chance to replicate out, or because that store was the sole replica for some folders.
As a result, we’ve tightened up deleting public stores. You can no longer delete a public store that still has unreplicated data present. You must first move all the replicas to another server (or delete the folders you just don’t want or need to keep). This includes system folders too. The aforementioned wizard is helpful, but is still a pain because the admin would need to right-click each top level folder and run through the wizard. So, there’s now a new right-click menu off of the public store object itself: Move All Replicas. The admin just chooses some other server and voila! all of your replica lists are modified. Note that you can’t delete the store until everything has actually replicated away. Simply changing the replica list is insufficient - the system will confirm when all the data is actually replicated away, and this process could take a substantial amount of time.
This small handful of changes should substantially lower customer headache pain levels, and reduce our public folder support costs to nearly zero.

Warning message when installing Exchange 2003 Service Pack 1 on Windows Server 2003 Service Pack 1

My buddy Joe sent this out and I am glad to see that this error can be ignored!

Just an FYI on Windows Server 2003 SP1 with regard to Exchange 2003 SP1.

Live Meeting 2005 Add-in Pack Beta 2

With the Live Meeting Add-in Pack, you can perform conferencing tasks directly from Microsoft Office Outlook, Word, Excel, PowerPoint, Project, Visio, as well as from Windows Messenger, MSN Messenger, and Office Communicator 2005.

With the Live Meeting Add-in for Outlook, you can:

  • Schedule a Live Meeting from Outlook
  • Identify individual meeting participants as attendees or presenters
  • Send separate invitations for attendees and for presenters
  • Specify default meeting options and override those defaults for specific meetings

Live Meeting Add-in for Office Collaboration

  • With the Office Collaboration Add-in, you can start a Meet Now meeting directly from Word, Excel, PowerPoint, Visio, or Project. The document appears in an application sharing session.

Live Meeting add-in for Instant Messaging

  • If the recipient also has the Live Meeting Add-in Pack installed, you can start a Live Meeting from Windows Messenger or from MSN Messenger.

Microsoft wins email technology patent

Microsoft has won a patent that could give the company control of certain essential features found in multiple email applications.

The company was granted US patent number 6,895,426 on Tuesday. The patent covers "treating electronic mail addresses as objects".

Among other uses, Microsoft claims the technology lets email addresses be added to a contact list easily; it also covers situations in which a user copies an address to a clipboard or double-clicks to access contact information relevant to an email address.

"Email addresses are treated as objects in the message preview pane and full message windows of both incoming and outgoing email messages. A small icon is added to the text of each address. In a preferred embodiment, the icons will vary depending on the pedigree of the address," the patent, which was filed on October 17, 2000, claims.

Wednesday, May 18, 2005

DiskPart or DiskPar...

"Disk subsystem bottlenecks cause more performance problems than server-side CPU or RAM deficiencies, and a poorly designed disk subsystem can leave your organization vulnerable to hardware malfunctions."

This is so true! As memory chips become cheaper and consolidation dictates more users per server, the disk subsystem turns into the main cause of performance problems. The key to a trouble-free system is proper planning and design. The document from which I extracted the opening sentence, Optimizing Storage for Exchange Server 2003, is a good source of information.

One of the improvements you can make to your storage is to align the disks. Microsoft provides a tool, Diskpar, which lets you align partition starting offsets with disk track boundaries. For partitions created by Windows 2000 and Windows Server 2003, the default starting sector for disks that have more than 63 sectors per track is the 64th sector, causing one out of every eight blocks of data written to your disk to span two disk tracks. Diskpar can increase disk performance by as much as 20 percent, but you should always consult your hardware vendor before using this tool; some disk configurations will gain no benefit from it.
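To see where the "one out of every eight blocks" figure comes from, here is a quick back-of-the-envelope check in Python. The sector size, stripe size and I/O size are assumptions chosen to match the common case described above (512-byte sectors, a partition starting at sector 63, 64 KB array stripe units, 8 KB I/Os), not values taken from any particular array.

```python
# Sketch: verify the "1 in 8 blocks spans two stripe units" claim for an
# unaligned partition. Assumes 512-byte sectors, a partition starting at
# sector 63, 64 KB array stripe units, and 8 KB Exchange-style I/Os.
SECTOR = 512
START_OFFSET = 63 * SECTOR      # 32,256 bytes -- the misaligned default
STRIPE = 64 * 1024              # 64 KB stripe unit
IO_SIZE = 8 * 1024              # 8 KB block

def crosses_boundary(offset, size, stripe):
    """True if an I/O starting at `offset` straddles a stripe boundary."""
    return (offset % stripe) + size > stripe

split = sum(crosses_boundary(START_OFFSET + k * IO_SIZE, IO_SIZE, STRIPE)
            for k in range(8))
print(f"{split} of every 8 blocks span two stripe units")  # 1 of every 8
```

With an aligned partition (starting offset a multiple of the stripe size), the same loop reports zero split I/Os, which is the whole point of the alignment exercise.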
One fine example of vendor-provided storage information comes from HP. Just take a look at this document: HP StorageWorks Enterprise Virtual Array 5000 and Microsoft® Exchange Server 2003: storage performance and configuration — white paper.

The Diskpar utility can be found in the Windows 2000 Server Resource Kit. With the release of Windows Server 2003 SP1, diskpart now includes this functionality. The new syntax for creating partitions with diskpart is:

create partition primary [size=N] [offset=N] [ID={Byte | GUID}] [align=N] [noerr]

Of course, you can still use diskpar:

Usage: diskpar [ -i | -s ] DriveNumber
  -i: query drive layout and partition information
  -s: set partition information (only used on raw drive please)

Tuesday, May 17, 2005

Office 12 to ease lines of communication

From an article on ZDNet, it appears that Groove will be part of Office 12, adding much richer collaboration services to the next release of Office. These will include finding out whether co-workers are online and available and then sharing data with them, spanning IM services as well as Internet telephony!

"The ability to handle voice over Internet Protocol technology is "one of the core features (Microsoft is) building into the next version of Office," said Hilf, Microsoft's director of platform technology strategy. The company also plans to continue adding more server-based products to the Office family of products, he said."

Friday, May 13, 2005

Thursday, May 05, 2005

Log Parser Basics... Ok, a little more than basics.

Mark Minasi's Windows Networking Tech Page newsletter this month includes an article describing how to use the Microsoft Log Parser utility.  It is good reading and will come in handy...


Introducing Log Parser, a Tool You Must Learn

I'm too cheap to pay for one of those Web site analysis tools, but I'd like to be able to extract a few statistics from my Web logs -- in particular, it'd be great to know how many hits a particular page had, or how many of you took a moment and read this newsletter.  When I asked my friend and IIS expert Brett Hill, he got this mystical look in his eyes -- you know the way people look when they're about to tell you about the Secrets Of The Universe that they've recently discovered? -- and beatifically intoned, "Log Parser."

Now, I'd already heard about Log Parser, but I'd also heard that it was a [fill in your favorite frustration-related adjective] nightmare to understand syntax-wise.  Brett said no problem, he was going to do a talk about Log Parser at the next Windows Connections conference.  But Brett got hired away by Microsoft -- he's now an IIS 7.0 Evangelist and yes, I did mean to type "7.0" rather than "6.0" -- and so Randy Franklin Smith, a big-time security techie, stepped in.  In his presentation, Randy did just what I needed him to do:  give me a bunch of already-working examples of Log Parser syntax so I could get started.  So I've been working with it and in this article, I'll explain why you really want to learn it and then I'll explain some of its truly nightmarish syntax.

I strongly recommend that you give this article a look.  This is a very useful tool and, of course, the price is right.

What Log Parser Can Work On

Log Parser is a free command-line tool that can analyze and report on a variety of files.  I've already suggested one use, to count the number of times that this newsletter has been viewed.  But Log Parser can also analyze event logs, your Active Directory ("show me all of the managers -- that is, someone who appears in the 'manager' attribute of my user accounts -- and compute how many people each person manages"), the file system ("show me the names of the ten largest files on the hard disk"), any text file ("how many times does the word 'really' appear in this document?"), Network Monitor output, the Registry ("how many REG_DWORD values exist in my Registry?") and a number of other formats.  It will then output that data as text, a file, a datagrid, new entries in a SQL database, SYSLOG entries, XML, and so on.

Installing Log Parser and Some Sample Data

You can find Log Parser at Microsoft's downloads section.  It's a simple MSI file and so a snap to install.  Unfortunately it installs itself to a folder in Program Files without modifying the system's PATH environment variable, meaning that you've got to be in Log Parser's directory to run it or you'll get a "bad command or file name" error.  Either add Log Parser's directory to your PATH variable, or if you're lazy like me then just copy the logparser.exe file into System32.  Then open up a command line and you're ready to start parsing.

But we'll need some data to analyze.  I've simplified (and sanitized) two days' logs from my Web site and put them at


Right-click that URL from Internet Explorer and choose "Save target as..." to save the file to your hard disk.  Unzip it and you'll find two log files -- put them in a directory named c:\logfiles.  Make that your default directory ("cd \logfiles") in your command prompt window and your commands can be shorter, as you won't have to type "c:\logfiles" in your Log Parser commands.  With that, we're ready to go.

Note:  here's another strong recommendation:  grab that ZIP file, download Log Parser and try this stuff out.  If you're feeling lazy, remember you can always just copy a Log Parser line from the Web page you're reading and paste it into your command prompt.

A First Query

Let's try out just about the simplest Log Parser command possible:

logparser "select * from *.log" -i:iisw3c

Picked apart, it is the command "logparser," followed by a SQL query statement -- don't run away, I'll show you all the SQL you'll need today! -- followed by the -i option, which explains to Log Parser what kind of file it is (an IIS log file, in this case).  The SQL query is "Select * from *.log", which just means "get everything" -- the asterisk works in SQL the same as it does in DOS commands, meaning "everything" -- from all of the files with the extension "log" in the current directory. 

(Aside: this is why learning Log Parser is difficult -- you're trying to learn two new things at the same time.  Half of what you're trying to learn is Log Parser's syntax, which is ugly enough all by itself.  But every Log Parser query includes a SQL query, and if you've never written SQL queries then you'll find that they're a quite wide field of syntax to master as well.  I strongly recommend taking the time to browse through the logparser.chm Help file that installs in the same directory as Log Parser.  And let me note at this point that I'm not a SQL query expert, so I may not be approaching these problems in the best way.)

Controlling Where Log Parser Puts Its Output: rtp, FROM and Data Grids

You'll get ten lines of output and then a "press a key..." prompt.  Given that there are about 6491 lines in the two logs and assuming that you want to see every line, that'd mean you'd have to press a key about 649 times... yuck.  That's where the -rtp ("records to print?") option comes in; set it to 100 and it'll only pause every 100 lines.  Set it to -1 and you'll never see a "press a key..." again:

logparser "select * from *.log" -i:iisw3c -rtp:-1

Of course, that still takes a long time and is kinda useless in a command prompt window.  We can tell Log Parser to stuff the result into a file by adding the "INTO" option.  It goes in the SQL query before the FROM part.  This takes the output and puts it in a file called OUTPUT.TXT:

logparser "select * into output.txt from *.log" -i:iisw3c -rtp:-1

Open output.txt in Notepad and you'll see that you've got all of the info from the two logs nicely collected in output.txt.  But Log Parser can output its data in other ways as well.  In particular, it can use a built-in control that Windows 2000, XP and 2003 contain called a "data grid."  We tell Log Parser to output to something other than its "native" format (dumping all of the junk onto the screen) with the -o: option:

logparser "select * from *.log" -i:iisw3c -rtp:-1 -o:datagrid

I guess I shouldn't be surprised given that Windows is 40 or 50 million lines of code these days, but it's always interesting to learn that there's something "new" that I already owned.  Notice the "Auto Resize" button -- click it and the columns figure out how wide they should be; very nice.

Seeing What's In an Input File

Notice what's going on here -- Log Parser used spaces, tabs or commas to separate -- "parse" is the correct term -- each line into particular items.  You then see in the column headers the names of those items.  For example, the IP address of the person visiting my Web site is in the c-ip field, and the file that they viewed is in the cs-uri-stem field.  Or alternatively you can ask Log Parser about the "iisw3c" format like so:

logparser -h -i:iisw3c
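To make that field-splitting concrete, here is a rough Python sketch of what parsing an IIS W3C log amounts to. The two log lines are made up for illustration; real logs have more columns, and Log Parser itself handles many details (multiple #Fields directives, missing values, and so on) that this toy version ignores.

```python
# Sketch of the field-splitting Log Parser does on an IIS W3C log.
# The #Fields directive names the columns; data lines are space-separated.
# The log text here is a made-up two-line sample, not real data.
sample = """#Fields: date time c-ip cs-method cs-uri-stem sc-status
2005-05-01 12:00:01 194.1.2.3 GET /thismonth.htm 200
2005-05-01 12:00:05 10.0.0.9 GET /default.asp 200
"""

def parse_w3c(text):
    fields, rows = [], []
    for line in text.splitlines():
        if line.startswith("#Fields:"):
            fields = line.split()[1:]          # column names
        elif line and not line.startswith("#"):
            rows.append(dict(zip(fields, line.split())))
    return rows

rows = parse_w3c(sample)
print(rows[0]["c-ip"], rows[0]["cs-uri-stem"])  # 194.1.2.3 /thismonth.htm
```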

But what we've seen so far is really just a straight dump of the logs, no computation or analysis.  What if I just wanted to see the IP addresses of my visitors?  I'd do that by restricting the things that I SELECT:

logparser "select c-ip from c:\logfiles\*.log" -o:datagrid -i:iisw3c -rtp:-1

Doing a Little Analysis and Beautifying:  COUNT, GROUP BY and AS

Ah, a much smaller amount of data, but again no analysis.  It'd be more interesting to see how often each one visited.  Again, I modify the SQL SELECT statement. I can create a new field that reports the number of times that a given IP address appears by adding a "count(*)" variable to the SELECT statement.  COUNT does just what you'd expect it to do -- it counts records.  A super-simple example might be (note that this doesn't work, I'm just introducing the idea):

logparser "select c-ip, count(*) from *.log" -i:iisw3c -o:datagrid -rtp:-1

Now, if this did work, it'd list two columns -- each IP address and how often that IP address appears.  If something showed up 30 times you'd see it 30 times with a count of 30 next to it each time.  But, again, this doesn't work, and Log Parser says that it wants a "group by" clause.   So let's do it logparser's way and add a GROUP BY clause.

logparser "select c-ip, count(*) from *.log group by c-ip" -i:iisw3c -o:datagrid -rtp:-1

That works, and we get about 1200 entries instead of almost 6500, as the duplicates are gone.  But the data grid's column label for the c-ip count is "COUNT(ALL *)," which is not all that meaningful.  That count is the number of visits that a particular IP address made, so we'd like Log Parser to call that column something more meaningful, like for example, oh, "visits."  I can do that by adding an AS clause to the list of selected items:

logparser "select c-ip, count(*) as visits from *.log group by c-ip" -i:iisw3c -o:datagrid -rtp:-1
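If SQL isn't your thing, the GROUP BY + COUNT(*) combination may be easier to see as a plain frequency count. Here is a Python analogue using collections.Counter; the list of client IPs is invented for illustration.

```python
# "select c-ip, count(*) as visits from *.log group by c-ip" is, in
# effect, a frequency count: collapse duplicate IPs and tally each one.
# The IP list below is made up for illustration.
from collections import Counter

c_ip = ["194.1.2.3", "10.0.0.9", "194.1.2.3", "194.1.2.3", "10.0.0.9"]

visits = Counter(c_ip)   # plays the role of GROUP BY c-ip + COUNT(*)
for ip, n in visits.items():
    print(ip, n)

# most_common() is the analogue of ORDER BY visits DESC; give it a
# count argument for TOP n, e.g. most_common(5).
print(visits.most_common(1))  # [('194.1.2.3', 3)]
```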

Sorting and Shortening:  ORDER BY, TOP, DESC and HAVING

That query's output has a better column title, but the list shows the IP addresses in no particular order. It'd be nice to have it sorted by frequency, so we add the "ORDER BY" clause (this should be typed as one line although I've  broken it up so that your browser doesn't make you scroll left and right):

logparser "select c-ip, count(*) as visits from *.log group by c-ip
order by visits" -i:iisw3c -o:datagrid -rtp:-1

Neat; now it's easy to see that one IP address visited over 140 times.   But there are an awful lot of one-visit IP addresses; can we just see the top five visitors?  Sure, with the TOP clause (again, type as one line even though I've broken it for a happier browser experience):

logparser "select top 5 c-ip, count(*) as visits from *.log group by c-ip
order by visits" -i:iisw3c -o:datagrid -rtp:-1

Hmmm... that showed me the "top" five, all right, but the "top" of the data grid is the low numbers, as it starts out with the one-visit IPs and ends up with the multi-time visitors at the bottom of the data grid.  How to see the bottom five?  Well, there isn't a BOTTOM clause, but we can tell it to sort descending rather than its default, ascending, by adding the DESC clause to ORDER BY.  Then the most frequently-visiting IP addresses end up at the top of the results and TOP 5 works as we'd hoped (type as one line):

logparser "select top 5 c-ip, count(*) as visits from *.log group by c-ip
order by visits desc" -i:iisw3c -o:datagrid -rtp:-1

Or alternatively I might just want to see all of the IP addresses that have visited me more than 50 times. I can do that with the HAVING clause of the SQL query (type as one line):

logparser "select c-ip, count(*) as visits from *.log group by c-ip
having count(*) >50 order by visits desc" -i:iisw3c -o:datagrid -rtp:-1

Notice that in that case I removed the "top 5" and added "having count(*) > 50" after the "group by."  Log Parser will squawk at you if you put them in a different order.  SQL queries want to see their clauses in a particular order -- first the SELECT, then the things you're selecting, then the FROM, then the GROUP BY, then the HAVING, then the ORDER BY.  If you're using an INTO, it goes after the things you're selecting and before the FROM.  So for example if I wanted the output of my query to go to a text file named out2.txt, I'd type (yes, again please type as one line although it's broken):

logparser "select c-ip, count(*) as visits into out2.txt from *.log
group by c-ip having count(*) >50 order by visits desc" -i:iisw3c -rtp:-1

Side Trip:  More Useful SQL Syntax: WHERE, LIKE and More

As long as I'm talking about the format of the SQL SELECT statement, let's take a short side-trip from questing for the perfect query (remember, I'm trying to figure out how many people saw thismonth.htm) and look at what the SELECT statement can do in a bit more detail. 

If you have a really long, ugly SQL query then you can put it in a text file and refer to it.  For example, consider that last query -- it had a huge SELECT statement.  I could type its SQL part, "select c-ip, count(*) as visits from *.log group by c-ip having count(*) >50 order by visits desc" in a text file like so:

select c-ip,
count(*) as visits
from *.log
group by c-ip
having count(*) > 50
order by visits desc

I then save that in a file I'll call myquery.sql (but I could call it anything).  Then this command gets the job done:

logparser file:myquery.sql -i:iisw3c -o:datagrid -rtp:-1

But I've left out a couple more types of clauses that you can put into a Log Parser SQL query.    There are also the WHERE and USING clauses.  (By the way, Log Parser only supports a subset of honest-to-God SQL.  Thank goodness.  And USING seems not to be a standard SQL clause.)  The order of these clauses in a Select statement is:

  • SELECT (and the list of fields to retrieve)
  • USING
  • INTO
  • FROM
  • WHERE
  • GROUP BY
  • HAVING
  • ORDER BY

And no, I haven't defined what USING does yet, that's kind of beyond the scope of this introductory article.  I just wanted to offer one place where I presented all of the SQL clauses in order for easy reference.  Let's take up WHERE next.

By saying "select c-ip, count(c-ip) from *.log...", I told Log Parser to get the c-ip data from every single line in all of the logs in my local directory, and then do its analysis on those records.  But sometimes I want to tell Log Parser not to fetch every single record, but instead a subset.  I do that with the WHERE clause.  Suppose (I know this is a stretch but I need an example that fits into our data set) I only want to see entries where the IP address starts with "194."  This WHERE clause will do that:

logparser "select top 5 c-ip, count(*) as hits from *.log where c-ip like '194.%.%.%'
group by c-ip order by hits desc" -i:iisw3c -o:datagrid

This command includes the clause "where c-ip like '194.%.%.%'", and "LIKE" means "matches a particular pattern."  Patterns can either be particular characters, like if I'd written "cs-ip like ','", or they can use the "_" and "%" wildcards.  "_" means "match exactly one character" and "%" means "match zero or more characters."  The pattern '194.%.%.%' -- notice that LIKE patterns are always surrounded by single quotes -- would match any IP address that starts with "194", followed by a period, then anything (the first %), another period, more anything, and a final period followed by more anything.  Here are a few more LIKE pattern examples:

Pattern     Examples that would match
'Mark'      Mark -- "mark" wouldn't do it, case matters
'Mark%'     Mark Minasi, Mark77, Marky, Mark
'Mark_'     Marky, MarkM; neither Mark nor Mark Minasi would work
'Mar%k'     Mark, Maraardvark
'%'         any string at all, or even nothing

There is also a NOT LIKE command.
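For readers who think in regular expressions, the LIKE wildcards translate directly: '_' is regex '.' and '%' is regex '.*'. Here is a small Python sketch of that translation; the function name `like` is mine for illustration, not part of any library.

```python
# SQL LIKE patterns map onto regular expressions: '_' is "any one
# character" (regex '.') and '%' is "any run of characters" (regex '.*').
import re

def like(value, pattern):
    """Case-sensitive SQL LIKE match, per the pattern table above."""
    regex = "".join(".*" if ch == "%" else "." if ch == "_"
                    else re.escape(ch) for ch in pattern)
    return re.fullmatch(regex, value) is not None

print(like("Marky", "Mark_"))          # True
print(like("Mark", "Mark_"))           # False -- '_' wants exactly one char
print(like("194.8.7.6", "194.%.%.%"))  # True
```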

So we've seen that we can use WHERE to restrict the things that SELECT does its work on.  But doesn't HAVING do that as well?  Kind of, but not exactly.  WHERE restricts the data that SELECT looks at to do its analysis; HAVING restricts the results of that analysis.  If that's not clear, let's do another query that will make it clear, as well as giving me an excuse to do some more Log Parser examples.

Querying For the Most Popular Files

We've explored our most-frequently-visiting IP addresses, but recall that wasn't really what I wanted to do -- I wanted to see how often people viewed thismonth.htm.  Can you see how you'd change the query so that we're not seeing the most frequently-visiting IP addresses, but instead the most-requested files?  A look at the data grid output shows that the name of a requested file -- default.asp, thismonth.htm, or the like -- is the field "cs-uri-stem."   Form the same query as before, but replace "c-ip" with "cs-uri-stem."  Additionally, "hits" is probably a better word than "visits" and "requested-file" is more meaningful to most than "cs-uri-stem", and so we end up with this query (again broken here but should be typed as one line):

logparser "select top 5 cs-uri-stem as requested-file, count(c-ip) as hits from *.log
group by requested-file order by hits desc" -i:iisw3c -o:datagrid -rtp:-1

But let's recall that I had a particular query in mind when I got started -- how many hits did my newsletter file, thismonth.htm, get?  I could figure that out from the query that showed me the top five most-visited files, but that's kind of lame.  Instead, this query does it with a WHERE clause, reporting right to the screen (again, type as one line):

logparser "select cs-uri-stem as filename, count(*) as hits from *.log
where filename='/thismonth.htm' group by filename" -i:iisw3c

Again, notice that WHERE clause.  SQL queries fetch some subset of the fields in a database (cs-uri-stem in this case, for example) -- Log Parser didn't grab every available field in the IIS logs, just cs-uri-stem, and so it had a smaller bunch of data to work on, which presumably would make the query run faster.  So naming particular fields in the SELECT statement instead of entering * to get all of the fields reduces the number of fields to fetch before doing some kind of analysis and reporting.  In contrast, using a WHERE clause reduces the number of records fetched.  Fewer records also means less work for the query engine, which means a faster query.

Now I can offer an example where WHERE does something similar to HAVING.  We could phrase the query this way (type as one line):

logparser "select cs-uri-stem as filename, count(*) as hits from *.log
group by filename having filename='/thismonth.htm'" -i:iisw3c

In this second query, I told Log Parser to grab the cs-uri-stem data from every single record and do a bit of computation on it (count the frequency of each file).  Once it's done with that, Log Parser's got the breakdown of file name frequencies for every single file ever encountered.  Now, I don't want to see all of those file name frequencies, I just want the frequency for thismonth.htm.  That's what the HAVING clause does -- it says "Log Parser, you've got a huge hunk of data, but I only want you to show me a tiny bit of it."

I figured that the first query, the one with the WHERE clause, would be a bit more efficient, as it says to only bother computing the hit count on records about thismonth.htm, whereas the second computed hit counts on every single file mentioned in the log and then only showed thismonth.htm.  And my guess was borne out, as Log Parser reports how long it takes to do something.   And yeah, the time difference was about 0.1 second, but remember there are only two log files in our test bunch -- analyzing five and a half years' worth of logs might pay off in terms of a noticeable time difference with a more efficient query.  Of course, I could be wrong --  remember, I'm just an apprentice SQLer.  (Is that pronounced "squealer?"  I'm getting these creepy flashbacks to Ned Beatty in Deliverance for some reason.)
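The WHERE-versus-HAVING distinction can be restated in plain Python terms: WHERE filters the records before the tally, HAVING filters the tallied results afterward. A sketch over made-up data:

```python
# WHERE vs HAVING in plain Python: WHERE filters records before the
# tally; HAVING filters the tallied results afterward. Data is made up.
from collections import Counter

stems = ["/thismonth.htm", "/default.asp", "/thismonth.htm", "/news.htm"]

# WHERE-style: filter first, then count only the surviving records.
where_hits = sum(1 for s in stems if s == "/thismonth.htm")

# HAVING-style: count everything, then keep only the row we want.
all_hits = Counter(stems)
having_hits = {f: n for f, n in all_hits.items() if f == "/thismonth.htm"}

print(where_hits, having_hits)  # same answer, different amount of work
```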

I should also mention that, again, I'm just scratching the surface here, but here's an even more efficient way to tally the thismonth.htms:

logparser "select count(*) from *.log where cs-uri-stem='/thismonth.htm'" -i:iisw3c

Controlling IIS Log Dates To Query From

Now, that's all pretty neat, except for one thing:  I've been running a Web site named www.minasi.com for quite a long time.  But I've only been offering these free newsletters since 1999 and if I recall right I've only used the "thismonth.htm" file name for the past two years.  In addition, I'm really only interested in how many people have looked at this in, say, the past month.  How, then, do I tell Log Parser "do that query, but for heaven's sake don't read every IIS log I've got going back to the beginning of time; instead, only look at entries since Friday, 29 April 2005 at 7:00 PM"?

As it turns out the particular iisw3c input type has a special option designed to do just that, as I discovered by looking in the "IISW3C Input Format Parameters" page of Log Parser help.  Just add the -mindatemod parameter, followed by the earliest time that you want the log searched in yyyy-mm-dd format.  For example, to only see the "thismonth.htm" hits since 29 April 2005 I'd type (as one line)

logparser "select count(*) from *.log where cs-uri-stem='/thismonth.htm'"
-i:iisw3c -mindatemod 2005-04-29

Or, to include the time as well, add time as hh:mm:ss and put quotes around the date/time combination.  To see all the hits since 7 PM on the 29th of April 2005, I'd type (as one line)

logparser "select count(*) from *.log where cs-uri-stem='/thismonth.htm'" -i:iisw3c
-mindatemod "2005-04-29 19:00:00"

One little quirk to remember is that the log files store time in Greenwich/Universal/Zulu time.  So in my case, as I live in the Eastern time zone, I'd have to ask not for 7 PM (which is Eastern Daylight Time as I write this) but instead for 11 PM Universal, as I'm four hours behind that time zone.  In the winter I'm five hours behind UTC, so 7 PM for me would be the next day's midnight.  
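The time-zone arithmetic above is easy to get backwards, so here it is spelled out in Python with fixed UTC offsets (ignoring the messiness of real daylight-saving rules):

```python
# The local-to-UTC conversion described above, using fixed offsets:
# 7 PM Eastern Daylight Time (UTC-4) is 11 PM UTC; 7 PM Eastern
# Standard Time (UTC-5) is midnight UTC the next day.
from datetime import datetime, timedelta, timezone

EDT = timezone(timedelta(hours=-4))
EST = timezone(timedelta(hours=-5))
UTC = timezone.utc

summer = datetime(2005, 4, 29, 19, 0, tzinfo=EDT).astimezone(UTC)
winter = datetime(2005, 1, 29, 19, 0, tzinfo=EST).astimezone(UTC)
print(summer)  # 2005-04-29 23:00:00+00:00
print(winter)  # 2005-01-30 00:00:00+00:00
```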

Log Parser Output Fun

Whew, that SQL syntax stuff can be rough going, if useful.  Let's take a break and play with some more fun stuff -- a few of the ways that Log Parser can output the results of your queries.  We've seen the "native" (simple text) and data grid outputs.  Just for the sake of Log Parser weirdness, try changing the output format to "neuroview" (type as one line):

logparser "select top 5 c-ip, count(*) as visits from *.log group by c-ip
order by visits desc" -i:iisw3c -o:neuroview -looprows:1

That produces pretty much useless output... but it looks like the credits from The Matrix movies, so you'll no doubt impress the crowd at your next presentation.

If you've got Office loaded on your system, then you can do some graphing too.  Try creating a bar chart instead of a data grid (type as one line):

logparser "select top 5 c-ip, count(*) as visits into chart.gif from *.log
group by c-ip order by visits desc" -i:iisw3c -o:chart -charttype:bar3d -view:on

Again, look in Log Parser's Help for more ways to display its query results.  You might sometimes want to do a really complex query that can't be done in one SQL statement; in that case, you'd do the query in parts:  do the initial query and save those results to some kind of file (probably a comma-separated value, or CSV, file), then do a second query on that file, and so on.

Querying the File System with Log Parser

Let's try a query or two on something other than an IIS log.  Wondering how many MP3 files are on your system?  Well, we've seen all of the fields in iisw3c-type input files.  Here we'll use the "fs" input type, and we can get a listing of its fields with the logparser -h -i:fs approach, or by looking in the quite helpful Log Parser help file.  I find that, not surprisingly, there is an attribute called "name":

logparser "select count(*) from c:\* where name like '%.mp3'" -i:fs

You'll probably get a complaint that it couldn't complete the task, but that's because it can't read the System Volume Information folder -- the results are still correct.  (If you like, you can give yourself Read permissions to the System Volume Information folder, but you'll get the same results either way.  Unfortunately there is no way that I know of to say to Log Parser, "search all of C:\ except for such-and-such directory.")  Or total up how much space they're taking:

logparser "select sum(size) from c:\* where name like '%.mp3'" -i:fs
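If you're curious what those two -i:fs queries are doing under the hood, here's a rough Python equivalent -- a sketch, not Log Parser itself -- that walks a directory tree, counts the .mp3 files, and totals their size in bytes.  The demo runs against a throwaway directory of fake files:

```python
import os
import tempfile

def mp3_stats(root):
    """Count .mp3 files under root and total their size in bytes."""
    count = total = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.lower().endswith(".mp3"):
                count += 1
                total += os.path.getsize(os.path.join(dirpath, name))
    return count, total

# Demo: a temporary folder holding two fake MP3s and one other file.
with tempfile.TemporaryDirectory() as root:
    for name, data in [("a.mp3", b"x" * 10), ("b.mp3", b"y" * 5), ("c.txt", b"z")]:
        with open(os.path.join(root, name), "wb") as f:
            f.write(data)
    print(mp3_stats(root))  # (2, 15)
```

One nice difference from the Log Parser version: since you control the walk, skipping a directory like System Volume Information is just an extra `if` on the loop.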

Querying Active Directory

Here's a quick Log Parser AD example.  Suppose I want to get a list of all of the first names in the company, and how many people have each first name.  The first trick you've got to know is the AD word for "first name," which (if you read Newsletter 45) you know is "givenname."  Second, the "FROM" part takes the form 'LDAP://username:password@domainname/dc=domain,dc=com'.

So suppose I've got an administrative account called bigguy with password "swordfish" at bigfirm.com and I want to search the whole domain for user names.  The query would look like this (and it's an ugly one; again, you'd type it as one line although I split it into three on the page):

logparser "select givenname, count(givenname) as tally
from 'ldap://bigguy:swordfish@bigfirm.com/dc=bigfirm,dc=com'
group by givenname order by tally desc" -objclass:User -o:datagrid -rtp:1 -i:ADS

More Resources

This was just a start with Log Parser.  The help's got lots of examples that are worth working through, and of course you can Google it for more.  You'll also find more links about Log Parser at www.logparser.com.  Apparently Log Parser's even got a book written about it!

Sunday, May 01, 2005

Exchange Server 2003 Events and Errors

When you are looking at an event in eventvwr.exe (Event Viewer) in Windows 2003, you see the following at the bottom of every event:

For more information, see Help and Support at:

Sometimes when you click on that link you get some good information back.  Sometimes…

Where is that information stored?  Can you get to it without having to open the Event Viewer?  Sure.

TechNet has an Events and Errors Message Center where you can choose a product that the error is associated with.  Exchange 2003 has one here:

Exchange Server 2003 Events and Errors

There you can enter the Source of the error and the Event ID.  For instance, entering the source MSExchangeIS and the Event ID 5000 (shudder…) gives you a table that includes:

5000  Unable to initialize the Microsoft Exchange Information Store service. - Erro...

Click on Event ID 5000 to see the same thing you would see if you followed the link from the Event Viewer.  Don't forget to expand the Related Knowledge Base articles at the bottom.

Multiple Unwanted NOTEs Addresses

No, this is not a slam on Lotus Notes.  This one is Microsoft's fault.  (I did find Paul's recent blog post particularly hilarious, though.)

If you take a look at the following KB article you might get the idea that this fix is included in Exchange 2003 SP1. 
840668 Many Lotus Notes proxy addresses appear in the properties of a user account in a domain where Exchange Server 2003 is installed

The version of ntspxgen.dll in the article is 6.5.6980.87, and SP1 should bring everyone up to 6.5.7226.0 (at least).  It was supposed to be included in SP1, but it didn't make it for some reason.  (We did get the SMTP fix and others for the same issue into SP1.)  So, why might you need the fix?  Well, in certain circumstances the RUS can go a little crazy and start stamping objects over and over again, adding NOTES addresses again and again.  Soon you may get to a point where you have "packed pages" (Event ID 1171 with error -1026), and replication between your 5.5 environment and your Active Directory may stop.  Once you get to this point, you have to do some work to get rid of the extra proxy addresses.  See 318774 Removing duplicate and unwanted proxy addresses in Exchange for more information on how to resolve it.

If you don't have SP1 installed and need the fix, you can get the hotfix through some of the support options at http://support.microsoft.com/oas/default.aspx?&gprid=1773.  If you already have SP1 installed, request the post-SP1 fix; the pre-SP1 fix mentioned above will fail with an error during installation.


Live Communications Server 2005 Service Pack 1 for Standard and Enterprise Editions

Live Communications Server 2005 SP1 improves on the features of Live Communications Server 2005 by extending the federation model, enhancing functionality, increasing security, and improving performance and infrastructure support. These improvements include:

  • Tools to enable Public IM Connectivity: the ability to add contacts, send instant messages, and share presence information with users of the three main public IM service providers (MSN, AOL, and Yahoo!).
  • Enhanced federation, which uses DNS-SRV resolution to simplify connecting to federation partners.
  • New optional spim filters for better control of unsolicited instant messages.
  • Support for Microsoft Office Communicator 2005.
  • Support for multiple-tree Active Directory forests.
  • Improved server API performance.

You can get specific information about this update in the Microsoft Knowledge Base article (897690): Description of Live Communications Server 2005 Service Pack 1.

Download At Source

Or Download

Live Communications Server 2005 with Service Pack 1 Trial Versions