Matthew Johnson

Logan Square / Chicago

44 / Male

Bulk Firewall Log Analysis

We're enabling more restrictive Windows Firewall rules, so we want to be able to quickly analyze the firewall logs to see what is being blocked. Logging is configured to include only dropped packets, and our rules are focused on inbound blocks. We see a decent amount of domain traffic being dropped, as expected. We know it's out there and we don't need it, so we'd rather not have it in the logs when we're looking for things we actually care about.

To that end, I put together a simple script that looks at the local log, filters out the known noise, and saves a copy to a network location. We have this script in ConfigMgr, so we can easily run it on a whole collection of machines at once, specifically our pilot machines, to make sure there isn't traffic being dropped that we need to create allow rules for.

#Filter out expected noise by port/protocol; anything left over is worth a look
$Firewall = @()
(Get-Content "C:\Windows\System32\LogFiles\Firewall\pfirewall.log") -replace "`0", "" | ForEach-Object {
        if ($_ -match '^.+ 5353 .+$')  {<#Multicast DNS#>}
    elseif ($_ -match '^.+ 137 .+$')   {<#NETBIOS#>}
    elseif ($_ -match '^.+ 5355 .+$')  {<#LLMNR#>}
    elseif ($_ -match '^.+ 123 .+$')   {<#NTP#>}
    elseif ($_ -match '^.+ 1900 .+$')  {<#UPnP#>}
    elseif ($_ -match '^.+ 3702 .+$')  {<#WS-Discovery#>}
    elseif ($_ -match '^.+ 5985 .+$')  {<#WinRM#>}
    elseif ($_ -match '^.+ :: .+$')    {<#IPv6#>}
    elseif ($_ -match '^.+ SEND$')     {<#Send#>}
    else   {$Firewall += $_}
}
$Firewall | Set-Content "\\server\share$\Firewall\$($env:computername).log"

This script could undoubtedly be cleaner or better, but I wanted to make it easy to edit. There are also other ways to do this; we capture these events to the security log and use Windows Event Forwarding to put it into a SIEM, but I find the event logs are more difficult to parse than the simple text of the firewall log file.
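
For comparison, pulling the same drops out of the Security log looks roughly like this. A hedged sketch: event ID 5152 is the Windows Filtering Platform packet-drop event, and it only appears if "Filtering Platform Packet Drop" auditing is enabled.

#WFP packet-drop events from the Security log (requires rights to read Security)
Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 5152 } -MaxEvents 25 |
    Select-Object TimeCreated, Message

Each event's Message is still a multi-line blob you have to parse, which is exactly why I prefer the flat text log for this job.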

With logs filtered and collected from all of the pilot machines, I then made a variant of this same script that combined all of the logs into a single file, further filtering out what we knew we could ignore. I edited the resulting file in Notepad++, checked a given TCP/UDP port against the web or ran nslookup on a given server, and in some cases identified the local process listening on a port with the following:

Get-NetTCPConnection | Where-Object {$_.State -eq 'Listen' -and $_.LocalAddress -ne '127.0.0.1'} | Select-Object LocalPort, @{Name='Process'; Expression={(Get-Process -Id $_.OwningProcess).ProcessName}}
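
As for the combining variant mentioned above, it boiled down to something like the following. This is a hypothetical sketch with placeholder paths and a trimmed filter list, not the exact script:

#Read every collected log from the share, filter further, and write one combined file
$Combined = Get-ChildItem '\\server\share$\Firewall\*.log' | ForEach-Object {
    Get-Content $_.FullName | Where-Object { $_ -notmatch ' (5353|137|5355|123) ' }
}
$Combined | Set-Content '\\server\share$\Firewall\Combined.log'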

I did find/replace passes in Notepad++ until I'd accounted for and eliminated every line of the combined log. There were some dynamic-port-to-dynamic-port lines I couldn't fully explain, but we did a survey of users and no one reported any issues, so we felt confident moving the new rules to production.
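
If you're doing the same triage, a quick tally of what's left by destination port helps prioritize the find/replace passes. A sketch, assuming the default pfirewall.log field order (date time action protocol src-ip dst-ip src-port dst-port ...):

#Count remaining drops by destination port; the [7] index assumes the default layout
Get-Content '\\server\share$\Firewall\Combined.log' |
    ForEach-Object { ($_ -split ' ')[7] } |
    Group-Object | Sort-Object Count -Descending | Select-Object -First 20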

Having the script available in ConfigMgr will make it simple to grab results from computers if anyone reports issues.

BitLocker Recovery Screen Analysis

I've been trying to convert the organization from McAfee Drive Encryption to BitLocker on Windows 10 Enterprise. This process has been way more involved than I expected. For example, we ran into an issue where we converted a bunch of desktops over the weekend (Sunday morning), and everything went fine. Then, Tuesday morning, the calls start coming in that computers are at the BitLocker recovery screen. We narrow it down to a subset of one model of Lenovo desktop. It takes some digging, but we manage to isolate the problem to the fact that Lenovo has three different boot orders (Primary, Automatic, and Error). We'd been setting the primary boot order to hard drive first via WMI to prevent this problem, but here we were.

As it turned out, the systems that had a problem were shut down every night by the users. Sunday morning these systems did a Wake on LAN and ran through the conversion just fine, but the Wake on LAN uses the "Automatic" boot sequence, which was Network first. That means BitLocker stored the TPM values for that boot order. When the user shut down Monday evening and started back up on Tuesday morning, they were now on Primary sequence, which didn't match, thus the BitLocker Recovery screen. We suspended BitLocker, set all three boot sequences to Hard Drive first, and then it resumed after a restart.
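
For anyone hitting the same thing, the remediation sequence looks roughly like this. A minimal sketch, assuming the BitLocker PowerShell module that ships with Windows 8 and later; the boot-order change itself is left to whatever vendor tooling you already use:

#Suspend protection for exactly one restart so the boot-order change doesn't trigger recovery
Suspend-BitLocker -MountPoint 'C:' -RebootCount 1
#...set all three Lenovo boot sequences (Primary, Automatic, Error) to hard drive first
#here via your vendor tooling, then restart; BitLocker resumes automatically and
#reseals to the new PCR values.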

Going through this process, though, I needed to analyze when the PCR values in the TPM were changing. There's a tool called TBSLogGenerator you can use to dump the current state, but BitLocker actually stores the PCR values it seals to the TPM in the Event Log. I wrote a PowerShell script to scrape the log and show me which PCR values changed, and when. I haven't seen anything else that does this, especially because the PCR string in the log isn't broken out into the individual PCR values. One caveat: we use the default set of PCR values, so if you've changed that in policy, you'll want to edit the list of PCRs in the script to keep the numbering accurate. BitLocker only logs the PCRs it seals to (something I had to discover for myself).

$List = @('computername', 'anothercomputer', 'etc')
#Default PCR set; edit to match your policy (BitLocker only logs the PCRs it seals to)
$arrPCRs = @('[00]', '[02]', '[04]', '[08]', '[09]', '[10]', '[11]')
$List | ForEach-Object {
    Write-Host ''
    Write-Host $_
    try {
        #Event 817 records the PCR values BitLocker sealed to the TPM
        $Events = Get-WinEvent -ComputerName $_ -FilterHashTable @{ LogName = 'Microsoft-Windows-BitLocker/BitLocker Management'; ID = 817 }
        $Results = @()
        $Dates = @()
        For ($i=0; $i -lt $Events.Count; $i++) {
            #Property 3 holds the sealed PCR blob; strip the dashes and the 04001400 header
            $Results += [System.BitConverter]::ToString($Events[$i].Properties[3].Value).Replace('-','').Replace('04001400','')
            $Dates += $Events[$i].TimeCreated
        }
        $Previous = 0
        For ($a=0; $a -lt $Results.Count; $a++) {
            $Current = $Results[$a]
            if ($Previous -ne 0) {
                if ($Current -eq $Previous) {
                    Write-Host "Match  $($Dates[$a-1])                    $($Dates[$a])"
                } else {
                    Write-Host "Change $($Dates[$a-1])                    $($Dates[$a])"
                }
                #Split the blob evenly across the PCRs in the list and compare each one
                $PCRLen = $Current.Length / $arrPCRs.Count
                For ($n=0; $n -lt $arrPCRs.Count; $n++) {
                    $CurrentPCR = $Current.Substring($n * $PCRLen, $PCRLen)
                    $PreviousPCR = $Previous.Substring($n * $PCRLen, $PCRLen)
                    $PCR = $arrPCRs[$n]
                    Write-Host "$PCR $PreviousPCR $CurrentPCR"
                }
            }
            $Previous = $Current
        }
    } catch {
        Write-Host "Unable to connect"
    }
}

It's not my best code, as I was in quite a hurry with tickets coming in, but I wanted to share it with anyone else struggling through this process. Hope it helps!

Configuration Manager and 32-bit Programs

This is one of those posts that mostly exists to remind me that this problem exists and how to fix it. In Configuration Manager (2012 R2 SP1, et al), Programs (as opposed to Applications) are always executed as 32-bit. If you want to execute a program as 64-bit, you need to launch it through SysNative from a command prompt.

In my most recent case, I found myself banging my head because my script worked every time I tested it outside of Config Manager but never within, no matter what options I selected. I finally figured out this was because it was running 32-bit. The easiest way I've found to make it run 64-bit is to edit the command line of the program thusly:

cmd /c %windir%\sysnative\WindowsPowerShell\v1.0\powershell.exe -File DiskCleanup.ps1

By calling CMD.exe, which will launch 32-bit, you get access to SysNative, the alias for the real 64-bit System32 directory. SysNative isn't a real directory available to the Configuration Manager client, so you can't just set the working directory or do something easy like that.
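
If you're ever unsure which bitness a script actually landed in, these two standard .NET properties settle it quickly (nothing ConfigMgr-specific here):

#A 32-bit process on a 64-bit OS is the WOW64 case that bites here
"64-bit OS:      $([Environment]::Is64BitOperatingSystem)"
"64-bit process: $([Environment]::Is64BitProcess)"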

While I was running a PowerShell script in this instance, it could have been anything, including the Disk Cleanup utility itself, which was my actual problem. Apparently, if it runs 32-bit you don't get access to the extra System options, including Windows Update Cleanup, which can clear several gigs of crap out of your WinSxS directory (after a restart). So, kind of stupid, but simple enough, and it works.
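
If you'd rather script the cleanup itself than click through the UI, the usual route is cleanmgr's sageset/sagerun mechanism. A hedged sketch; "Update Cleanup" is the standard VolumeCaches key name for this handler, but verify it on your build before deploying:

#Pre-select only Update Cleanup for sagerun index 64, then run it silently
$key = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer\VolumeCaches\Update Cleanup'
Set-ItemProperty -Path $key -Name 'StateFlags0064' -Value 2 -Type DWord
Start-Process -FilePath cleanmgr.exe -ArgumentList '/sagerun:64' -Wait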

Update!

I just wanted to mention that if you haven't been running Disk Cleanup on Windows Updates regularly, the first time you do, the next boot into Windows can take a long time. Some of our systems had over 200 superseded updates to clean up, which translated into a number of calls to the Service Desk because of how long Windows sat at "Configuring Updates" afterwards.

Adding External Data to Configuration Manager Device Records

We use Remedy to manage assets like computers. It gives us some useful information about computers, like who they're assigned to, where they sit, what department they're in, not to mention information about servers like when they should be patched and how.

In Altiris it was pretty simple to write a SQL query that looked at a view of this Remedy data and use that for collections or filters. We could target all Marketing computers that don't have software XYZ installed, and push to them. To be fair, this information could also come from AD in theory, but the AD import doesn't work very well in Altiris, because it's Altiris. In any case, we keep the information in Remedy, so that's where we get it from.

Moving to Configuration Manager, we wanted the same functionality, but collections are based on WQL queries rather than SQL, so it's not so simple to join in an external view. Instead, I worked out a method using DDRs (Discovery Data Records). As records in Remedy are updated, a timestamp is modified. The script keeps a registry value of its last successful run time, queries for any Remedy changes newer than that, and adds each result as custom fields on the device record.

I set up the script as a scheduled task and things seemed to be working great. However, we noticed that existing records were often not being updated. As things were changed in Remedy, the script would pull those changes and re-submit the records, but they weren't making it into Configuration Manager. As I went through the logs, I came across a message stating that the timestamp of the DDR was older than the existing record. That didn't make much sense, but as I looked at the timestamps, I realized the DDR was supplying a 12-hour timestamp where a 24-hour timestamp was expected. Records submitted at 12:02 AM were interpreted as afternoon, and everything after that looked earlier.

Because the script was only submitting records that changed, many records would never get updated (unless they happened to be submitted right after midnight). The script didn't set the time; it was part of the DDR object, and I couldn't find any way to define it myself. Instead, I had to write the DDR to a file and then modify the file with the correct timestamp before dumping it into the Site Server inbox. Once I'd done that, everything worked as expected.

So here's what that looks like. I've shaved the script down to submit only a couple of extra fields, but you can add more as you like. This assumes your data source has timestamps; I asked our DBAs to set that up for me. Otherwise you're doing a complete dump every time, and since I run this every 15 minutes, that wasn't an option. If you do wind up doing a complete dump once a day, you might as well skip the DDR file editing and schedule it for right after midnight.

$Logfile = "C:\AssetScript\UpdateAssetData.log"
$Server = "YourSQLServer"
$Database = "AssetDB"

Function LogWrite
{
   Param ([string]$logstring)
   Add-Content $Logfile -Value $logstring
}

#Get last run time (stored by the previous successful run)
$lastruntime = (Get-ItemProperty -Path HKLM:\Software\AssetScript).LastUpdate

#Build the query after $lastruntime is read so it actually gets interpolated
$Query = "SELECT Asset_Name, Asset_Tag, LastProcessedDate FROM AssetTableorView WHERE [LastProcessedDate] > '$lastruntime'"

#Load the DDR message classes from the ConfigMgr SDK's Messaging DLL
Add-Type -Path 'C:\Program Files (x86)\Microsoft System Center 2012 R2 Configuration Manager SDK\Redistributables\Microsoft.ConfigurationManagement.Messaging.dll'

$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Server = $Server; Database = $Database; Integrated Security = True"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = $Query
$SqlCmd.Connection = $SqlConnection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $SqlCmd
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet) > $null
$SqlConnection.Close()
$results = $DataSet.Tables[0]

#Process results
$i = 0
foreach ($row in $results) {
    
    $Name = $row.Asset_Name
    $Updated = $row.LastProcessedDate
    LogWrite "$Name   $Updated"
 
    #initialize a new instance of the Object
    $DDRFile = New-Object -TypeName  Microsoft.ConfigurationManagement.Messaging.Messages.Server.DiscoveryDataRecordFile("AssetInfo")
     
    #add the Key Property (Netbios Name is the Key Property for the System Architecture)
    $DDRFile.AddStringProperty("Netbios Name", 'key', 64, $Name)
    
    #Add custom asset fields
    $DDRFile.AddStringProperty("Asset_Updated", 'None', 64, $Updated)
    $DDRFile.AddStringProperty("Asset_Tag", 'None', 64, $row.Asset_Tag)
     
    #After populating the Properties write to a temporary folder
    $DDRFile.SerializeToFile("C:\AssetScript\DDR\")
    $i++
}

$filetimestamp = Get-Date -format "MM-dd-yyyy HH:mm:ss"
$replacestring = "AGENTINFO<><" + $filetimestamp + ">"

#Process each file in the temporary folder to set the correct timestamp and move to DDM inbox
Get-ChildItem "C:\AssetScript\DDR\" -Filter *.DDR | `
Foreach-Object{
    $content = Get-Content $_.FullName
    $content -replace "AGENTINFO<><[0-9/ :]+>", $replacestring | Set-Content $_.FullName
    Move-Item $_.FullName "C:\Program Files\Microsoft Configuration Manager\inboxes\ddm.box"
}

#Update last run time if there was at least one record processed
if ($i -gt 0) {
    $timestamp = Get-Date -format "yyyy-MM-dd HH:mm:ss"
    set-itemproperty -Path HKLM:Software\AssetScript -Name "LastUpdate" -value $timestamp
    LogWrite "$timestamp    Processed $i records"
}
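
For completeness, registering the script as a scheduled task can be as simple as this (hypothetical task name and script path; adjust the account and interval to your environment):

schtasks /Create /TN "UpdateAssetData" /TR "powershell.exe -ExecutionPolicy Bypass -File C:\AssetScript\UpdateAssetData.ps1" /SC MINUTE /MO 15 /RU SYSTEM

The task needs to run on the site server (the script writes straight into the ddm.box inbox) under an account that can also read the SQL view.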

Suppressing Software Update Prompts in Configuration Manager 2012 R2

I don't know how you do things, but around here the users don't get interrupted while they are working if at all possible. Visible software deployments have been limited to quarterly at most and software updates need to happen behind the scenes. Computers are automatically rebooted once a week on Sunday morning, so if a software update needs a reboot, that's when it will happen. We have a lot of laptop users, so we can't exclusively push updates at night.

So, Software Updates in Configuration Manager can be set to run at night when possible, but they'll wind up running during the day for a lot of people, and those people can't be prompted to reboot when the updates are done. Despite hiding all notifications and suppressing reboots, Configuration Manager 2012 still displays a balloon to the user when updates have finished installing, asking them to restart. If they don't click on the balloon it goes away, so you'd think this wasn't a big deal, but it is. It needs to be suppressed.

Since there's no way in the agent settings or the deployment settings to make that happen, we look at the alternatives. We don't want to suppress all agent notifications because we want to prompt them about new software availability (again, quarterly at most). So how to make just software update prompts go away? The answer is, of course, a script. It turns out to be pretty simple. You could use PowerShell, but we rely on VBScript, so that's what we've got here.

On Error Resume Next
'Valid user experience values for SetAllUpdatesUserExperience
Const DEFAULT = 0
Const INTERACTIVE = 1
Const QUIET = 2
Set objCCM = GetObject("winmgmts:ROOT\ccm\ClientSDK")
Set objSUM = objCCM.Get("CCM_SoftwareUpdatesManager")
objSUM.SetAllUpdatesUserExperience QUIET

That's it. Run it once on each client, and from then on all notifications from Software Updates are disabled.
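
And if you'd rather do it from PowerShell, the same WMI method should be callable like this (an untested sketch; 2 is the QUIET value from the constants above):

#Invoke the static method on the class; 2 = quiet (no Software Updates notifications)
Invoke-WmiMethod -Namespace 'root\ccm\ClientSDK' -Class 'CCM_SoftwareUpdatesManager' -Name 'SetAllUpdatesUserExperience' -ArgumentList 2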

Economics

I am not an economist. Perhaps if I were, I would understand why the current monetary policy represents anything other than an illusion. We're told that the economy is doing better, but that really seems to mean that the stock market is up (way up, actually).

Official unemployment numbers are down, but the actual percentage of employed people basically never recovered (see the Labor Force Statistics from the Current Population Survey).

The Federal Reserve has quadrupled the amount of liquid money over the last five years. As I understand it, if the amount of money goes up, the relative value of a dollar goes down. It makes sense, then, that the market would go up, as each company's stock is worth more dollars, all else being equal (see the United States Money Supply).

Fortunately, prices have not quadrupled, but that makes sense, since those dollars haven't found their way into the average person's pocket. They're in the hands of banks, the wealthy, and foreign governments. So the real-world costs of things have stayed fairly flat, giving the appearance that there's no problem with creating so much extra money from nothing.

The question in my mind is, where does this lead? Even if we believe that "quantitative easing" will "taper" and the creation of new money drops off, is there a long-term effect to this? Will the taper itself precipitate reactions from institutions and markets that have grown accustomed to it?

The dilemma is complex. Investing in the market is one of the few (only?) ways a smaller investor can capitalize on the new money and potentially secure some of it for themselves. If the market is inherently unstable, though, how much risk that instability represents determines whether it's a good investment. Is there a way to know how likely a crash is, how soon it might happen, and how severe it might be?

The first question is easy: a crash is 100% certain. The stock market has always had crashes, and nothing fundamental has changed that would prevent one in the future. That leaves how soon and how severe. How severe is beyond the realm of the knowable, but historically crashes have been in the 50-90% range, usually playing out over the course of a couple of months. Even on the low end, that's significant enough to completely undo years of gains.

The final question is how soon, and this is the real trick. People have been predicting crashes badly for as long as crashes have been happening, and I bring no special skills to the table. There have been several articles in the last several months worrying about the market getting too high, though it seems we just had a crash, so we should be OK for a little while, right?

If the market is currently returning an astonishing 20%, but carries a high risk of a 50%-or-greater loss in the next two years, it's safe to assume that trying to capture every last gain runs an unhealthy risk of loss, and it's safer at this point to withdraw to more stable investments. Once the market crashes, it's a good idea to wait a couple of months before investing in the market again.

I've officially moved my website here, not that this is anything significant. I think of this page as being a bit more "grown up" since I'm ditching pseudonyms and even posting about work related stuff.

If anything, though, I want this to be a bit more casual. If I keep track of events in my life the way I did with my old Journal, it will be private. I don't need to broadcast that; I just wanted a record somewhere. Mostly I hope to post things that are interesting and useful to other people, but not the sort of content I would put on Facebook.

If nothing else, it gives me a way to keep doing a bit of web development so I stay somewhat up to date, and a place to express myself and share ideas that don't fit in the ballpark of 140 characters.

Profiting from Owning Real Estate

We own a house in Chicago that we bought about three years ago and are fixing up. We also still own the condo I bought when I moved to Chicago back in 2005; when we were buying our house, we couldn't sell it for anywhere near what we owed on the mortgage. We've been renting it out since then, getting basically enough to cover the mortgage, taxes, assessment, and insurance. Fortunately we've had good tenants who pay every month and don't trash the place.

Since the market has been slowly recovering I wanted to figure out if it made sense to sell it or not. On the one hand, renting it is covering our costs and paying off the mortgage. However, there are a number of risks including the adjustable rate of the mortgage, the reliability of rent, the possibility of special assessments, a rise in property taxes, and potential repair and replacement costs in the next few years of things like the appliances. Plus if we sold it, we'd get some money out of the sale.

I decided I wanted to understand how much owning the condo had cost me to date, in total. Rather than simply subtracting the purchase price from a sale price (less commission and mortgage balance), I added up everything I had spent since I bought it: down payment, mortgage payments, assessments, taxes, and insurance, minus the rent we'd collected over the last few years. The number was shocking. Basically, I've spent as much as the entire mortgage over the years. Granted, most of that was while I lived there, but if we're supposing that buying is a better value than renting, we should consider whether that is really the case.

If I had rented an apartment for six-and-a-half years (as long as I lived there) paying what I'm renting the condo for, I would have spent less than if I sold the condo today. So much for buying being like "paying yourself" and living rent free. I'll grant this was over particularly bad years for owning real estate, but it's still painful considering the price has largely recovered.

But! All is not lost. If I continue to rent it, and that rent continues to cover costs, then even allowing for the hypothetical purchase of new appliances or a special assessment, eventually I'll reach a point where selling it would be a net profit.

I like working through questions like this. I think it gives me a better perspective on things. Previously I was just focused on what I could sell it for and if I'd get back my down payment, but thinking about what it costs to own over time reveals that my down payment was a small fraction of what I spent, and to actually profit I need to hold onto it for at least another six years or so. In short, the risks are easily worth it.

I want to address one other aspect, which is looking at the condo as a simple investment. It's generally good investment strategy to take a loss on a bad investment if it means you can then put whatever money is left to work in a better performing investment. In this case, keeping the condo has a far better rate of return than if I took the proceeds of a sale and invested elsewhere. Keeping the condo and paying down the mortgage will effectively return roughly 15% annually, which is pretty damn good. Some might say this is the only real number that's important, but I prefer to take the whole picture into account.

Removing Computers from Software Portal targets in Altiris

I haven't played around with the newer versions of Client Management Suite, aka Symantec Management Console, aka Altiris Notification Server, but in our version (7.0 SP5 MR4 R11 WTF), when a user requests a Managed Delivery Policy, their computer is added to a generated policy target called a "portal internal target". Their computer is never removed from these policies, even though having many policies on an agent can break it, and even though any random breeze can trigger agent policies to be re-evaluated. If your managed deliveries are all simple and have detection rules, that might not even be a problem, but our environment is a bit more complex and we tend to put in tasks and software without detection rules, so we don't want these policies hanging around on the agent and potentially causing mischief.

We're moving to Configuration Manager, but in the meantime I wanted to share how I remove computers from these targets automatically once the computer has successfully installed the requested software.

Basically, it's just a SQL query that I run via an Automation Policy. It has two parts: one to delete policy-compliant computers from the relevant target filter, and another to force the portal internal targets to be re-processed.

I've commented the SQL so that hopefully it makes sense, but essentially in the first part I join through a few layers to get what I need. In the second I get a count of how many targets there are and then run through a loop to re-process each one.

Delete tfr
--Select tfr.resourceguid, tfr.targetfilterguid, tf.guid
From targetfilterresource tfr			--List of computers in direct filter
Inner Join TargetFilter tf			--Target to filter
	on  tf.guid = tfr.targetfilterguid
Inner Join Item I				--Name of target (limit to portal targets)
	on  i.guid = tf.resourcetargetguid
	and i.Name like '%(portal internal target)'
Inner Join vResourceTargetUses P		--Policy to target
	on  P.ResourceTargetGuid = I.Guid
Inner join [Inv_Policy_Compliance_Status] s	--Policy status (limit to compliant)
	on  s.PolicyGuid = P.ItemGuid		--(Policy)
	and s._ResourceGuid = tfr.resourceguid	--(Computer)
	and s.Compliance = 3			--(Compliant)

Declare @max int
Set @max = (
	Select Count(I.Guid)
	From Item I
	Where I.Name Like '%(portal internal target)'
)
Declare @i int
Set @i = 0
Declare @targetguid uniqueidentifier
While @i < @max
Begin
	Set @i = @i + 1
	Set @targetguid = (
		Select sub.Guid From (
			Select I.Guid,
			Rank() over (order by name) as [Pos]
			From Item I
			Where I.Name Like '%(portal internal target)'
		) sub Where [Pos] = @i		
	)
	EXEC spResolveResourceTarget @resourceTargetGuid = @targetguid, @fullUpdate = 1, @forcerefresh = 1, @runSilent = 1
end

Drop this query into an automation policy using "Raw SQL query" and no task set since the query does everything. I use a Shared Schedule defined as every three hours, but use whatever schedule makes sense for your organization.

So if you're stuck using Altiris and you want to clear out computers that have completed a Software Policy request, this is the easiest way to do it.

If you have a lot of portal internal targets, it may take a while to reprocess each one. If you wanted to get fancy, you could reprocess only the ones you deleted computers from, but I didn't take it that far.

Configuration Manager User Inventory with UAC

I spent a little time working out how to inventory user information such as mapped network drives and printers with Configuration Manager. It's a bit more complex than you'd think, particularly if you have UAC (User Account Control) turned on, which you probably should.

There are a couple articles on the web about setting it up, mostly covering how to run a script as the user that adds the information to a custom WMI class, then adding that class to your hardware inventory, basically a two-step process.

Unfortunately it's not as straightforward as that, since the user won't have permission to create the WMI class, nor permission by default to write to it. Worse yet, if you elevate the user to give them permission, they lose access to the network drive and printer WMI classes we wanted to inventory!

I had run into the WMI elevation problem before with Altiris (which always elevates the user) and eventually came to a solution wherein the VBScript relaunches itself via explorer.exe, which means it runs in the un-elevated user context. The script could check for the administrator token or, more simply, check whether it is running via cscript or wscript: Explorer will launch it with wscript by default, while Altiris (and Configuration Manager) use cscript by default.

In the case of Configuration Manager, I started with the script in the above linked articles. I first create the custom WMI class and grant the user write permissions, then re-launch the script as the user in order to populate the class. Finally, hardware inventory will collect the class.

Another consideration the original script lacked was handling multiple users: it always dropped the class and re-created it, ensuring that only the current user's inventory would be collected. I instead have the script remove just the entries for the current user, then repopulate them. That keeps the data accurate while preserving other users' entries.

That's probably more than enough explanation and preamble. Here's the script:

'Mapped Network Drive and Printer Inventory
' Matthew Johnson
' 6/18/2014

Option Explicit
On Error Resume Next

Const wbemCimtypeSint16 = 2
Const wbemCimtypeString = 8
Const wbemCimtypeBoolean = 11
Const wbemCimtypeDateTime = 101
Dim Username, objLocator, objWMI, objCustom, ClassName, colItems, Item, NewEntry, Instance
Username = CreateObject("WinNTSystemInfo").UserName
Set objLocator = CreateObject("WbemScripting.SWbemLocator")

If Right(LCase(WScript.FullName), 11) = "cscript.exe" Then 'System or elevated user context - set up WMI classes

	'Configure WMI
	Dim objNamespace, objCustomNameSpace, objSS, objClass, ClassNotFound

	'Create Namespace
	WScript.Echo "Verify Custom namespace"
	Set objWMI = objLocator.ConnectServer(".", "root")
	Set objNamespace = objWMI.Get("__namespace")
	Set objCustomNameSpace = objNamespace.SpawnInstance_
	objCustomNameSpace.name = "Custom"
	objCustomNameSpace.Put_()
	
	'Set Namespace Permissions
	WScript.Echo "Verify namespace permissions"
	Set objCustom = objLocator.ConnectServer(".", "root\Custom")
	Set objSS = objCustom.Get("__SystemSecurity")
	'Add Partial Write permission for NT Authority\INTERACTIVE
	objSS.SetSD(array(1, 0, 4, 129, 132, 0, 0, 0, 148, 0, 0, 0, 0, 0, 0, 0, 20, 0, 0, 0, 2, 0, 112, 0, 5, 0, 0, 0, 0, 2, 20, 0, 9, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 5, 4, 0, 0, 0, 0, 18, 24, 0, 63, 0, 6, 0, 1, 2, 0, 0, 0, 0, 0, 5, 32, 0, 0, 0, 32, 2, 0, 0, 0, 18, 20, 0, 19, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 5, 20, 0, 0, 0, 0, 18, 20, 0, 19, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 5, 19, 0, 0, 0, 0, 18, 20, 0, 19, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 5, 11, 0, 0, 0, 1, 2, 0, 0, 0, 0, 0, 5, 32, 0, 0, 0, 32, 2, 0, 0, 1, 2, 0, 0, 0, 0, 0, 5, 32, 0, 0, 0, 32, 2, 0, 0))

	'Create NetworkDrives class
	ClassName = "NetworkDrives"
	ClassNotFound = True
	For each objClass in objCustom.SubClassesOf()
		If Split(objClass.Path_.Path, ":")(1) = ClassName Then
			WScript.Echo "Found class " & objClass.Path_.Path
			ClassNotFound = False
		End If
	Next
	If ClassNotFound Then
		WScript.Echo "Creating " & ClassName & " class"
		Set objClass = objCustom.Get
		objClass.Path_.Class = ClassName
		objClass.Properties_.add "Username", wbemCimtypeString
		objClass.Properties_.add "Drive", wbemCimtypeString
		objClass.Properties_.add "Path", wbemCimtypeString
		objClass.Properties_("Username").Qualifiers_.add "key", True
		objClass.Properties_("Drive").Qualifiers_.add "key", True
		objClass.Put_
	Else
		'Delete existing entries for current user
		Set colItems = objCustom.ExecQuery("Select * From " & ClassName & " Where Username='" & Replace(UserName, "\", "\\") & "'")
		If colItems.Count > 0 Then
			WScript.Echo "Deleting " & colItems.Count & " existing " & ClassName & " entries for " & Username
			For each Item in colItems
				Instance = ClassName & ".Drive=""" & Item.Drive & """,Username=""" & Username & """"
				WScript.Echo Instance
				objCustom.Delete Instance
			Next
		End If
	End If
	
	'Create NetworkPrinters class
	ClassName = "NetworkPrinters"
	ClassNotFound = True
	For each objClass in objCustom.SubClassesOf()
		If Split(objClass.Path_.Path, ":")(1) = ClassName Then
			WScript.Echo "Found class " & objClass.Path_.Path
			ClassNotFound = False
		End If
	Next
	If ClassNotFound Then
		WScript.Echo "Creating " & ClassName & " class"
		Set objClass = objCustom.Get
		objClass.Path_.Class = ClassName
		objClass.Properties_.add "Username", wbemCimtypeString
		objClass.Properties_.add "Printer", wbemCimtypeString
		objClass.Properties_.add "Driver", wbemCimtypeString
		objClass.Properties_.add "Server", wbemCimtypeString
		objClass.Properties_.add "Port", wbemCimtypeString
		objClass.Properties_("Username").Qualifiers_.add "key", True
		objClass.Properties_("Printer").Qualifiers_.add "key", True
		objClass.Put_
	Else
		'Delete existing entries for current user
		Set colItems = objCustom.ExecQuery("Select * From " & ClassName & " Where Username='" & Replace(UserName, "\", "\\") & "'")
		If colItems.Count > 0 Then
			WScript.Echo "Deleting " & colItems.Count & " existing " & ClassName & " entries for " & Username
			For each Item in colItems
				Instance = ClassName & ".Printer=""" & Replace(Item.Printer, "\", "\\") & """,Username=""" & Username & """"
				WScript.Echo Instance
				objCustom.Delete Instance
			Next
		End If
	End If
	
	'Re-launch script as user
	CreateObject("WScript.Shell").Run "explorer.exe """ & WScript.ScriptFullName & """", 0, False
	
Else 'User context - Perform user inventory

	Set objCustom = objLocator.ConnectServer(".", "root\Custom")
	Set objWMI = objLocator.ConnectServer(".", "root\cimv2")
	
	'Create NetworkDrives entries for current user
	ClassName = "NetworkDrives"
	For each Item in objWMI.ExecQuery("Select * From Win32_LogicalDisk Where DriveType=4")
		Set NewEntry = objCustom.Get(ClassName).SpawnInstance_
		NewEntry.Username = Username
		NewEntry.Drive = Item.DeviceID
		NewEntry.Path = Item.ProviderName
		NewEntry.Put_
	Next

	'Create NetworkPrinters entries for current user
	ClassName = "NetworkPrinters"
	For each Item in objWMI.ExecQuery("Select * From Win32_Printer Where Network=True and Local=False")
		Set NewEntry = objCustom.Get(ClassName).SpawnInstance_
		NewEntry.Username = Username
		NewEntry.Printer = Item.Name
		NewEntry.Driver = Item.DriverName
		NewEntry.Server = Item.ServerName
		NewEntry.Port = Item.PortName
		NewEntry.Put_
	Next
End If

So there you have it. Run the script manually on a test system via an elevated command prompt and cscript so you can add the classes to hardware inventory (NetworkDrives and NetworkPrinters). Then schedule the script to run on an interval ahead of your hardware inventory. You can review the linked articles for more information on those steps if you need to. Once the script has populated the classes and hardware inventory has gathered the data, you will be able to see it in Resource Explorer and then use it in reports and device collections.
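
Before waiting on a full hardware inventory cycle, you can spot-check that the user-context pass actually wrote data. A quick query, using the class names from the script above:

#List what the user-context pass recorded in the custom namespace
Get-WmiObject -Namespace 'root\Custom' -Class 'NetworkDrives' | Select-Object Username, Drive, Path
Get-WmiObject -Namespace 'root\Custom' -Class 'NetworkPrinters' | Select-Object Username, Printer, Server, Port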

Oh, and if you want to delete the custom class because you're testing or what-have-you, run this:

Set objLocator = CreateObject("WbemScripting.SWbemLocator")
Set objWMI = objLocator.ConnectServer(".", "root")
objWMI.Get("__Namespace.Name='Custom'").Delete_

You can also add any other user-specific data you want using this same script. Just add additional classes and populate them the same way.

I made it all the way around the sun again, which is to say that it's my birthday. I don't feel any older, I don't have any dramatic or exciting plans for the next year, but who knows what the future has in store? I'll keep an open mind to possibilities.

I had a fun time in Las Vegas, and even won a bit at the craps table. Not enough to cover the trip or anything, but at least I'm still up on the city overall. Sarah had hernia surgery a couple days after I got back, so I took off most of the week to take care of the kids while she recovers. Her mom was there this week with Brianna, and Sarah's probably going to be OK by herself next week, even if she isn't to the point where she can lift Derek yet.

Emily is getting longer and fatter by the day, both good things for little babies. She's starting to smile and coo, but she's also developed a need to be held all the time. Every time I set her down yesterday evening while Sarah was at the store, she started crying, so I wound up pacing the house with her on my shoulder.

I haven't had much chance to work on the house recently, which is starting to worry me, since we want to get insulation in by Fall and need to do a lot in order for that to happen. Hopefully next week I can get back at it, since our weekends are booked for the foreseeable future.

I managed to get mobile formatting to work on this-here webpage, but I still haven't done much else with it. I'm still editing the page manually, so it's a long way from where I'd like it to be. Maybe I'll find some time for that too.

Derek's officially two-and-a-half today. It's fun seeing him grow and develop; seeing how he interacts with his baby sister. This morning I brought a new picture in to work for my desk: a picture of the four of us on a hike in the woods this past weekend. Unfortunately it's too big, since apparently I have a 4x6 frame not a 5x7. In any case, it really made me reflect that I have a family of my own now. I mean, I've always had family and we've been our own family since we had Derek, but something about having a wife and two kids now feels different.

We just planned our anniversary trip, since we'll be celebrating five years of marriage and nine years together in September. We're going to Punta Cana in the Dominican Republic and staying at a resort. We'll be taking Emily with us since she won't even be six months old, but Derek will stay with Sarah's parents.

On the opposite end of the spectrum from family, tomorrow morning I leave for Las Vegas, where James is having his bachelor party. I haven't been out there in over seven years and I'm pretty excited. I'll be back late on Sunday and on Tuesday Sarah's having her hernia surgery. I'm off the rest of the week taking care of her, so I guess that's my penance for leaving her at home with the kids.

I haven't worked out an easy way to put pictures into my posts yet, so for now it's just text. Hopefully I'll get something together soon.

I'm working on getting Configuration Manager 2012 set up at work. I've been an Altiris Admin for basically ten years now and it's never not sucked. Configuration Manager definitely has its own set of idiosyncrasies though. It complains if you don't have firewall rules defined on your SQL server when the firewall isn't enabled. Hell, it complains when the firewall isn't enabled and the rules are defined.

On the other hand, despite the complaining, everything is working so far. Altiris, meanwhile, randomly stopped letting us into PXE Configuration yesterday on one of our Deployment Servers. The PXE keys match, the services are running, all the usual culprits look fine. I can either dig into some ancient ini files or I can just reinstall PXE Manager.

At any given moment it's a safe bet that something is broken with Altiris. Usually it's not the DS, which is generally stable, if old and crusty. The NS, though, is a sand castle full of water, except the sand is badly written code and the water is a badly written database. The only thing holding it together is a cluster of item references and resource associations that prevent you from deleting anything due to impossible circular dependencies.

In other words, I'm glad to be moving to Configuration Manager, even if it means I have a whole host of new problems to learn. At least there are enough other people using the product that the solutions aren't typically hard to come by.

I'm still working on this website here and there. I have a lot of backend work to make it into what I want. For the time being, though, this is it.

My intention is to flesh out the profile with a bit more 'who am I' stuff and then use this column for content. Content at this point will be my periodic Journal updates of what I've been up to, as well as any odd thought or thing I want to share.

I hope to include photos as well, since they're colorful and interesting and I can't promise that everything I write will be. We'll have to see how that works, since I don't want to develop a whole other photo site thing. I need an easy way to drop in photos I have in other places.

At some point I'd also like to populate this section with my historical content from XeoMage as a precursor to retiring that site. This will likely be a slow migration, though, since I don't work on this site all that often.