Friday, May 30, 2008

Poke-Info PowerShell script

I was working next to someone the other day; each PC had two monitors, and he had just dragged a script from his right monitor to the left to read it. It occurred to me that it would be handy to have a grid-connected view of desktops/monitors, where you could essentially drag and drop windows between computers.

Needless to say, this probably isn't achievable in 10 lines of PowerShell, but as version 0.01 of the concept, this script takes a file and a computer as parameters, copies the file to the remote machine, and starts notepad on its interactive desktop.

param (
    $file = "",
    $computer = ""
)

# 30/05/2008, Wayne Martin, Initial version
# Description:
# Poke a file into a notepad.exe process in the interactive desktop on a remote machine.
# Assumptions - this script assumes that:
# psexec.exe is available on the host
# the account running the powershell script has administrative access to the remote computer
# WMI is accessible on the remote machine (queries remote temp directory and windir)
# Usage
# Copy a text file to a remote machine and start notepad on the interactive desktop
# . .\PokeInfo.ps1 -f textfile.txt -m RemoteMachine1
# Notes
# psexec.exe is required on the host
# This could also be achieved by using win32_Schedulejob interactive now+1, but this assumes schedule is running on the remote machine and time is synchronised
# References:

write-output ("Copying and remotely loading " + $file + " on " + $computer)
$envTemp = (get-WMIObject -computerName $computer -class win32_environment -property VariableValue -filter "Name='Temp' AND SystemVariable=True").VariableValue
$windir = (get-WMIObject -computerName $computer -class win32_operatingsystem -property WindowsDirectory).WindowsDirectory

$tempfile = [System.IO.Path]::GetRandomFileName()
$localDest = $envTemp.Replace("%SystemRoot%", $windir) + "\" + $tempfile
$UNCDest = "\\" + $computer + "\" + $localDest.Replace(":", "$")

copy-item -path $file -dest $UNCdest

$cmd = "\\" + $computer + " /i /d notepad " + $localdest
write-output ("Running " + $cmd)
[diagnostics.process]::start("psexec.exe", "$cmd")
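The win32_ScheduleJob alternative mentioned in the comments above could be sketched as follows. This is an assumption-laden sketch, not part of the original script: it assumes the Schedule service is running on the remote machine, that clocks are roughly in sync, and that $computer and $localDest are already set as in the script above.

```powershell
# Sketch of the Win32_ScheduledJob alternative (assumptions noted above).
$scheduler = [wmiclass]"\\$computer\root\cimv2:Win32_ScheduledJob"

# Win32_ScheduledJob.Create expects a DMTF-format start time; schedule one minute from now
$startTime = [System.Management.ManagementDateTimeConverter]::ToDmtfDateTime((Get-Date).AddMinutes(1))

# Create(Command, StartTime, RunRepeatedly, DaysOfWeek, DaysOfMonth, InteractWithDesktop)
$result = $scheduler.Create("notepad.exe $localDest", $startTime, $false, 0, 0, $true)
if ($result.ReturnValue -ne 0) { write-output ("Schedule failed with code " + $result.ReturnValue) }
```

This avoids the psexec.exe dependency, at the cost of depending on the remote Schedule service and time synchronisation.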

Wayne's World of IT (WWoIT), Copyright 2008 Wayne Martin.


Reading web content with PowerShell

The following PowerShell code shows 6 different methods of using the .Net framework to read HTML/ASP responses, useful for scraping web pages through script. If you don't have a proxy server, just comment out the $wc.proxy lines.

$user = $env:username
$webproxy = "http://proxy:8080"
$pwd = Read-Host "Password?" -assecurestring

$proxy = new-object System.Net.WebProxy
$proxy.Address = $webproxy
$account = new-object System.Net.NetworkCredential($user,[Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR($pwd)), "")
$proxy.credentials = $account

# -- Method 1 - DownloadData -- #

$url = ""
$wc = new-object System.Net.WebClient
$wc.proxy = $proxy
$webpage = $wc.DownloadData($url)
$string = [System.Text.Encoding]::ASCII.GetString($webpage)

# -- Method 2 - DownloadData (QueryString) -- #

$url = ""
$col = new-object System.Collections.Specialized.NameValueCollection

$wc = new-object System.Net.WebClient
$wc.proxy = $proxy
$wc.QueryString = $col
$webpage = $wc.DownloadData($url)
$string = [System.Text.Encoding]::ASCII.GetString($webpage)

# -- Method 3 - UploadData (POST) -- #

$url = ""
$wc = new-object System.Net.WebClient
$postData = "a=stats&s=s451qaz2wsx"
$wc.proxy = $proxy
[byte[]]$byteArray = [System.Text.Encoding]::ASCII.GetBytes($postData)
$webpage = $wc.UploadData($url,"POST",$byteArray);
$string = [System.Text.Encoding]::ASCII.GetString($webpage)

# -- Method 4 - GetResponse Stream -- #

$url = ""
$wr = [System.Net.WebRequest]::Create($url)
$wr.Proxy = $proxy

$wresponse = $wr.GetResponse()
$requestStream = $wresponse.GetResponseStream()
$readStream = new-object System.IO.StreamReader $requestStream
$wrs = $readStream.ReadToEnd()

# -- Method 5 - UploadValues -- #

$url = ""
$wc = new-object System.Net.WebClient
$wc.proxy = $proxy
$col = new-object System.Collections.Specialized.NameValueCollection
$wc.QueryString = $col
$webpage = $wc.UploadValues($url, "POST", $col)
$string = [System.Text.Encoding]::ASCII.GetString($webpage)

# -- Method 6 - UploadString - this doesn't seem to post properly -- #

$url = ""
$wc = new-object System.Net.WebClient
$wc.proxy = $proxy
$postData = "/?a=stats&s=s451qaz2wsx"
$webpage = $wc.UploadString($url, $postData)


Sunday, May 25, 2008

Automated Cluster File Security and Purging

If you have a cluster share that contains temporary data in separate top-level directories, this post may help you automate the security and purging of that shared data. This is useful for transient data such as drop directories for scanners and faxes, or scratch directories for general sharing.

To summarise, this will provide:

  1. A cluster-based scheduled task that runs each day, dependent on the network name and physical disk resource currently hosting the directory
  2. A batch file run by the scheduled task that secures each directory, and purges files older than 30 days, logging results to the physical node hosting the resource.

Creating the Scheduled Task

  1. Create the scheduled task cluster resource:
    cluster /cluster:%cluster% res "%resource_name%" /create /group:"%cluster_group%" /type:"Volume Shadow Copy Service Task"
    cluster /cluster:%cluster% res "%resource_name%" /priv ApplicationName="cmd.exe"
    cluster /cluster:%cluster% res "%resource_name%" /priv ApplicationParams="/c c:\admin\SecureAndPurge.bat"
    cluster /cluster:%cluster% res "%resource_name%" /priv CurrentDirectory=""
    cluster /cluster:%cluster% res "%resource_name%" /prop Description="%resource_name%"
    cluster /cluster:%cluster% res "%resource_name%" /AddDep:"%network_name_resource%"
    cluster /cluster:%cluster% res "%resource_name%" /AddDep:"%disk_resource%"
    cluster /cluster:%cluster% res "%resource_name%" /On
    cluster /cluster:%cluster% res "%resource_name%" /prop RestartAction=1
  2. Set the schedule for the cluster resource:
    • Use the cluster administrator GUI, this cannot currently be set with cluster.exe with the VSS scheduled task cluster resource
  3. Restart the resource to pickup the schedule change:
    cluster /cluster:%cluster% res "%resource_name%" /Off
    cluster /cluster:%cluster% res "%resource_name%" /On

Note that the cluster resource providing scheduled task capability is the ‘Volume Shadow Copy Service Task’ resource. This is a recommended solution from Microsoft for providing scheduled task capability on a cluster. See the ‘Cluster Resource’ document in the references below.

The LooksAlive and IsAlive functions for the VSSTask.dll simply check that the scheduled task is known to the local task scheduler. To further reduce the impact of resource failure, the resource should be marked as not affecting the cluster, preventing potential failover if this task were to fail more than three times (by default).

The scheduled task should run a simple batch file on the local disk of the cluster node. Keeping the batch file local further reduces the risk that problems with the batch file could cause the cluster group to fail. The theory is that if the batch file is on local disk, it can be modified/deleted before bringing the cluster resources online.

Creating the batch file

Create a batch file and set some environment variables for %directory%, %purgeDir%, %domain%, %logFile%, %AdminUtil%, %FileAge% to fit your environment, and then include at least the three commands below:

  • Set the security on each directory within the directory. Note that this assumes that for each directory, there is a matching same-named security group, prefixed with l (for local), eg lDirectory1.

    for /d %%i in (%Directory%\*) do cacls %%i /e /g %Domain%\l%%~ni:C >> %LogFile%
  • Move the files with robocopy that are older than %FileAge% days:

    %AdminUtil%\robocopy %Directory% "%PurgeDir%" *.* /minage:%FileAge% /v /fp /ts /mov /e /r:1 /w:1 /log+:%LogFile%
  • Delete the files that were moved:

    If Exist "%PurgeDir%" rd /s /q "%PurgeDir%"

Note that depending on the size of the data, you might want to ensure that %PurgeDir% is on the same volume as the source files; since the files are moved, this consumes no extra disk space. If %PurgeDir% were on a different drive, you would temporarily need as much free space as the size of the data being purged.


Cluster resource

Volume Shadow Copy Service resource type

Using Shadow Copies of Shared Folders in a server cluster

Scheduled task does not run after you push the task to another computer

Scheduled Task for the Shadow Copies of Shared Folders Feature May Not Run on a Windows Server 2003 Cluster

Behavior of the LooksAlive and IsAlive functions for the resources that are included in the Windows Server Clustering component of Windows Server 2003

Generic Cluster-enabled Scheduled Tasks:


Friday, May 23, 2008

Managing printers at the command-line

This post provides a method of using the command-line to add printer drivers to a Windows 2003 print server, and then create and configure the port and printer. You can use this to create a control file with your printers, and then run these commands against the control file. There are also a few commands to gather information from your printer server(s) from the directory and event log.

Adding a driver

The PrintUIEntry calls have slightly different syntax between XP and 2003, hence the OS-specific commands below, even though the same print server is being targeted.

The driver model name should be extracted from the INF; replace the example HP UPD driver name accordingly.

On XP against a 32-bit print server:
rundll32 printui.dll,PrintUIEntry /ia /c \\printserver /m "HP Universal Printing PCL 6" /h "Intel" /v "Windows 2000 or XP" /f \\server\drivers\hpupd_pcl6_x86\hpu4pdpc.inf

On a 2003 x64 server, serving x86 clients (therefore both drivers are required):
rundll32 printui.dll,PrintUIEntry /ia /c \\printserver /m "HP Universal Printing PCL 6" /h "x64" /v "Windows XP and Windows Server 2003" /f \\server\drivers\hpupd_pcl6_x64\hpu4pdpc.inf

rundll32 printui.dll,PrintUIEntry /ia /c \\printserver /m "HP Universal Printing PCL 6" /h "x86" /v "Windows 2000, Windows XP and Windows Server 2003" /f \\server\drivers\hpupd_pcl6_x86\hpu4pdpc.inf

Creating ports/printers

Create a control file containing comma-separated lines of:
PrinterName,IPAddress,DriverName,Location

For example:
Printer1,,HP Universal Printing PCL 6,Printer/Location/1

Ping the IP Address for the printer:
for /f "tokens=1-4 delims=," %i in (controlfile) do @echo. &@echo %i,%j,%k,%l &@ping -n 1 %j

Register the prnadmin.dll if not already done:
cd /d "C:\Program Files\Windows Resource Kits\Tools" && regsvr32 /s prnadmin.dll

Assuming the driver is already installed, the following commands add a DNS record, create the port and printer using the Microsoft Windows Resource Kit tools, configure the share and publish it to AD, set duplex defaults, grant a security group permission to manage the printers, and ensure the print processor is WinPrint:

for /f "tokens=1-4 delims=," %i in (controlfile) do @echo dnscmd %ad_dns_server% /recordadd %domain% %i A %j

for /f "tokens=1-4 delims=," %i in (controlfile) do @cscript portmgr.vbs -a -c \\printserver -p %i -h %j -t LPR -q %i

for /f "tokens=1-4 delims=," %i in (controlfile) do @cscript prnmgr.vbs -a -c \\printserver -b %i -m "%k" -r %i

for /f "tokens=1-4 delims=," %i in (controlfile) do @cscript prncfg.vbs -s -b \\printserver\%i -h %i -l "%l" +published

for /f "tokens=1-4 delims=," %i in (controlfile) do @setprinter \\printserver\%i 8 "pDevMode=dmDuplex=2,dmCollate=1,dmFields=duplex collate"

for /f "tokens=1-4 delims=," %i in (controlfile) do @c:\util\subinacl /printer \\printserver\%i /grant=domain\group=M

for /f "tokens=1-4 delims=," %i in (controlfile) do @setprinter \\printserver\%i 2 pPrintProcessor="WinPrint"

Print a test page to each printer:
for /f "tokens=1-4 delims=," %i in (controlfile) do cscript prnctrl.vbs -t -b \\printserver\%i

Check the printer is published in AD:
for /f "tokens=1-4 delims=," %i in (controlfile) do dsquery * CN=printserver,OU=ServerOU,DC=domain,DC=com -filter "(&(printShareName=%i))"

Note that the subinacl.exe shipped in rktools isn't very functional; version 5.2.3790.1180 is much better:

Gathering information from a print server

Export Print Server details:
dsquery * OU=ServerOU,DC=domain,DC=com -limit 0 -filter "(&(objectClass=printQueue)(objectCategory=printQueue))" -attr cn printerName driverName printCollate printColor printLanguage printSpooling driverVersion printStaplingSupported printMemory printRate printRateUnit printMediaReady

csvde -f PrinterDetails.csv -d OU=ServerOU,DC=domain,DC=com -r "(&(objectClass=printQueue)(objectCategory=printQueue))" -l cn,printerName,location,driverName,printCollate,printColor,printLanguage,printSpooling,driverVersion,printStaplingSupported,printMemory,printRate,printRateUnit,printMediaReady

Dump event logs for the last day:
c:\util\dumpel -s \\printserver -l System -e 10 -m Print -d 1


File System Filters and minifilters

File system filters with the filter manager and minifilters are often overlooked, until a clash occurs when you've got several different products using these filters, such as anti-virus, file screening, offline archiving, quotas etc.

This post provides information on a few utilities I've used to identify the file system filters currently installed, and how you can then start diagnosing issues with the verifier.exe driver verification tool.

The commands below show the file system filters installed, their current altitude, and which volumes they’re attached to:


C:\>Fltmc.exe filters

Filter Name                     Num Instances   Frame
------------------------------  -------------   -----
Datascrn                                    0       0
FileScreenFilter                            3       0
EvFilter                                    3       0
Quota                                       0       0

C:\>Fltmc.exe instances

Filter                Volume Name   Altitude   Instance Name
--------------------  -----------   --------   --------------------
FileScreenFilter      C:              260800   FileScreenFilter
FileScreenFilter      D:              260800   FileScreenFilter
EvFilter              C:              185100   EvFilter
EvFilter              D:              185100   EvFilter

Debugging tools for Windows also has some file system driver diagnostics:

  • Install the debugging tools for windows (windbg)
  • Load windbg
  • Debug the kernel, using local connection
  • Load the filter kernel debugging extensions - .load fltkd
  • Use specific fltkd commands from the articles below (!filters, !volumes, etc.)

Once you have worked out which drivers are the file system filters and minifilters, you can run verifier.exe on the machine to monitor those drivers. This enables you to record statistics such as memory paging and interrupts; any number of the currently installed drivers can be monitored.
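As a hypothetical example (the driver file names below are placeholders; check the actual .sys names with fltmc or in %windir%\system32\drivers), monitoring two filter drivers with Driver Verifier from an elevated prompt might look like:

```powershell
# Hypothetical example - driver names are placeholders for your environment.
verifier /standard /driver datascrn.sys evfilter.sys   # enable standard checks on the named drivers (reboot required)
verifier /query                                        # show current verifier statistics
verifier /reset                                        # stop verifying once diagnosis is complete (reboot required)
```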

How NTFS Works

Filter Manager and Minifilter Driver Architecture


OpsMgr 2007 SSRS Reports using SQL 2005 XML

This post relates mostly to the concept of passing XML between SQL Server Reporting Services (SSRS), Operations Manager Smart Controls, and SQL Stored Procedures when you are customising Operations Manager 2007 reporting.

Passing multi-valued parameters from the Operations Manager Smart Controls through the SSRS report to SQL can be difficult, and the method employed most by the default Microsoft reports seems to be XML.

SQL 2005 includes the XML data type and associated methods, including XQuery, a subset of the XPath query standard. In addition, SQL supports OPENXML – a rowset provider to construct a relational rowset view of an XML document.

Included below are two examples of processing an XML string and querying an element value from the XML elements with a SQL insert/select clause.

declare @execerror int
declare @xmldoc xml
declare @ixmldoc int
set @xmldoc = '<Data><Objects><Object Use="Containment">376</Object><Object Use="Containment">300</Object></Objects></Data>'
DECLARE @tblTest table (test int)

/* Parse the XML document and insert the converted int <object> element value into a temporary table */

EXEC @ExecError = sp_xml_preparedocument @ixmldoc OUTPUT, @xmldoc
INSERT INTO @tblTest (test)
SELECT InstanceFilter
FROM OPENXML (@ixmldoc, '/Data/Objects/Object')
WITH (InstanceFilter int '.')
EXEC @ExecError = sp_xml_removedocument @ixmldoc

select * from @tbltest

/* Translate the value of <object> nodes to an int from an XML document into the test field of the table */

INSERT INTO @tblTest (test)
select tblTest.test.value('.', 'int')
from @xmldoc.nodes('/Data/Objects/Object') AS tblTest(test)

select * from @tbltest


Tuesday, May 20, 2008

Access Based Enumeration in 2003 and MSCS

This post provides an overview of Access Based Enumeration on Windows Server 2003, some limitations and advantages, and information on controlling ABE in a Windows MSCS environment.

With a standard Windows file server, for users to map the share and view the directories they have access to, all users require the list directory right at the root of the share. The client would then see all directories, and receive access denied errors if they try to navigate to any sub-folder without NTFS access.

To provide access control similar to Netware, Microsoft have introduced Access Based Enumeration in Windows Server 2003 SP1. This provides a method of displaying only files and folders that a user has access to, rather than displaying all objects in the tree.

The best description I can give is that ABE will hide what you don't have access to, as opposed to ensuring you can see what you do have access to. For example, if you have .\A .\A\B and .\A\B\C, and you have access to C but you don't have access to B, C will be hidden through the explorer GUI.

While this allows for a seamless migration from Netware-based file servers, there are several potential limitations:

  • ABE has to be installed as a separate component to the Operating System, which must be documented and managed for implementation and recovery scenarios.
  • ABE is not cluster-aware, and as enabling ABE is a per-share operation, cluster failovers resulting in the dynamic creation of shares on a physical cluster node will not create ABE-enabled shares. A generic cluster application could be created to enable ABE on all shares as they are created on each cluster node. This is non-standard and not ideal.
  • Increased CPU usage on the file server and slower response times to the client, processing the file data to provide information to the client on which files and directories are visible. Depending on the algorithm used, the size and depth of data structures and file system security, this may be an issue.
  • There are known issues with DFS and Access Based Enumeration
  • There are known issues with the cache on multi-user workstations, which will provide a view of any files and directories that have been viewed by any user of a computer. Client cache characteristics such as retention and location are not known.


Advantages:

  • If looking at migrating from Netware to Microsoft file sharing, the migration will be seamless for users. The file/directory structure and security will be similar, along with end-user access.
  • Using ABE is a documented solution for managing the sharing and access control for clustered home folders, rather than using the share sub-directories feature of MSCS.


Other considerations:

  • There are known issues with backup applications that do not use the ‘back up files and directories’ right when backing up data through standard file shares.
  • It is unknown whether navigating with explorer to a top-level directory causes server-side processing of all files/folders in the share to determine the access path to all items in the tree, or whether the algorithm will process per-directory. This may be relevant in determining the test-cases to apply to assess performance. Based on testing, it is assumed that if access is denied at a top-level, sub-folders and files are not processed deeper in that branch.
  • Windows Server 2008 includes cluster-aware ABE, enabled by default on all shares. Going forward this minimises the risk that this is a non-standard solution.

Controlling ABE in a cluster environment

Access Based Enumeration is controlled through SMB share settings for each instance of the lanmanserver service – each physical node in the cluster. These settings are not cluster-aware, and will be lost during a cluster fail-over operation.

To ensure that ABE follows virtual cluster nodes during failovers, a generic cluster application can be created, running abecmd.exe to verify that ABE is enabled after failover. The cluster application is dependent on the file share resource, and is run once for each file share.

Resource Type - Generic Cluster Application
Resource Name - <share> ABE
Description - <share> ABE
Dependencies - <share>
Command - cmd.exe /k abecmd.exe /enable <share>
Current Directory - %SystemRoot%
Interact with Desktop - De-selected
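Following the cluster.exe syntax used in the scheduled-task post above, the resource described here could be created from the command line roughly as follows. This is a sketch only: the cluster, group and share names are placeholders, to be replaced for your environment.

```powershell
# Sketch - substitute your own cluster, group and share names.
cluster /cluster:MyCluster res "Share1 ABE" /create /group:"FileShareGroup" /type:"Generic Application"
cluster /cluster:MyCluster res "Share1 ABE" /priv CommandLine="cmd.exe /k abecmd.exe /enable Share1"
cluster /cluster:MyCluster res "Share1 ABE" /priv CurrentDirectory="%SystemRoot%"
cluster /cluster:MyCluster res "Share1 ABE" /priv InteractWithDesktop=0
cluster /cluster:MyCluster res "Share1 ABE" /AddDep:"Share1"
cluster /cluster:MyCluster res "Share1 ABE" /On
```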


Notes:

  1. This assumes abecmd.exe is available in the path of each physical cluster node (this is the case when you install the Microsoft package).
  2. The /k switch is required to ensure that the cmd.exe application remains open, after the abecmd.exe process terminates. This ensures that the cluster resource monitor does not detect the resource as failed. This also leaves a command shell running for each instance of ABE being enabled, certainly not ideal and potentially misleading.

Other solutions considered

Other solutions I've considered for controlling ABE in a Windows file and print cluster environment included:

  1. Creating a generic cluster application dependent on all shares within a particular group, using the 'abecmd.exe /all' method. This is potentially less maintenance, but does not offer granular control of particular shares
  2. Creating a generic script resource type, with a VBScript using the supported cluster entry points to enable ABE on shares when they are created. This requires VBScript knowledge to create and maintain the solution, as opposed to a standard Microsoft provided executable.
  3. Creating an external watcher that detects cluster failovers and share re-creation, running the appropriate abecmd.exe commands as required. This requires an external server process (either a compiled application or a script) watching the cluster, which is not intuitive and adds a single point of failure.
  4. Controlling Access Based Enumeration through Group Policy. Third-party software exists to control the enforcement of Access Based Enumeration to file servers. However, unless the scheduled GPO refresh period was very regular, this would not be relevant in a failover cluster scenario.

CPU usage and end-user response times

The biggest concern is response times: the Microsoft whitepaper on ABE reports that with 150,000 files, the ‘reading shared directories’ operation increased from 12 seconds to up to 58 seconds. However, there is no detail on the type of test performed or the hardware used.

On a 2003 cluster with several million files and more than a terabyte of data, no performance impacts have been noticed when accessing folder structures through shares with ABE enabled.


Access Mask:

Access Based Enumeration whitepaper:

Access Based Enumeration:

Enabling ABE with DFS:

Implementing home folders on a cluster server:

Windows Server 2008 failover clustering:

Scripting Entry Points:

Create a generic application resource type:

Generic script cluster resource:


Friday, May 16, 2008

Find VM snapshots in ESX/VC

Below are four methods I have used to find snapshots of VMs in ESX and VirtualCenter, other than manually looking through each VM in the GUI. Query the database using SQL management studio, use sqlcmd from the command-line, list the files in the service console, or use the new VI Toolkit powershell snap-in.

  1. Query SQL using SQL Server Management Studio

    select ENT.Name as 'Name', Lower(DNS_Name) as 'DNS Name', Guest_OS as 'OS', Mem_Size_MB as 'Mem', IP_Address as 'IP', VM.FILE_Name as 'VMX location', VM.Suspend_Time as 'Suspend Time', VM.Suspend_Interval as 'Suspend Interval', VMS.Snapshot_Name as 'Snapshot Name', VMS.Snapshot_Desc 'Snapshot Description', DateAdd(Hour, 10, VMS.Create_Time) as 'Snapshot Time', VMS.Is_Current_Snapshot 'Current Snapshot' from vpx_vm VM inner join VPX_GUEST_NET_ADAPTER NET on VM.ID = NET.VM_ID inner join VPX_ENTITY ENT on VM.ID = ENT.ID inner join VPX_SNAPSHOT VMS on VM.ID = VMS.VM_ID

  2. Use sqlcmd.exe to run the query

    Run this query from a command-line after updating the server and database parameters:
    sqlcmd -S %server% -d %database% -W -s "," -Q "select ENT.Name as 'Name', Lower(DNS_Name) as 'DNS Name', Guest_OS as 'OS', Mem_Size_MB as 'Mem', IP_Address as 'IP', VM.FILE_Name as 'VMX location', VM.Suspend_Time as 'Suspend Time', VM.Suspend_Interval as 'Suspend Interval', VMS.Snapshot_Name as 'Snapshot Name', VMS.Snapshot_Desc 'Snapshot Description', DateAdd(Hour, 10, VMS.Create_Time) as 'Snapshot Time', VMS.Is_Current_Snapshot 'Current Snapshot' from vpx_vm VM inner join VPX_GUEST_NET_ADAPTER NET on VM.ID = NET.VM_ID inner join VPX_ENTITY ENT on VM.ID = ENT.ID inner join VPX_SNAPSHOT VMS on VM.ID = VMS.VM_ID"

  3. Connect to the service console and query for the files in the VMFS volumes

    ls -Ral /vmfs/volumes/* | grep .vmsn

  4. Use the VI Toolkit to connect to a VC instance and query for snapshots. Note that this is currently in beta, so for write operations in production I suggest waiting for the production release of the toolkit.
  • Install VMware-Vim4PS-e.x.p-81531.exe (or later)
  • Run this command (or use the start menu shortcut): C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe -PSConsoleFile "C:\Program Files\VMware\Infrastructure\VIToolkitForWindows\vim.psc1" -NoExit -Command ". \"C:\Program Files\VMware\Infrastructure\VIToolkitForWindows\init.ps1\""
  • Get-VC -server %vc-server% (you will be prompted for credentials unless you have a valid certificate path)
  • Get-VM | Get-Snapshot

    Name Description PowerState
    ---- ----------- ----------
    A_Snapshot Test PoweredOff

    Note that the output above isn’t very well formed, so I prefer to export the results to a CSV file:
  • Get-VM | Get-Snapshot | export-csv -path c:\temp\VMsnapshots.csv

    Unfortunately the CSV output exports the VM object's type name instead of the properties in the object, so you can also use:
  • Get-VM | Get-Snapshot | foreach-object {$out= $_.VM.Name + "," + $_.Name + "," + $_.Description + "," + $_.PowerState; $out}
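Alternatively, a sketch using the snap-in's standard cmdlets with a calculated property (the property names are those used in the loop above) can flatten the VM reference before exporting:

```powershell
# Flatten the VM object to its name with a calculated property, then export to CSV.
Get-VM | Get-Snapshot |
    Select-Object @{Name="VM"; Expression={$_.VM.Name}}, Name, Description, PowerState |
    Export-Csv -Path c:\temp\VMsnapshots.csv -NoTypeInformation
```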


  • I doubt VMware would recommend querying the database directly; they'd say to use their SDK or the VI Toolkit instead of methods one or two above.
  • The SQL commands above add GMT+10 to the UTC times recorded in the database using DateAdd()


Comparing MSCS/VMware/DFS File & Print

The following table shows information I was using to compare various Windows HA file and print solutions, including MSCS, VMware, VMware+MSCS, DFS, VMware+DFS and stand-alone servers. There are no recommendations, and most entries need to be adjusted or at least considered for your environment, but it might help crystallise your thoughts as it did mine.

Options compared: 1) Microsoft Server Clustering (MSCS)  2) VMware HA  3) Microsoft Clustering on VMware HA  4) DFS  5) Stand-alone server(s)  6) VMware HA with DFS for file shares

Highly Available: 1) Y  2) Y  3) Y  4) N  5) N  6) Y
Satisfies SLAs: 1) ?  2) ?  3) ?  4) ?  5) ?  6) ?
Maximum nodes: 1) 8  2) Limited by host hardware  3) 2  4) N/A  5) N/A  6) Limited by host
Failover time: 1) <2 minutes  2) VMotion or server startup time  3) <2 minutes  4) SPF  5) SPF  6) VMotion or server startup time
Single server Disaster Recovery – Software Failure: 1) <2 minutes  2) Snapshot, server rebuild or manual recovery procedure  3) <2 minutes  4) SPF  5) SPF  6) Snapshot, server rebuild or manual recovery procedure
Single server Disaster Recovery – Hardware Failure: 1) <2 minutes  2) <30 seconds  3) <30 seconds  4) SPF  5) SPF  6) <30 seconds
Licensing: 1) 2003 Enterprise per node  2) DataCentre + CALs (depending on VM design)  3) DataCentre + CALs (depending on VM design)  4) 2003 Standard  5) 2003 Standard  6) 2003 Standard
Hardware Failure – Data Communications: 1) Single/teamed NIC for prod interface  2) NIC redundancy depending on virtual solution  3) NIC redundancy depending on virtual solution + cluster-specific requirements  4) Teamed NIC  5) Teamed NIC  6) NIC redundancy depending on virtual solution
Hardware Failure – HBA: 1) Single HBA per node  2) HBA redundancy depending on virtual solution  3) HBA redundancy depending on virtual solution  4) Single HBA  5) Single HBA  6) HBA redundancy depending on virtual solution
OS Disk Configuration: 1) Basic  2) Dynamic  3) Basic  4) Dynamic  5) Dynamic  6) Dynamic
Hardware utilisation: 1) Physical servers  2) Virtual servers  3) Virtual servers  4) Physical servers  5) Physical servers  6) Virtual servers
Cost allocation: 1) Cost model required  2) Cost model required  3) Cost model required  4) Per server/LUN  5) Per server/LUN  6) Cost model required
Scalability/Flexibility – adding new nodes/LUNs: 1) Y  2) Y  3) Y  4) N  5) N  6) Y, DFS for file
Manageability: 1) MSCS skills required  2) VMware skills required  3) Complex combination of both technologies  4) DFS skills required  5) Existing skills, but increased per server  6) DFS and VMware skills required
User access to shares via UNC: 1) Single name  2) Multiple names  3) Single name  4) Single name  5) Multiple names  6) DFS namespace
Future proofing – migration to new hardware/OS: 1) Moderately complicated migration  2) Relatively simple upgrade path, reattaching LUNs or adding another VM  3) Moderately complicated migration  4) Relatively simple  5) Relatively simple  6) Relatively simple
Hardware on Vendor HCL: 1) ?  2) ?  3) ?  4) ?  5) ?  6) ?
Backup/restore: 1) ?  2) Standard file backup  3) VCB or ?  4) Standard file backup  5) Standard file backup  6) VCB or ?
Printer administration: 1) Simplified with Cluster 2003  2) Duplicated effort on each print server  3) Simplified with Cluster 2003  4) N/A  5) Duplicated effort on each print server  6) Duplicated effort on each print server
Service and Event Monitoring: 1) Cluster monitoring required  2) Standard monitoring for virtual servers, host monitoring required  3) Cluster monitoring for virtual servers, VMware host monitoring required  4) Standard monitoring  5) Standard monitoring  6) Standard monitoring for virtual servers, host monitoring required

1. Basic disks on a Microsoft server cluster can be extended if new space is visible on the LUN. The disks cannot be dynamic in MSCS.


Modifying Exchange mailbox permissions

Two methods are discussed in this post regarding command-line modifications to Exchange 2003 mailbox security, VBScript and ADModify.Net.

Method 1 - VBScript

The first method is to use a VBScript - modifying Exchange mailbox rights can be done in a very similar fashion to modifying the ntSecurityDescriptor attribute. However, to make this work you need CDOEXM, which is installed with the Exchange console on your administrative workstation.

The CDOEXM (Collaboration Data Objects for Exchange Management) objects provide an interface called IExchangeMailbox. This interface inherits from IMailboxStore, and provides an additional property that is key to modifying Exchange permissions - the MailboxRights property.

Instead of using the msExchMailboxSecurityDescriptor attribute, you need to get and set the DACL using the MailboxRights property. The msExchMailboxSecurityDescriptor attribute is only a back-link from the Exchange store with no matching forward-link - modifying it in AD won't replicate anywhere, and it would eventually be overwritten when the DACL is replicated back from the Exchange mailbox object.

I assume the reason for the backlink msExchMailboxSecurityDescriptor attribute is mostly convenience and backwards compatibility, ie, an application or an admin can read the Exchange permissions (excluding inheritance I think) without requiring CDO.

Note that the Microsoft references below provide example script on using MailboxRights to modify existing mailboxes, and although the articles are quite good and very detailed, the relevance of MailboxRights compared to msExchMailboxSecurityDescriptor wasn't immediately obvious (to me at least).

Method 2 - ADModify.Net

The second method is to use ADModify.Net - a very powerful (and therefore potentially dangerous) utility to perform bulk modifications on mailbox objects. It comes with both command-line and GUI versions.

admodcmd is the command-line version, and the following example command starts at the specified DN, filtering based on user accounts with a certain CN, and then adds full-access mailbox rights to these users for the specified group:

admodcmd -dn "OU=USERS,DC=domain,DC=com" -f "(&(objectClass=user)(objectCategory=person)(CN=user*))" -addtomailboxrights domain\GroupToAdd ACE_MB_FULL_ACCESS

Issuing Query....

12 items found matching the specified filter.

0% 50% 100%

Successful changes: 12
Already set to specified value: 0
Failed changes: 0

Operation Completed in 4.15625 seconds.


How to set Exchange Server 2003 and Exchange 2000 Server mailbox rights on a mailbox that exists in the information store

How to set Exchange Server 2000 and 2003 mailbox rights at the time of mailbox creation

Exchange 2003 SP2 Disabling Mapi/Non-Cached Access Per User

ADModify.NET - Home


Nested 'for /f' catch-all

Have you ever used a nested 'for /f' command and realised that it's skipping some of your items because the operation you are executing against each item is not matching your primary criteria?

For example, if you have a control file listing computer names, and you want to report the IP address if the machine responds to a ping:

for /f %i in (test.txt) do for /f "tokens=1" %m in ('"ping -n 1 %i | find /i "reply""') do @echo %i,%m

For a machine that either can't be resolved or doesn't reply, that machine would be skipped from the output.

However inside the second for loop, you can check the errorlevel returned by the find command, and echo an alternate response:

for /f %i in (test.txt) do @for /f "tokens=3 delims=: " %m in ('"ping -n 1 %i | find /i "reply" & if errorlevel 1 echo 1 2 NoReply"') do @echo %i,%m

Note that the nested 'for /f' command above is extracting the third token, so the errorlevel echo also requires three tokens…

This provides a catchall so that for each of your items, something will be echoed, useful in many situations when you're potentially trying to check dozens or hundreds of items for something and you want results for all items.
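For illustration only, here is the same catch-all pattern sketched in Python, with a hypothetical probe function standing in for 'ping': every input item produces a line of output, with a 'NoReply' fallback when the filter matches nothing.

```python
def catch_all(items, probe):
    """For each item return (item, token): the third token of the first
    matching probe line, or 'NoReply' when nothing matches, so that
    every input item produces output."""
    results = []
    for item in items:
        token = "NoReply"                      # fallback, like 'echo 1 2 NoReply'
        for line in probe(item):
            if "reply" in line.lower():
                token = line.replace(":", " ").split()[2]
                break
        results.append((item, token))
    return results

def fake_ping(host):
    """Hypothetical stand-in for 'ping -n 1 <host>' output."""
    if host == "host1":
        return ["Reply from 10.0.0.1: bytes=32 time<1ms TTL=128"]
    return []                                  # unresolvable or no reply

print(catch_all(["host1", "host2"], fake_ping))
# [('host1', '10.0.0.1'), ('host2', 'NoReply')]
```

The point is the same as in the batch version: the fallback is produced inside the per-item loop, so an item that fails the filter is reported instead of silently dropped.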


Thursday, May 15, 2008

PowerShell FindFirstFileW bypassing MAX_PATH

By default it seems PowerShell uses the ANSI versions of FindFirstFile and FindNextFile, and is therefore limited to MAX_PATH - 260 characters in total. This PowerShell script uses in-line compiled VB.Net to call the wide unicode versions - FindFirstFileW and FindNextFileW - to bypass the ANSI MAX_PATH limitation. The results are essentially the same as a 'dir /s/b/a-d', but without the 'The filename or extension is too long.' or 'The directory name x is too long' errors.

If you use '-l', the script will only return deep paths, and will also provide the 8.3 equivalent of each deep path - a useful way to access these files. For example, using this method you can access a file over UNC (eg \\server\share) more than 18 levels deep ((260-16)/13).

Note that these wide calls only bypass MAX_PATH when the path carries the \\?\ prefix to disable path parsing. Just specify a mapped drive or local path normally (eg c:\temp); the script will automatically prepend \\?\.
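The prefixing step itself is simple string handling. A rough Python sketch (the \\?\ and \\?\UNC\ forms are standard Win32 conventions; the function name is mine):

```python
def extended_length(path):
    r"""Prepend the \\?\ prefix so the wide Win32 file APIs skip
    MAX_PATH path parsing; UNC paths use the \\?\UNC\ form."""
    if path.startswith("\\\\?\\"):
        return path                          # already prefixed
    if path.startswith("\\\\"):              # \\server\share -> \\?\UNC\server\share
        return "\\\\?\\UNC\\" + path[2:]
    return "\\\\?\\" + path                  # c:\temp -> \\?\c:\temp

print(extended_length("c:\\temp"))           # \\?\c:\temp
```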

I've been experimenting with using PowerShell to dynamically compile VB.Net or C# code, and within that managed code, calling unmanaged platform invoke operations to get to APIs. I like the flexibility of using a scripting language rather than compiled code, and while this certainly isn't as functional as 'dir', it was useful to me when at least trying to get a list of deep files.

## FindFiles.ps1 ##
param (
   [string] $dirRoot = $pwd,
   [string] $Spec = "*.*",
   [bool] $longOnly = $false
)

# Changes:
#  23/05/2008, Wayne Martin, Added the option to only report +max_path entries, and report the short path of those directories (which makes it easier to access them)
# Description:
#  Use the wide unicode versions to report a directory listing of all files, including those that exceed the MAX_PATH ANSI limitations
# Assumptions, this script works on the assumption that:
#  There's a console to write the output from the compiled VB.Net
# Author:
#  Wayne Martin, 15/05/2008
# Usage
#  PowerShell . .\FindFiles.ps1 -d c:\temp -s *.*
#  PowerShell . .\FindFiles.ps1 -d c:\temp
#  PowerShell . .\FindFiles.ps1 -d g: -l $true
# References:

$provider = new-object Microsoft.VisualBasic.VBCodeProvider
$params = new-object System.CodeDom.Compiler.CompilerParameters
$params.GenerateInMemory = $True
$refs = "System.dll","Microsoft.VisualBasic.dll"

$txtCode = @'
Imports System
Imports System.Runtime.InteropServices
Class FindFiles

Const ERROR_SUCCESS As Long = 0
Const FILE_ATTRIBUTE_DIRECTORY As UInteger = &H10   ' Directory attribute bit from winnt.h
Private Const MAX_PREFERRED_LENGTH As Long = -1

<StructLayout(LayoutKind.Sequential, CharSet:=CharSet.Unicode)> _
Public Structure WIN32_FIND_DATAW
    '''DWORD->unsigned int
    Public dwFileAttributes As UInteger
    Public ftCreationTime As FILETIME
    Public ftLastAccessTime As FILETIME
    Public ftLastWriteTime As FILETIME
    '''DWORD->unsigned int
    Public nFileSizeHigh As UInteger
    '''DWORD->unsigned int
    Public nFileSizeLow As UInteger
    '''DWORD->unsigned int
    Public dwReserved0 As UInteger
    '''DWORD->unsigned int
    Public dwReserved1 As UInteger
    <MarshalAs(UnmanagedType.ByValTStr, SizeConst:=260)> _
    Public cFileName As String
    <MarshalAs(UnmanagedType.ByValTStr, SizeConst:=14)> _
    Public cAlternateFileName As String
End Structure

Public Structure FILETIME
    '''DWORD->unsigned int
    Public dwLowDateTime As UInteger
    '''DWORD->unsigned int
    Public dwHighDateTime As UInteger
End Structure

Partial Public Class NativeMethods
    '''Return Type: HANDLE->void*
    '''lpFileName: LPCWSTR->WCHAR*
    '''lpFindFileData: LPWIN32_FIND_DATAW->_WIN32_FIND_DATAW*
    <DllImport("kernel32.dll", CharSet:=CharSet.Unicode, SetLastError:=True)> _
    Public Shared Function FindFirstFileW( ByVal lpFileName As String,  ByRef lpFindFileData As WIN32_FIND_DATAW) As System.IntPtr
    End Function
    '''Return Type: BOOL->int
    '''hFindFile: HANDLE->void*
    '''lpFindFileData: LPWIN32_FIND_DATAW->_WIN32_FIND_DATAW*
    <DllImport("kernel32.dll", CharSet:=CharSet.Unicode, SetLastError:=True)> _
    Public Shared Function FindNextFileW( ByVal hFindFile As System.IntPtr,  ByRef lpFindFileData As WIN32_FIND_DATAW) As  Boolean
    End Function

    '''Return Type: BOOL->int
    '''hFindFile: HANDLE->void*
    <DllImport("kernel32.dll", SetLastError:=True)> _
    Public Shared Function FindClose(ByVal hFindFile As System.IntPtr) As  Boolean
    End Function

    '''Return Type: DWORD->unsigned int
    '''lpszLongPath: LPCWSTR->WCHAR*
    '''lpszShortPath: LPWSTR->WCHAR*
    '''cchBuffer: DWORD->unsigned int
    <DllImport("kernel32.dll", CharSet:=CharSet.Unicode, SetLastError:=True)> _
    Public Shared Function GetShortPathNameW( ByVal lpszLongPath As String,  ByVal lpszShortPath As System.Text.StringBuilder, ByVal cchBuffer As UInteger) As UInteger
    End Function

End Class

    Dim FFW as New NativeMethods

Function Main(ByVal dirRoot As String, ByVal sFileSpec As String, Byval longOnly As Boolean) As Long
    Dim result As Long

    result = FindFiles(dirRoot, sFileSpec, longOnly)

    main = result          ' Return the result
End Function

Function FindFiles(ByRef sDir As String, ByVal sFileSpec as String, Byval longOnly As Boolean) As Long
    Const MAX_PATH As Integer = 260
    Dim FindFileData as WIN32_FIND_DATAW
    Dim hFile As Long
    Dim sFullPath As String
    Dim sFullFile As String
    Dim length as UInteger
    Dim sShortPath As New System.Text.StringBuilder(1024)   ' Pre-sized buffer for the 8.3 path

    sFullPath = "\\?\" & sDir

    'console.writeline(sFullPath & "\" & sFileSpec)

    hFile = FFW.FindFirstFileW(sFullPath & "\" & sFileSpec, FindFileData)     ' Find the first object
    If hFile <> -1 Then            ' Anything other than INVALID_HANDLE_VALUE means something was found
      If (FindFileData.dwFileAttributes AND FILE_ATTRIBUTE_DIRECTORY)  <> FILE_ATTRIBUTE_DIRECTORY Then  ' Is this a file?
        sFullFile = sFullPath & "\" & FindFileData.cFileName
        If (longOnly AND sFullFile.Length >= MAX_PATH) Then
          length = FFW.GetShortPathNameW(sFullPath, sShortPath, sShortPath.Capacity) ' Get the 8.3 path
          console.writeline(sFullFile & " " & sShortPath.ToString())  ' Report the full path and the short path
        ElseIf (NOT longOnly) Then
          console.writeline(sFullFile)                                ' Report every file when not limited to long paths
        End If
      End If

      While FFW.FindNextFileW(hFile, FindFileData)        ' For all the items in this directory
        If (FindFileData.dwFileAttributes AND FILE_ATTRIBUTE_DIRECTORY) <> FILE_ATTRIBUTE_DIRECTORY Then ' Is this a file?
          sFullFile = sFullPath & "\" & FindFileData.cFileName
          If (longOnly AND sFullFile.Length >= MAX_PATH) Then
            length = FFW.GetShortPathNameW(sFullPath, sShortPath, sShortPath.Capacity) ' Get the 8.3 path
            console.writeline(sFullFile & " " & sShortPath.ToString())  ' Report the full path and the short path
          ElseIf (NOT longOnly) Then
            console.writeline(sFullFile)                                ' Report every file when not limited to long paths
          End If
        End If
      End While
      FFW.FindClose(hFile)           ' Close the handle
      FindFileData = Nothing
    End If

    hFile = FFW.FindFirstFileW(sFullPath & "\" & "*.*", FindFileData)      ' Repeat the process looking for sub-directories using *.*
    If hFile <> -1 Then
      If (FindFileData.dwFileAttributes AND FILE_ATTRIBUTE_DIRECTORY) AND _
          (FindFileData.cFileName <> ".") AND (FindFileData.cFileName <> "..") Then
        Call FindFiles(sDir & "\" & FindFileData.cFileName, sFileSpec, longOnly)      ' Recurse
      End If

      While FFW.FindNextFileW(hFile, FindFileData)
        If (FindFileData.dwFileAttributes AND FILE_ATTRIBUTE_DIRECTORY) AND _
            (FindFileData.cFileName <> ".") AND (FindFileData.cFileName <> "..") Then
          Call FindFiles(sDir & "\" & FindFileData.cFileName, sFileSpec, longOnly)     ' Recurse
        End If
      End While
      FFW.FindClose(hFile)           ' Close the handle
      FindFileData = Nothing
    End If

End Function

end class
'@

$params.ReferencedAssemblies.AddRange($refs)
$cr = $provider.CompileAssemblyFromSource($params, $txtCode)
if ($cr.Errors.Count) {
    $codeLines = $txtCode.Split("`n");
    foreach ($ce in $cr.Errors) {
        write-host "Error: $($codeLines[$($ce.Line - 1)])"
        write-host $ce
        #$ce | out-default
    }
    Throw "INVALID DATA: Errors encountered while compiling code"
}
$mAssembly = $cr.CompiledAssembly
$instance = $mAssembly.CreateInstance("FindFiles")

$result = $instance.main($dirRoot, $Spec, $longOnly)


Sunday, May 11, 2008

Running PowerShell Scripts from ASP.Net

I have been writing a few powershell scripts for administrative purposes lately and I thought it would be useful if I could run these scripts from a web page.

You could of course re-write the PowerShell scripts in C# or VB.Net, but I think it's quite useful to have a single script, that you can either run from the commandline, or through a web page.

The following example aspx shows how to read a script from file, pass parameters to the script and then run the script, echoing the output from the script to the web page.

To use this you will need:

  1. This aspx in the root of a web application (see below)
  2. A web.config for the ASP.Net config (see below)
  3. If your application is running in an application pool as Network Service, that context will need read access to the root of the virtual directory
  4. A powerShell script (or just paste some script to the textbox)
  5. System.Management.Automation in the .\bin directory of the web application

I even tested a PowerShell script that compiles dynamic VB.Net code and calls an API through PInvoke inside the VB.Net, and that PS1 script worked fine.

Note the use of "Out-String" to return the output in string format. I did try and use pipeline.Output, but that didn't return anything when the PS1 was calling write-output. Also, the parameter passing uses $args, as I couldn't get Params() to work as it does when at the top of a PS1 script.

Example web.config

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.net>
    <defaultProxy enabled="false"/>
  </system.net>
  <system.web>
    <pages validateRequest="false"/>
    <compilation defaultLanguage="c#" debug="false"/>
    <customErrors mode="Off"/>
    <authentication mode="Windows"/>
    <identity impersonate="true"/>
  </system.web>
</configuration>

Example test.ps1:

if ($args.count -ge 2)
{ $text = $args[1]}
write-output $text

Example Test.aspx

<%@ Page language="c#" AutoEventWireup="true" Debug="true" %>
<%@ Import Namespace="System" %>
<%@ Import Namespace="System.IO" %>
<%@ Import Namespace="System.Management.Automation.Runspaces" %>
<%@ Import Namespace="System.Management.Automation" %>
<%@ Import Namespace="System.Collections.ObjectModel" %>

<script language="C#" runat="server">

// The previous lines use <%...%> to indicate script code, and they specify the namespaces to import. As mentioned earlier, the assemblies must be located in the \Bin subdirectory of the application's starting point.

private void Button3_Click(object sender, System.EventArgs e)
{
    String fp = Server.MapPath(".") + "\\" + tPowerShellScriptName.Text;
    StreamReader sr = new StreamReader(fp);
    tPowerShellScriptCode.Text = sr.ReadToEnd();
    sr.Close();
}

private void Button2_Click(object sender, System.EventArgs e)
{
    tPowerShellScriptResult.Text = RunScript(tPowerShellScriptCode.Text);
}

private string RunScript(string scriptText)
{
    Runspace runspace = RunspaceFactory.CreateRunspace();
    runspace.Open();
    Pipeline pipeline = runspace.CreatePipeline();

    // Create a new runspaces.command object of type script
    Command cmdScript = new Command(scriptText, true, false);
    cmdScript.Parameters.Add("-t", txtInput.Text);
    pipeline.Commands.Add(cmdScript);
    //You could also use: pipeline.Commands.AddScript(scriptText);

    // Re-format all output to strings
    pipeline.Commands.Add("Out-String");

    // Invoke the pipeline
    Collection<PSObject> results = pipeline.Invoke();

    //String sresults = pipeline.Output.Count.ToString();
    //sresults = sresults + "," + results.Count.ToString();
    String sresults = "";

    foreach (PSObject obj in results)
    {
        sresults = sresults + obj.ToString();
    }

    // close the runspace and set to null
    runspace.Close();
    runspace = null;

    return sresults;
}


<form id="Form1" method="post" runat="server">
<P> <asp:Label id="Label1" runat="server" Width="104px">Parameter:</asp:Label>
<asp:TextBox id="txtInput" runat="server"></asp:TextBox></P>
<P> <asp:Button id="Button3" runat="server" Text="Load" OnClick="Button3_Click"></asp:Button> </P>
<P> <asp:Button id="Button2" runat="server" Text="Run" OnClick="Button2_Click"></asp:Button> </P>
<P> <asp:Label id="Label2" runat="server" >Relative script name:</asp:Label>
<asp:TextBox id="tPowerShellScriptName" Text="test.ps1" runat="server"></asp:TextBox></P>
<P> <asp:TextBox rows="20" columns="120" TextMode="multiline" id="tPowerShellScriptCode" runat="server"></asp:TextBox></P>
<P> <asp:TextBox rows="8" columns="120" TextMode="multiline" id="tPowerShellScriptResult" runat="server"></asp:TextBox></P>
</form>


Thursday, May 8, 2008

Binary <-> Hex String files with Powershell

I needed a method of converting a text representation of a binary file (eg. the text 'AE00F00D' in a file) to the binary byte equivalent and vice versa.

Included are the PowerShell scripts I ended up writing for the task. The core parts of the scripts are below, and the full .PS1 scripts follow.

Note that I have updated the original post, as the '-readcount 0' when reading the file, pre-allocating the array (thanks for the suggestion Phil) and using an assigned variable in the for loop made this much faster.

[byte[]]$bytes = Get-Content -encoding byte -readcount 0 -path $BinaryFile
for ($i = 0; $i -le $count-1; $i++)
{ $hex = "{0:x}" -f $bytes[$i]
  [void]$output.Append($hex.PadLeft(2, "0"))  # Pad any single digits
}
set-content $OutputFile -value $output.ToString()

$HexString = Get-Content -readcount 0 -path $HexStringFile
for ( $i = 0; $i -le $count-1; $i+=2 )
{ $bytes[$x] = [byte]::Parse($hexString.Substring($i,2), [System.Globalization.NumberStyles]::HexNumber)
  $x += 1
}
set-content -encoding byte $OutputFile -value $bytes

I was using this to restore and update some images stored in elements in an Operations Manager 2007 Management Pack XML file.
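As a sanity check of the round trip, here is the same conversion in Python (illustrative only; the function names are mine):

```python
def bytes_to_hexstring(data):
    """Binary -> text: two lowercase hex digits per byte, matching the
    PadLeft(2, "0") padding in the PowerShell above."""
    return "".join("{0:x}".format(b).rjust(2, "0") for b in data)

def hexstring_to_bytes(text):
    """Text -> binary: consume two hex digits per output byte."""
    return bytes(int(text[i:i + 2], 16) for i in range(0, len(text), 2))

print(bytes_to_hexstring(b"\xae\x00\xf0\x0d"))   # ae00f00d
```

Converting one way and back returns the original bytes, which is a quick way to verify the pair of scripts against each other.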


param (
   [string] $HexStringfile = "",
   [string] $OutputFile = ""
)

if ($HexstringFile -ne "") {
  $HexString = Get-Content -readcount 0 -path $HexStringFile
  $HexString = $HexString[0]
  $count = $hexString.length
  $byteCount = $count/2
  $bytes = New-Object byte[] $byteCount
  $byte = $null

  $x = 0
  for ( $i = 0; $i -le $count-1; $i+=2 ) {
    $bytes[$x] = [byte]::Parse($hexString.Substring($i,2), [System.Globalization.NumberStyles]::HexNumber)
    $x += 1
  }

  if ($OutputFile -ne "") {
    set-content -encoding byte $OutputFile -value $bytes
  } else {
    write-host "No output file specified"
  }
} else {
  write-host "Error, no input file specified"
}

# Byte.Parse Method (String, NumberStyles)


param (
   [string] $Binaryfile = "",
   [string] $OutputFile = ""
)

if ($BinaryFile -ne "") {
  [byte[]]$bytes = Get-Content -encoding byte -readcount 0 -path $BinaryFile # Use a readCount of 0 to read all lines at once

  $output = new-object System.Text.StringBuilder # Using stringBuilder seems faster than $a = $a + "a"
  $count = $bytes.length    # The loop seems much faster when using a pre-set value
  for ($i = 0; $i -le $count-1; $i++) {
    $hex = "{0:x}" -f $bytes[$i]
    [void]$output.Append($hex.PadLeft(2, "0"))  # Pad any single digits
  }
  if ($OutputFile -ne "") {
    set-content $OutputFile -value $output.ToString()
  } else {
    write-host "No output file specified"
  }
} else {
  write-host "Error, no input file specified"
}

# .net formatting decimal to hex


Tuesday, May 6, 2008

OpsMgr 2007 Current Performance Instances

In the default Operations Manager 2007 reports there doesn’t seem to be a report that will show you the most recent gathered performance instance of something. For example, for a group of servers, I wanted a report that showed the current free disk space on all logical drives of those servers.

I couldn’t find anything in OpsMgr 2007, so to start with I’ve written a SQL query that will provide the information from the Operations Manager Data Warehouse database.

Note that I don’t know much about SQL or Operations Manager, so this may not be the best method.

The following query generates a temporary named result set, partitioning that result set using the ROW_NUMBER() ranking windowing function over managed entities by time. This provides a method of selecting the most recent sample of the specified performance rule for each managed entity.

WITH CurrentDiskFree AS
(SELECT PRI.Instancename, DateAdd(Hour, 10, PPR.DateTime) as DateTime,

ME.Path, ME.ManagedEntityRowID, PPR.SampleValue,

row_number() over (partition by ME.ManagedEntityRowID order by PPR.DateTime DESC) as RowNumber
FROM vPerformanceRuleInstance PRI
inner join vPerformanceRule PR on PRI.RuleRowID = PR.RuleRowID
inner join perf.vPerfRaw PPR on PRI.PerformanceRuleInstanceRowID = PPR.PerformanceRuleInstanceRowID
inner join vManagedEntity ME on ME.ManagedEntityRowID = PPR.ManagedEntityRowID
inner join vRule RU ON RU.RuleRowID = PR.RuleRowID
WHERE RU.RuleDefaultName = 'Logical Disk Free Megabytes'
AND ME.Path like '%server%')

SELECT * FROM CurrentDiskFree
where RowNumber = 1


  1. You could also restrict the query based on the members of a group, a more standard method in Operations Manager terms.
  2. One managed entity path can and will usually have more than one performance rule instance and ManagedEntityRowID. For example, C: and D: drive in a server would have one row each.
  3. The date recorded in the database is UTC - GMT+0; I’ve calculated GMT+10 for my local timezone in the SQL query.
  4. To make this generic in a reporting sense, something with ManagementGroupID should be added to the query to use the appropriate management group.

An example resultset:

InstanceName  DateTime                 Path                SampleValue  RowNumber
C:            2008-05-06 22:10:50.000  server1.domain.com  27725150     1
D:            2008-05-06 22:15:50.000  server1.domain.com  27836749     1
C:            2008-05-06 22:30:49.000  server2.domain.com  28212816     1
D:            2008-05-06 22:10:49.000  server3.domain.com  18336770     1
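The windowing logic is easier to see outside SQL. Here is a small Python equivalent of keeping only 'RowNumber = 1' per partition (the sample data is invented for illustration):

```python
def latest_per_entity(samples):
    """Python equivalent of ROW_NUMBER() OVER (PARTITION BY entity
    ORDER BY time DESC) filtered to RowNumber = 1: keep only the most
    recent (time, value) sample for each entity."""
    latest = {}
    for entity, when, value in samples:
        if entity not in latest or when > latest[entity][0]:
            latest[entity] = (when, value)
    return {entity: value for entity, (when, value) in latest.items()}

# Invented sample data - (entity, timestamp, free megabytes)
samples = [
    ("server1 C:", "2008-05-06 22:00", 27000000),
    ("server1 C:", "2008-05-06 22:10", 27725150),   # newer sample wins
    ("server1 D:", "2008-05-06 22:15", 27836749),
]
print(latest_per_entity(samples))
# {'server1 C:': 27725150, 'server1 D:': 27836749}
```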


Monday, May 5, 2008

Impersonating a user without passwords

While playing with starting processes in the winlogon secure desktop and unlocking a machine without a password (remoteunlock.exe), I experimented with using ZwCreateToken through ztokenman.exe to start a process as a user without knowing their password.

Combined with psexec, this allows you to run something as a user that’s interactively logged on, while their workstation is locked and without knowing their password.

Start a process on the winlogon desktop, used when the machine is locked:

  • psexec /s \\%computer% cmd /c c:\windows\temp\psexec /accepteula /x /d /s cmd

From this command prompt, run ztokenman.exe and:

  1. In the Process drop-down, select a process owned by the user (eg explorer.exe)
  2. Click DumpProcessToken
  3. In the 'Create a Process With the Current Token' text-box, type cmd.exe
  4. Click 'CreateProcessAsUser with Current Token'

From the cmd.exe that opens, this should be under the context of the interactive user of the workstation. For example, if you run net use, you should see the connections the user has.

This uses an undocumented API - ZwCreateToken, after calling OpenProcessToken to duplicate a token from an existing process.

Is this actually useful for anything? Probably not, but it’s interesting nonetheless. Note that remoteunlock.exe will actually provide access to the desktop for the interactive winlogon session, even if the machine is locked.


RunAsEx and ztokenman:

Unlocking XP/2003 without passwords



Running a process in the secure winlogon desktop

Have you ever wanted to run something in the secure desktop controlled by winlogon - the desktop you see when nobody is logged on?

I have, and recently realised that psexec supports this - with the '-x' command. For example, if you run the following command and then press ctrl+alt+del or logoff, you’ll still have a console with the ability to start other commands:

  • psexec /x /d /s cmd

This only works on the local machine, but of course psexec allows you to run things remotely! The following command therefore uses psexec to remotely run cmd to start psexec locally to run cmd in the local winlogon desktop of the remote computer:

  • psexec /s \\%computer% cmd /c psexec /accepteula /x /d /s cmd

This was done using psexec.exe v1.94 and the second command assumes that psexec.exe is available in the path on the remote computer.


Unlocking XP/2003 without passwords


Shadow an XP Terminal Services session

I came across a method to shadow an XP desktop in a very roundabout sort of way. It comes from a Microsoft TechNet article; below is an example I find clearer, along with the link to the original article:

  1. mstsc to a 2003 server (eg. server1)
  2. From the 2003 Terminal Services session, mstsc to an XP machine (eg. workstation1)
  3. Open another TS session to the 2003 server in step 1 (server1)
    1. Run ‘query session’ to find the RDP session ID of the session in step 1
    2. Run ‘shadow %id%’ to shadow the TS session
  4. Both 2003 TS sessions now have an interactive child session to the XP machine.
  5. Press Ctrl-* to terminate the shadow on the second 2003 TS session (the minus and star keys from the numeric keypad)

This is a bit clunky, but it does provide a method to allow two people interactive control of an XP desktop, without having to use remote assistance.

How To Shadow a Remote Desktop Session in Windows XP Professional


Find where a user is logged on from

In a large environment, it's often useful to know which computer a user is logged on to, but historically that's not been easy to find - until FU, that is.

The following command will tell you which workstation a user is logged on to, based on the assumption that normal user accounts in Active Directory have a home drive set and automatically mapped at logon. This assumes the file server hosting the user home drives is Windows 2000 or later.

To run it, you'll need the dsquery/dsget utilities, wmic, and a command prompt; a doskey macro makes it easier to run.

Set user=UserA
for /f "tokens=2 delims=\" %i in ('"dsquery user -name %user% | dsget user -hmdir | find /i "%user%""') do @for /f "skip=1 tokens=1-3" %m in ('"wmic /node:"%i" path win32_serversession WHERE "UserName Like '%user%'" Get ComputerName,ActiveTime,IdleTime"') do @for /f "tokens=2" %q in ('"ping -a %n -n 1 | find /i "pinging""') do @echo %q %user% %n %i %m %o

I realise that’s not easy to type in, so you can use a doskey macro, by:
• Putting the command below in a text file
• Running doskey /macrofile=%path%\fu.txt (or modifying your command shell to run it automatically).

FU=for %g in ($1 $2 $3 $4 $5 $6 $7 $8 $9) do @for /f "tokens=2 delims=\" %i in ('"dsquery user -name %g | dsget user -hmdir | find /i "%g""') do @for /f "skip=1 tokens=1-3" %m in ('"wmic /node:"%i" path win32_serversession WHERE "UserName Like '%g'" Get ComputerName,ActiveTime,IdleTime"') do @for /f "tokens=2" %q in ('"ping -a %n -n 1 | find /i "pinging""') do @echo %q %g %n %i %m %o

You can then run:

fu UserA UserA fileserver 123 11

Or specify up to 9 usernames, eg:

fu UserA UserB UserC UserD

Note that the major limitation of using WMI (this command) or the ADSI provider is that there is apparently no way of controlling the NetSessionEnum() call to specify the information level you require. Therefore, you'll need administrative privileges on the file server for this to work. See my post for a powershell equivalent that uses information level 10 instead of 502.

Granting WMI access to standard users is not enough; it seems access to Win32_ServerSession results in a call to NetSessionEnum, and regardless of the resultset you specify, access is denied unless you're an administrator. I went some way down the track of modifying the DACLs on securable objects to see whether the permission to enumerate sessions could be granted, but I couldn't find a method (using winobj mainly).

If you do want to grant WMI access for other reasons, on a Windows Server 2003 box, you can:

  1. Use wmimgmt.msc to provide 'enable account' and 'remote enable' to the root\CIMv2 namespace (assuming you are trying to access the root)
  2. Add a security group to the local 'Distributed COM Users' group on the server. This provides remote access for DCOM calls.

After doing this, a standard call to a normal class such as win32_operatingsystem works, but still not to win32_serversession. This does not grant the right to execute methods or write data.


The article below indicates that local administrative rights or server operator rights are required to enumerate sessions: NetSessionEnum Function

The Win32_ServerSession Windows Management Instrumentation class returns incorrect server session instances on a Windows Server 2003-based computer

Access to WMI Namespaces

Securing a Remote WMI Connection

DCOM Security Enhancements

Example: Getting WMI Data from a Remote Computer

Authorize WMI users and set permissions

How To Set WMI Namespace Security in Windows XP

Low-level Security Descriptor Functions

Securable Objects

Access to WMI Securable Objects



About Me

I’ve worked in IT for over 20 years, and I know just about enough to realise that I don’t know very much.