New Tool: Chocolatey Application Wrapper for MDT

Update: Updated script location

I’ve been spending some time recently working on my own private Deployment Share, and one of the downsides is the overhead of keeping my application packages up to date. The worst offenders are of course Google Chrome and the Adobe Flash and PDF readers; these packages seem to change constantly, and I need to find and apply updates frequently. If only there was a better way, something to make adding packages easier, like my Dell Driver Pack Update tool, or the ZTIWindowsUpdate.wsf script found in MDT that runs Windows Update/WSUS. Any way to keep my environment up to date without the manual hassle. :^)

There are several tools out there on the internet that assist with application package management. At first I looked at Ninite.com; however, their tools are designed for end users, and if you want an automated solution (like I need), you must pay a per-machine licensing fee. (yuck)

Then I started looking at Chocolatey, which provides a command line interface for installing programs, similar to Ninite.com without the UI fluff. I came across a few packages found in Ninite but not in Chocolatey, but that list is small (Google Talk, AIM, Yahoo!, KMPlayer, Winamp, K-Lite Codecs, Mozy, RealVNC), and the list of packages found in Chocolatey is very large. Cool!

Another advantage of Chocolatey is that it’s in alignment with future Microsoft plans around packaging (see OneGet). However, OneGet still requires some installation overhead that I didn’t want to deal with for this release, so that’s something for the future.

So I spent some time experimenting, reverse engineering Chocolatey and scripting a workable solution, and I think I have something ready for use.

The Script

https://raw.githubusercontent.com/keithga/DeployShared/master/Templates/Distribution/Scripts/Extras/Install-Chocolatey.ps1

The script is fairly straightforward. When calling the script, we pass in a list of packages through the command line; if no parameters are passed in, we default to the Chocolatey001 … Chocolatey00n variables you may be familiar with from adding list variables to your CustomSettings.ini file in MDT.

[CmdletBinding()]
param(
    # Default: the Chocolatey001…Chocolatey00n list variables from CustomSettings.ini,
    # exposed by the MDT task sequence environment provider
    [parameter(ValueFromRemainingArguments=$true)]
    [string[]]$Packages = $TSEnvList:Chocolatey
)

Then we determine a local path for the Chocolatey install, first checking whether it is already installed, and otherwise forcing it to use the local MDT temp directory (C:\MININT) if present. This means that Chocolatey won’t persist after the MDT installation finishes, which is great for image capture.

write-verbose "Construct a local path for Chocolatey"
if ($env:ChocolateyInstall -eq $Null)
{
    $env:ChocolateyInstall = [Environment]::GetEnvironmentVariable( "ChocolateyInstall" , [System.EnvironmentVariableTarget]::User)
    if ($env:ChocolateyInstall -eq $Null)
    {
        $env:ChocolateyInstall = join-path ([System.Environment]::GetFolderPath("CommonApplicationData")) "Chocolatey"
        if ($tsenv:LogPath -ne $null)
        {
            $env:ChocolateyInstall = join-path $tsenv:LogPath "Chocolatey"
        }
    }
}

We look to see if the choco.exe program is present locally, and if not, we install Chocolatey from the internet.

$ChocoExe = join-path $env:ChocolateyInstall "bin\choco.exe"

write-verbose "Chocolatey Program: $ChocoExe"
if ( ! (test-path $ChocoExe ) )
{
    write-verbose "Install Chocolatey..."
    Invoke-Expression ((new-object net.webclient).DownloadString('https://chocolatey.org/install.ps1'))
    if (!(test-path $ChocoExe))
    {
        throw "Chocolatey Install not found!"
    }
}

Finally, we go through all of the requested packages and kick off each installation. The list of requested packages comes from the command line or from the Chocolatey00n variables described above.

write-verbose "Install Chocolatey packages from within MDT"
foreach ( $Package in $Packages ) 
{
    write-verbose "Install Chocolatey Package: $ChocoExe $Package"
    & $ChocoExe install $Package 2>&1 | out-string
}

write-verbose "Chocolatey install done"

I had some problems with Chocolatey and the MDT PowerShell provider choking on some overly verbose output, so I redirect the error stream to standard output; that’s what the weird 2>&1 is all about.

Integration with MDT LiteTouch

To add this into your MDT LiteTouch environment, simply download ZTIChocolatey-Wrapper.ps1 into your \\Server\DeploymentShare$\Scripts folder.

Then add a new step to your task sequence pointing to the PowerShell script: %ScriptRoot%\ZTIChocolatey-Wrapper.ps1

You can add entries to your CustomSettings.ini file; here is a snippet from my server:

[Settings]
Priority=UUID,Role,Default
Properties=ModelAlias,UseWindowsUpdate,CustomizeStartScreen,Chocolatey(*)

[Default]
OSInstall=Y
SkipCapture=YES
...
Chocolatey001=VLC
Chocolatey002=7zip
Chocolatey003=speccy
Chocolatey004=WindowsADK
Applications001={91a974c5-6a30-4924-bec2-ade0f82793bc}
Applications002={631f0a5a-ed76-47b8-8ed4-74aeb43f4b49}

The Install Applications step will install the Applications001 and Applications002 GUIDs selected above, and the Chocolatey step will install the VLC, 7zip, speccy, and WindowsADK packages listed above.

Adding the Chocolatey wrapper as an Application

Alternatively, you can create an application package (without source files) and use the following command:

powershell.exe -NoProfile -ExecutionPolicy unrestricted "%ScriptRoot%\ZTIChocolatey-Wrapper.ps1" -verbose -Packages "AdobeReader"

Future

The next logical step is to create a LiteTouch wizard page for Chocolatey, allowing you to select packages from a list in the GUI. But I’m a bit stumped on the best way to get a list of all available packages ( clist.exe ) *and* get the package metadata in a graceful way. Right now, I only know how to download each package from the server as a *.zip file, extract the *.xml file, and display it. If you know how to use PowerShell to get the list of packages *and* a list of descriptions, versions, and icons, please let me know.
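
One possible approach I may explore: Chocolatey is built on NuGet, so package metadata should be available from the OData feed. This is an untested sketch, and the feed URL and query syntax are assumptions based on the standard NuGet v2 API:

# Query the Chocolatey OData (NuGet v2) feed for package metadata
$Feed  = 'https://chocolatey.org/api/v2/Packages'
$Query = '?$filter=IsLatestVersion eq true&$top=25'
[xml]$Result = (New-Object net.webclient).DownloadString($Feed + $Query)
$Result.feed.entry | ForEach-Object {
    [pscustomobject]@{
        ID          = $_.title.'#text'
        Version     = $_.properties.Version
        Description = $_.properties.Description
        IconUrl     = $_.properties.IconUrl
    }
}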

Customizing Your Windows Deployments

Hello again from the Mall of America in Minneapolis, Minnesota, at the Midwest Management Summit 2014 (MMS for short). #mmsminnesota

Today I am giving a presentation on “Customizing Your Windows Deployments – Tips, Tricks, and Code”.

Links

Stick with Well-Known and Proven Solutions
MDT Wizard Studio

Code

I wanted to share the source code examples from my presentation:

MMS2014 – Customizing Your Windows Deployments.zip

Inside you will find the following:

MMS2014 - Customizing Your Windows Deployments.pptx
1 - Unattend\makevhdx.ps1
1 - Unattend\make_client_vhd.cmd
1 - Unattend\Run-VHD.ps1
1 - Unattend\unattend.Workgroup.xml
1 - Unattend\unattend.xml
2 - ztigather\CustomSettings.ini
2 - ztigather\UserExit.vbs
3 - CMD\AppWrapper.cmd
3 - CMD\Source\7z925x64.msi
4 - VBS\ZTI_XXXXXXX.wsf
4 - VBS\ZTIUtility.vbs
5 - PS1\CM1.CleanUp.ps1
5 - PS1\Get-Service.ps1
5 - PS1\Test-MDTPowerShell.cmd
5 - PS1\Test-MDTPowershell.ps1

In the “1 – Unattend” folder is my example of how to take a Windows *.iso image and mount it in Hyper-V.
In the “2 – ZTIGather” folder is a quick example of a CustomSettings.ini file and a UserExit.vbs script.
In the “3 – CMD” folder is an example of using a CMD file to wrap commands in a batch script.
In the “4 – VBS” folder I have some examples of how to create MDT VBScript files.
In the “5 – PS1” folder I have an example of how to call a PowerShell script from within MDT.

-k

Offline Patching with help from MDT

With Windows 10 on the way, I’ve been interested in how releases and builds will be handled.

It’s become clear that Microsoft is moving toward a model of more frequent major builds with incremental *.msu updates in between. What’s going to be interesting is to see how Microsoft makes new builds available as updates within enterprises. We’ve seen some interesting ideas with the Windows 8.1 Updates, where Microsoft released a major update via a couple of *.msu packages distributed via WSUS.

The future of updating

Another interesting avenue is the ability to “Upgrade” machines from Windows 7, Windows 8, and Windows 8.1 to Windows 10. This is a change for Windows Setup; in the past, “Upgrade” was only available from the immediately previous version of Windows, so you couldn’t “Upgrade” from Windows XP to Windows 7, nor from Vista to Windows 8.

The problem with Upgrading is that you can *only* use the clean Windows image directly from Microsoft; if you install your own applications or customizations into a custom reference image (the sysprep+capture method most IT departments are familiar with), you can’t use that custom image to Upgrade. Instead you need to use the MDT and/or SCCM “Refresh” method, migrating settings and files from one OS to the other using USMT.

It was because of the inability to use custom images, and the inability to Upgrade from older OS versions, that the “Upgrade” scenario was taken out of the current version of MDT; it was present in the older MDT 2010 versions.

Although you can’t use custom images that are sysprepped and captured, there is still some question as to whether we will be able to use images that have been “offline” patched using DISM in “Upgrade” scenarios. There are a couple of hundred updates for Windows 7, and no IT department in its right mind should ever send out an unpatched image with the *hope* that the user will get around to running Windows Update to patch the machine. So keeping deployment images up to date is important.

What remains to be seen is whether Microsoft will do a better job of releasing regular updates for Windows 10 so that IT departments don’t have to patch these OSes. There have been hints from Microsoft about regular full updates to the OS, but we won’t see any evidence of that until the second half of 2015.

Offline Servicing

All of this got me thinking about the Offline Servicing patching cycle in general. What is involved in patching an image?

The principles are available from Microsoft:
http://technet.microsoft.com/en-us/library/dd744559(v=ws.10).aspx
• Take an existing *.wim image and mount it locally.
• Use the dism command to add a package or collection of packages.
• Unmount the *.wim image, committing the changes.
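
In dism.exe terms, that loop looks something like this (a minimal sketch; the paths are placeholders):

dism /Mount-Wim /WimFile:C:\Images\install.wim /Index:1 /MountDir:C:\Mount
dism /Image:C:\Mount /Add-Package /PackagePath:C:\Updates
dism /Unmount-Wim /MountDir:C:\Mount /Commit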

Yet there is far more to the process than just this: What updates should I use? Where do I get the updates? What order should I install them in? Are there any updates from WU that I can’t use? What about delta-compressed updates? Etc…

I’ve been asking around about updates. I started off looking at http://download.wsusoffline.net/ but got a bit discouraged by the size and complexity of the WSUSOffline solution.

So I developed a script to run at the end of my LiteTouch Deploy and Capture task sequence. The script lists all of the updates installed on each machine and dumps out the links showing where each update came from:

https://onedrive.live.com/?cid=5407B03614346A99&id=5407B03614346A99%2114113

[CmdletBinding()]
param(
    [string] $Filter = "IsInstalled = 1 and Type = 'Software'"
)

$objSession = New-Object -ComObject "Microsoft.Update.Session"

foreach($update in $objSession.CreateUpdateSearcher().Search($Filter).Updates)
{
    foreach($bundledUpdate in $update.BundledUpdates)
    {
        foreach($content in $bundledUpdate.DownloadContents)
        {
            if ($content.IsDeltaCompressedContent)
            {
                write-verbose "Ignore Delta Compressed Content: $($Update.Title)"
                continue
            }
            
            if ( $content.DownloadURL.toLower().EndsWith(".exe") )
            {
                write-verbose "Ignore Exe Content: $($Update.Title)"
                #continue   # left commented out: *.exe content is flagged but still listed
            }

            [pscustomobject] @{
                ID = $update.Identity.UpdateID
                KB = $update.KBArticleIDs | %{ $_ }
                URL = $update.MoreInfoUrls | %{ $_ }
                Type = $Update.Categories | ?{ $_.Parent.CategoryID -ne "6964aab4-c5b5-43bd-a17d-ffb4346a8e1d" } | %{ $_.Name }
                Title = $update.Title
                Size = $bundledUpdate.MaxDownloadSize
                DownloadURL = $content.DownloadURL
                Auto = $update.autoSelectOnWebSites
            }
        }
    }
}

This provided me with a list of downloads for each platform type: Win10, Win7SP1, Win8.1, Win2k8, Win2012, Win10Server.

Example:

...
WU(505): 8d865f13-ec5f-4bfe-95d9-4a172680523e
    Auto:        True
    DownloadURL: http://download.windowsupdate.com/d/msdownload/update/software/secu/2014/09/windows6.1-kb3000869-x64_be731ba069a45d2c2786e7f8f5de13014aa7786e.cab
    Type:        Security Updates
    KB:          3000869
    URL:         http://support.microsoft.com/kb/3000869
    Title:       Security Update for Windows 7 for x64-based Systems (KB3000869)
...

I wrote another PowerShell script to download each update from Windows Update (filtering out “Express”, *.exe, and *.psf updates).
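
The download pass looks roughly like this (a sketch, not the exact script; it assumes the inventory above was exported with Export-Csv to updates.csv, and the target folder is a placeholder):

# Download each non-exe/psf payload with BITS, skipping files already present
Import-Csv .\updates.csv |
    Where-Object { $_.DownloadURL -notmatch '\.(exe|psf)$' } |
    ForEach-Object {
        $Target = Join-Path C:\Updates (Split-Path $_.DownloadURL -Leaf)
        if ( -not (Test-Path $Target) ) {
            Start-BitsTransfer -Source $_.DownloadURL -Destination $Target
        }
    }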

Then another PowerShell script to apply all the updates for each platform to the core OS images.

It took me a while to get the process right, and I can imagine how difficult it would be to start this process from scratch. How do you determine which updates to install? Is it a manual, hit-or-miss process?

Results

• I was a bit surprised at how slow the update process is. On my fast i7 machine with an SSD drive, the full offline update of Windows 2008 R2 took about 63 minutes.

• End results show that the offline-updated Windows Server 2008 image was missing:
   o the latest version of IE 11
   o .NET Framework 4.5
   o the Windows Malicious Software Removal Tool

Missing updates after deployment from the offline-sysprepped image:
[Screenshot: Windows Update scan results]

Missing updates after deployment from the MDT-sysprepped image:
[Screenshot: Windows Update scan results]

• There wasn’t any noticeable difference in the installation time at deployment.

• There were some noticeable differences in size; generally, MDT-sysprepped images were about 23% larger than offline images (except for the Windows Server 2012 image, where the MDT-sysprepped image was actually smaller).

Image            RTM Original   Offline Capture   LiteTouch Capture
Win2008R2Sp1     2.62 GB        3.45 GB (+32%)    4.3 GB (+64%)
Win2012R2U       3.67 GB        4.56 GB (+24%)    4.53 GB (+23%)
Win7SP1x64Eval   2.62 GB        3.53 GB (+35%)    4.38 GB (+67%)
Win7SP1x86Eval   1.97 GB        2.48 GB (+26%)    3.22 GB (+63%)
Win81Ux64Eval    3.12 GB        4.05 GB (+30%)    4.41 GB (+41%)
Win81Ux86Eval    2.32 GB        2.85 GB (+23%)    3.41 GB (+47%)

Offline vs MDT

So what are the pros and cons here for using offline images:

Offline Images (dism.exe patching):
• Pro: With the correct package manifest/collection, images can be created quickly.
• Pro: It might be possible to use these images in the Windows 10 “Upgrade” scenario (TBD).
• Pro: Deployment is easy with MDT and/or SCCM.
• Pro: Base images are smaller.
• Con: Not all updates can be serviced offline (not all updates come as *.msu/*.cab files).
• Con: You can’t add any custom applications.
• Con: Difficult to find the correct manifest of updates to apply; a manual process.

Online Update (Apply, Update, Install, Sysprep, Capture with MDT):
• Con: Images can’t be used in Windows “Upgrade” scenarios.
• Pro: Deployment is easy with MDT and/or SCCM.
• Pro: Easy to install applications and configurations into an image that’s ready for deployment.

Conclusions

So which one is better?

Well, if you already have an MDT build-and-capture task sequence set up for your master images, I wouldn’t change.

If you don’t put much into your base image, just a couple of security updates (MSU/CAB), then sticking with offline updates can work fine. But if you need to get any more complex, then it’s time to look at MDT.

The question about Windows 10 and the future remains open. If Microsoft can get such a system up and running and keep it consistent, then moving away from MDT imaging could be viable. But that’s over a year away.

-k

Imaging Factory performance

I’ve been experimenting recently with building a Hydration Imaging Factory on one of my servers. A Hydration Factory is a Windows Host that constructs Windows images for use in deployment.

Perhaps you have a simple setup in your environment using MDT LiteTouch. This could be something like a task sequence that installs Windows 7 x64, runs Windows Update, syspreps, and captures back to a *.wim file. Or perhaps you have a laundry list of applications that need to be installed in your corporate standardized image for VDI scenarios. With the correct settings in your CustomSettings.ini file, this process can be fully automated and repeatable: spin up a virtual machine, and 30 minutes later you have a new install.wim file.

A Hydration Imaging Factory will combine the automation of MDT LiteTouch with some PowerShell automation to build out a list of virtual machines.

Configuration

I’ve been spending some time trying to make my Hydration Factory system modular, and right now I can kick off a new build on my Host.
In my system:

  • All images are fully patched and have IE 11 and the KMDF
  • Some images are “Min” – No applications, just Updated/Patched
  • Some images are “Full” – Applications like Adobe Reader, Chrome, VCRT, etc.
  • I also create Hyper-V-specific versions (PersistAllDeviceInstalls)
  • I have packages for Office and SQL, but did not include them below
  • I run a dism /Cleanup-Image command just before sysprep to trim the images

    Results

    Given my test host (a simple single-processor, multi-core i7 desktop with 32 GB of RAM and multiple SSD drives), it took about 7 hours to build out the following virtual machines.


    4,080,851,813 WIN10STPX64.SRV.Full.HV.WIM
    4,080,982,651 WIN10STPX64.SRV.Full.WIM
    3,954,997,709 WIN10STPX64.SRV.Min.Core.HV.WIM
    3,955,009,907 WIN10STPX64.SRV.Min.Core.WIM
    3,955,336,881 WIN10STPX64.SRV.Min.HV.WIM
    3,955,175,443 WIN10STPX64.SRV.Min.WIM
    3,882,925,692 WIN10TPX64.ENT.Full.HV.WIM
    3,882,213,922 WIN10TPX64.ENT.Full.WIM
    3,754,245,946 WIN10TPX64.ENT.Min.HV.WIM
    3,754,582,199 WIN10TPX64.ENT.Min.WIM
    2,989,545,883 WIN10TPX86.ENT.Full.HV.WIM
    2,992,590,857 WIN10TPX86.ENT.Full.WIM
    2,921,467,219 WIN10TPX86.ENT.Min.HV.WIM
    2,921,762,549 WIN10TPX86.ENT.Min.WIM
    5,775,824,112 WIN2008R2SP1.Full.HV.WIM
    5,775,798,368 WIN2008R2SP1.Full.WIM
    4,618,522,652 WIN2008R2SP1.Min.HV.WIM
    4,618,521,668 WIN2008R2SP1.Min.WIM
    4,921,167,148 WIN2012R2U.Full.HV.WIM
    4,921,555,872 WIN2012R2U.Full.WIM
    4,513,623,325 WIN2012R2U.Min.Core.HV.WIM
    4,554,749,492 WIN2012R2U.Min.Core.WIM
    4,451,558,474 WIN2012R2U.Min.HV.WIM
    4,459,989,734 WIN2012R2U.Min.WIM
    5,955,716,470 WIN7SP1X64EVAL.Full.HV.WIM
    5,751,198,710 WIN7SP1X64EVAL.Full.WIM
    4,761,940,473 WIN7SP1X64EVAL.Min.HV.WIM
    4,776,248,329 WIN7SP1X64EVAL.Min.WIM
    4,223,192,736 WIN7SP1X86EVAL.Full.HV.WIM
    4,179,078,039 WIN7SP1X86EVAL.Full.WIM
    3,440,522,203 WIN7SP1X86EVAL.Min.HV.WIM
    3,440,523,165 WIN7SP1X86EVAL.Min.WIM
    5,443,684,448 WIN81UX64EVAL.Full.HV.WIM
    5,442,443,606 WIN81UX64EVAL.Full.WIM
    4,723,143,084 WIN81UX64EVAL.Min.HV.WIM
    4,722,734,367 WIN81UX64EVAL.Min.WIM
    4,278,679,489 WIN81UX86EVAL.Full.HV.WIM
    4,277,099,032 WIN81UX86EVAL.Full.WIM
    3,648,938,088 WIN81UX86EVAL.Min.HV.WIM
    3,650,404,303 WIN81UX86EVAL.Min.WIM
    40 File(s) 172,408,546,058 bytes

    Post Processing

    I have scripts to merge similar install wims together to save space. This is similar to what Microsoft does with the Windows release DVDs, putting multiple SKUs in the same *.wim file.
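
    The merge itself can be a dism /Export-Image loop; here is a minimal sketch (file names are placeholders, and the WIM format single-instances duplicate files automatically on append):

    $Variants = 'WIN81UX64EVAL.Min.WIM','WIN81UX64EVAL.Full.WIM','WIN81UX64EVAL.Min.HV.WIM'
    foreach ($Wim in $Variants)
    {
        # Append each variant into one destination wim, named after its source
        $Name = $Wim -replace '\.WIM$'
        dism /Export-Image /SourceImageFile:$Wim /SourceIndex:1 /DestinationImageFile:WIN81UX64EVAL.wim "/DestinationName:$Name"
    }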


    4,298,516,365 WIN10STPX64.SRV.wim
    4,053,529,276 WIN10TPX64.ENT.wim
    3,138,798,185 WIN10TPX86.ENT.wim
    6,750,236,116 WIN2008R2SP1.wim
    5,292,634,524 WIN2012R2U.wim
    6,785,580,325 WIN7SP1X64EVAL.wim
    5,080,754,294 WIN7SP1X86EVAL.wim
    6,067,072,626 WIN81UX64EVAL.wim
    4,684,309,077 WIN81UX86EVAL.wim
    9 File(s) 46,151,430,788 bytes

    Additionally, I tried out Johan’s Beyond Zip method to shrink files down even more…
    http://www.deploymentresearch.com/Research/tabid/62/EntryId/148/Beyond-Zip-How-to-store-183-GB-of-VMs-in-a-19-GB-file-using-PowerShell.aspx


    23,190,306,816 CapturePackage.vhdx

    From 160 GB down to 21.6 GB, a savings of about 87%. Wow!

    Finally, I have other scripts to convert the *.wim images to *.vhdx files for easy import into Hyper-V or Azure. See my last post on PersistAllDeviceInstalls.

    Uploading

    As a service, I’ve been thinking of uploading my updated/patched images for these Operating Systems (and more) to a public internet file sharing site like my OneDrive for Business account, rebuilding everything from scratch every Patch Tuesday. OneDrive for Business offers 1 TB of space, so I could share the images; how cool would that be?

    The first glitch is that OneDrive for Business still has a 2 GB file limitation, so that would require splitting the files into 2047 MB chunks and reassembling them later.
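
    For the *.wim files themselves, one option is dism’s native split support (*.swm files); a sketch, with a placeholder file name:

    dism /Split-Image /ImageFile:WIN2008R2SP1.wim /SWMFile:WIN2008R2SP1.swm /FileSize:2047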

    However, my biggest problem right now is my ISP connection. Today I was averaging about 11.14 Mbps upload speed to OneDrive; uploading 42 GB of wim files to OneDrive for Business would take more than 8 hours, which is more time than it took to build the images in the first place. That, combined with my ISP’s data caps, makes sharing this from my current office cost-prohibitive.

    Customers

    Let me know if you are interested in setting up your own imaging factory environment. I’ve already done this for a large video chipset manufacturer, and I can customize it for your needs.

    KeithGa@DeploymentLive.com

    Use PowerShell to get MSI Properties

    I read a great post today by Nickolaj about reading MSI packages in PowerShell:

    I had developed some scripts a while back for reading MSI packages in PowerShell, but the scripts were … well … ugly, relying on an external COM XML type library.

    Nickolaj’s script got me thinking about how to improve my version, so I updated it and will post it here:

    The main difference between my version and Nickolaj’s is that this script returns all properties, rather than only those requested by the caller. This helps with discovering which properties are available.

    param(
        [parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true)]
        [ValidateScript({Test-Path $_})]
        [IO.FileInfo]$Path
    )

    # Open the MSI database read-only (mode 0) via the WindowsInstaller COM object
    $WindowsInstaller = New-Object -com WindowsInstaller.Installer
    $MSIDatabase = $WindowsInstaller.GetType().InvokeMember(
        "OpenDatabase", "InvokeMethod", $Null,
        $WindowsInstaller, @($Path.FullName, 0))

    # Query the Property table
    $View = $MSIDatabase.GetType().InvokeMember(
        "OpenView", "InvokeMethod", $null,
        $MSIDatabase, "SELECT * FROM Property")

    $View.GetType().InvokeMember(
        "Execute", "InvokeMethod", $null, $View, $null)

    # Emit each row as a Name = Value hashtable
    while ($Record = $View.GetType().InvokeMember(
        "Fetch", "InvokeMethod", $null, $View, $null))
    {
        @{ $Record.GetType().InvokeMember(
               "StringData", "GetProperty", $null, $Record, 1) =
           $Record.GetType().InvokeMember(
               "StringData", "GetProperty", $null, $Record, 2) }
    }

    $View.GetType().InvokeMember("Close", "InvokeMethod", $null, $View, $null)
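
    A quick usage example (Get-MsiProperties.ps1 is a hypothetical name for wherever you saved the script; the MSI is the one from the MMS zip above):

    .\Get-MsiProperties.ps1 -Path .\7z925x64.msi | Where-Object { $_.ContainsKey('ProductCode') }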

    Examples

    Now, why would you want to look at the properties of an MSI package?

    Well, say you installed an application package into your Windows install.wim image, but you may also want to try re-installing the same application package at deploy time. How do you ensure that you don’t re-run the installation for a package that is already installed?

    Well, ZTIApplications in MDT can leverage the MSI Product Code. It will check the registry for the MSI Product Code, and skip over the installation if present.

    Just ensure that the product code is set in the application package as the uninstall key name:

    See:
    http://blogs.technet.com/b/mniehaus/archive/2008/09/25/dynamic-application-detection-with-lite-touch.aspx

    Source

    https://onedrive.live.com/?cid=5407B03614346A99&id=5407B03614346A99%2110698

    Cheers!

    Checking for tablets in MDT – Part 2

    @david_obrien sent me a tweet today about my tablet detection algorithm.

    Turns out that according to my detection algorithm, Dell thinks the XPS is a Laptop, not a Tablet!!!


    Why is that?

    As far as I can tell, Win32_ComputerSystem->PCSystemTypeEx most likely gets its values from the POWER_PLATFORM_ROLE enumeration type. This value comes from the firmware (BIOS) and is filled out by the OEM. However, it’s not really about touch; this field is about power profiles and how the OEM expects the machine to be used. Is the machine a tablet, or a laptop? For the purposes of power management, Dell felt the XPS was more like a laptop.

    So that still leaves us with the question: how do we detect if a machine is a tablet? Perhaps a better way to ask is whether the machine supports “Touch”.

    That will require a slightly different detection method, and a call to a Win32 API.

    Here is an example PowerShell script that should detect whether a machine has multi-point touch:

    Add-Type @"
    using System.Runtime.InteropServices;
    namespace WinAPI
    {	
        public class User32 { 
        [DllImport("user32.dll")] public static extern int GetSystemMetrics(int nIndex); }
    }
    "@
    
    # SM_DIGITIZER = 86; note the parentheses: -eq binds tighter than -band in PowerShell
    if ( ([WinAPI.User32]::GetSystemMetrics(86) -band 0x41) -eq 0x41 )
    {
        "Tablet with Multi-Input and Integrated-Touch"
    }
    else
    {
        "Not Tablet"
    }
    

    0x41 is a combination of:
    NID_INTEGRATED_TOUCH (0x01): The device has an integrated touch digitizer.
    NID_MULTI_INPUT (0x40): The device supports multiple sources of digitizer input.

    Dell Driver Pack Catalog

    Dell has recently re-released their Driver Pack Catalog, and while looking through it I saw an excellent script by Dustin Hedges that shows how to parse the XML catalog and download some packages. I wanted to enhance the experience for MDT LiteTouch, so I wrote some scripts.

    Install-DellWinPEDriverCatalog.ps1

    The first script is designed for downloading and installing the correct Dell WinPE driver pack to your MDT Litetouch Environment.

    • The script will automatically determine your local MDT LiteTouch Deployment share and connect to it. If there is more than one share in your MDT Deployment Console, then the script will prompt you for which one to use.
    • The script will automatically download, extract and parse the Dell Driver Pack Catalog.
    • The script will automatically search your local host for the WAIK or ADK, and download the corresponding Dell WinPE Driver Pack.
    • The Script will automatically extract the Dell WinPE DriverPack and import the drivers into the MDT Deployment Share identified above.
    • Finally, the script automatically creates a Selection Profile for the imported drivers, and configures your Deployment Share to build WinPE images with these drivers.

    For most IT Pros this is a silent and automated process. And if you are just starting out with WinPE Drivers, this is a great way to setup your environment.

    If you would like to see this script in action, I have recorded a video that shows how it works:

    Download: Install-DellWinPEDriverCatalog.ps1

    Install-DellDriverCatalog.ps1

    The second script is designed for downloading and installing the Dell driver pack for specific makes and models to your MDT LiteTouch environment.

    • The script will automatically determine your local MDT LiteTouch Deployment share and connect to it. If there is more than one share in your MDT Deployment Console, then the script will prompt you for which one to use.
    • The script will automatically download, extract and parse the Dell Driver Pack Catalog.
    • The script will display a list of Driver Packs available from Dell (using out-gridview), allowing you to select which packs to install.
    • The script will automatically extract each selected Dell DriverPack and import the drivers into the MDT Deployment Share identified above.
    • Finally, the script imports each driver pack into a %Make%\%Model% folder hierarchy, allowing for the Total Control method of driver management (thanks Johan).

    Here is an example of what the package list looks like while running the script:
    [Screenshot: the Dell driver pack list displayed in Out-GridView]

    Read the notes at the top of the script to ensure you are following the Total Control method of driver management. The Selection Profile must be set to “Nothing”, and you should have a step in your task sequence that contains one of the following (choose the correct one):

    DriverGroup001=Windows 7 x86\%Make%\%Model%
    DriverGroup001=Windows 7 x64\%Make%\%Model%
    DriverGroup001=Windows 8.1 x86\%Make%\%Model%
    DriverGroup001=Windows 8.1 x64\%Make%\%Model%
    

    For most IT Pros this is a quick way to import drivers based on Make\Model into your MDT LiteTouch environment: just select the packages you want from the import page and GO!

    If you would like to see this script in action, I have recorded a video that shows how it works:

    Download: Install-DellDriverCatalog.ps1

    Notes

    • Sadly, I don’t have a lot of Dell machines in my home lab (only one), so I haven’t done nearly enough testing across makes and models; if you have any problems, send me a bdd.log and I’ll take a look.
    • I removed some OS types from the script:
      • XP is no longer supported
      • Windows 8.0 is superseded by Windows 8.1 (free upgrade)
      • Vista is well… yuck.
    • The Dell driver catalog doesn’t have a 1:1 mapping from each platform to the Make/Model values stored in the firmware/BIOS (example: R3PXF J5VX9 VH2G7), so the script only displays driver packs whose values I felt confident could be reconstructed from Win32_ComputerSystem. No tablets.
    • It should be possible to modify the script to also import drivers directly into SCCM, if I get enough feedback, I may do that as well.

    Update: 9/15/2016

    I have updated the code to handle instances where a driver pack contains both x86 and x64 drivers.

    Links have been updated:

    https://1drv.ms/u/s!AplqNBQ2sAdUiOpG_sqeDIQpoX4PiQ

    https://1drv.ms/u/s!AplqNBQ2sAdUip4dnfwMUGpRW0QJ9A

    When DFS goes wrong…

    (Also posted in this month’s TrueSec.com newsletter!)
    A while back I consulted with the IT department at a large computer hardware company (well known in the computer graphics business). Josh, their MDT administrator, recently asked for debugging help with MDT and DFS shares.

    System Center Configuration Manager has an excellent system for replicating content out to remote distribution points. MDT LiteTouch, however, has nothing built in; instead you can use DFS-R to do the replication out to remote DFS leaf nodes. Many years ago I worked on a deployment project that relied on DFS, and it was a nightmare. The DFS links were complex and set up by hand (not DFS-R), not all sites received the same content (by design), and sometimes sites got content very slowly, like a week later. I spent a lot of time trying to debug DFS replication issues.

    From the client’s point of view, it’s hard to tell where the DFS share actually points. When DFS is working this is great; end users don’t need to know where their requests are actually going. But if you want to check, one way is to use Windows Explorer: select a folder on a DFS share, right-click and select “Properties”, and there should be a “DFS” tab showing each DFS leaf and which DFS server is active.

    [Screenshot: the DFS tab in Windows Explorer]

    My friend Josh asked about adding a script to dump the DFS status for each client into the bdd.log during LiteTouch deployment. If there was an error in the deployment, he could use this information to track down the real DFS leaf node server and verify that the content was replicated correctly. I had developed a tool a while back that helped dump DFS information on the client side, but it was written in C/C++, and I thought this was a great opportunity to experiment with P/Invoke and calling .NET classes from PowerShell.

    I knew from my previous coding experience that the trick to getting DFS info on the client is the Win32 API call NetDfsGetClientInfo(). This API call can’t be made directly from PowerShell; instead we need some marshaling in order to make the call. I went to the excellent http://PInvoke.net site to build a C#/.NET wrapper that makes the call and returns native .NET Framework structures. I could then use PowerShell to compile the C# code and make the call into .NET. PowerShell does a great job of parsing the returned structures, making the results appear to be native PowerShell objects.
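
    The shape of the technique looks roughly like this; this is a hedged sketch rather than the exact script from the post. The structure layouts follow the documented DFS_INFO_3 and DFS_STORAGE_INFO definitions, and the share path is a placeholder:

    Add-Type @"
    using System;
    using System.Runtime.InteropServices;
    public class DfsClient
    {
        [StructLayout(LayoutKind.Sequential, CharSet=CharSet.Unicode)]
        public struct DFS_STORAGE_INFO
        {
            public int State;
            [MarshalAs(UnmanagedType.LPWStr)] public string ServerName;
            [MarshalAs(UnmanagedType.LPWStr)] public string ShareName;
        }
        [StructLayout(LayoutKind.Sequential, CharSet=CharSet.Unicode)]
        public struct DFS_INFO_3
        {
            [MarshalAs(UnmanagedType.LPWStr)] public string EntryPath;
            [MarshalAs(UnmanagedType.LPWStr)] public string Comment;
            public int State;
            public int NumberOfStorages;
            public IntPtr Storage;    // pointer to an array of DFS_STORAGE_INFO
        }
        [DllImport("netapi32.dll", CharSet=CharSet.Unicode)]
        public static extern int NetDfsGetClientInfo(
            string DfsEntryPath, string ServerName, string ShareName,
            int Level, out IntPtr Buffer);
        [DllImport("netapi32.dll")]
        public static extern int NetApiBufferFree(IntPtr Buffer);
    }
    "@

    $Buffer = [IntPtr]::Zero
    if ( [DfsClient]::NetDfsGetClientInfo('\\Corp\Share\Folder', $null, $null, 3, [ref]$Buffer) -eq 0 )
    {
        $Info = [Runtime.InteropServices.Marshal]::PtrToStructure($Buffer, [DfsClient+DFS_INFO_3])
        $Size = [Runtime.InteropServices.Marshal]::SizeOf([DfsClient+DFS_STORAGE_INFO])
        for ($i = 0; $i -lt $Info.NumberOfStorages; $i++)
        {
            # Each entry lists a leaf server; the DFS_STORAGE_STATE_ACTIVE bit (0x4)
            # in State marks the server this client is actually using.
            $Ptr = [IntPtr]($Info.Storage.ToInt64() + ($i * $Size))
            [Runtime.InteropServices.Marshal]::PtrToStructure($Ptr, [DfsClient+DFS_STORAGE_INFO])
        }
        [void][DfsClient]::NetApiBufferFree($Buffer)
    }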


    With the PowerShell script done, it’s just a matter of adding it to our task sequence. If you have added PowerShell to WinPE, you can run this during the “pre-install” phase of the Task Sequence, but in this case I put the script at the start of the “State Restore” phase.


    When done, I could see the results of the script in my BDD.log file, including which DFS leaf node was active.

    Code attached: get-DFSLinkStatus.ps1

    Hope you find this helpful.

    Copy-ItemWithProgress

    A good friend of mine, Tim, once commented, half jokingly, that his job was to watch progress bars. Ha, so true! :^) Most of the work I do involves setting up operating systems and the infrastructure behind them. This means copying a *LOT* of files back and forth: *.iso, *.wim, *.vhd, *.vhdx, drivers, scripts, whatever. Lately, with most of the control work being done by PowerShell scripts, and with more of my scripts being wrapped by PowerShell host programs, it was time to reconsider how large file copies are handled in my PowerShell scripts.

    PowerShell Copy

    PowerShell has built-in support for file copy: the Copy-Item cmdlet with the -Recurse switch works quite well. However, for larger files I would instead recommend RoboCopy.exe, which is designed to handle large copies in a structured and *robust* manner.

    (I won’t go into the details of why RoboCopy is better than copy-item in PowerShell here in this post. That part is assumed. :^)

    Of course, the problem with RoboCopy in PowerShell is that the RoboCopy command was not designed to interface directly with the shell environment; it just blindly writes data to the stdout console. Copy-Item isn’t much better at progress either: although it will copy items, and display the items being copied if the -Verbose switch is specified, it still does not display PowerShell progress.

    I did some work in MDT to help design capture techniques for console programs. We modified some of the scripts to capture output from imagex.exe, DISM.exe, wdsmcast.exe, and USMT, and display the progress natively in the SCCM Task Sequencer engine. It involved reading the output and looking for the correct search strings.


    RoboCopy

    I got inspired this week to re-evaluate robocopy progress after reading a post by Trevor Sullivan, a fellow Microsoft MVP: http://stackoverflow.com/questions/13883404

    Wow! The post and associated video are very good. However, there were a few things I wanted to enhance before incorporating it into my own processes:

    • The script only supports /mir copy, with no other parameters. Sometimes I need to call with other switches like /xd and /xf. (robocopy has an amazing collection of options available).
    • I thought it could be sped up by launching the 2nd robocopy command immediately after the 1st, rather than waiting for the 1st robocopy to complete.
    • Finally, I also wanted to view not only the progress of the files being copied, but the sub-progress of any large file. This was very important; most of my projects involve copying individual files that are often over 1 GB in size.

    Script

    <#
    .SYNOPSIS
    RoboCopy with PowerShell progress.
    
    .DESCRIPTION
    Performs file copy with RoboCopy. Output from RoboCopy is captured,
    parsed, and returned as Powershell native status and progress.
    
    .PARAMETER RobocopyArgs
    List of arguments passed directly to Robocopy.
    Must not conflict with defaults: /ndl /TEE /Bytes /NC /nfl /Log
    
    .OUTPUTS
    Returns an object with the status of final copy.
    REMINDER: Any error level below 8 can be considered a success by RoboCopy.
    
    .EXAMPLE
    C:\PS> .\Copy-ItemWithProgress c:\Src d:\Dest
    
    Copy the contents of the c:\Src directory to a directory d:\Dest
    Without the /e or /mir switch, only files from the root of c:\src are copied.
    
    .EXAMPLE
    C:\PS> .\Copy-ItemWithProgress '"c:\Src Files"' d:\Dest /mir /xf *.log -Verbose
    
    Copy the contents of the 'c:\Name with Space' directory to a directory d:\Dest
    /mir and /XF parameters are passed to robocopy, and script is run verbose
    
    .LINK
    https://keithga.wordpress.com/2014/06/23/copy-itemwithprogress
    
    .NOTES
    By Keith S. Garner (KeithGa@KeithGa.com) - 6/23/2014
    With inspiration by Trevor Sullivan @pcgeek86
    
    #>
    
    [CmdletBinding()]
    param(
    	[Parameter(Mandatory = $true,ValueFromRemainingArguments=$true)] 
    	[string[]] $RobocopyArgs
    )
    
    $ScanLog  = [IO.Path]::GetTempFileName()
    $RoboLog  = [IO.Path]::GetTempFileName()
    $ScanArgs = $RobocopyArgs + "/ndl /TEE /bytes /Log:$ScanLog /nfl /L".Split(" ")
    $RoboArgs = $RobocopyArgs + "/ndl /TEE /bytes /Log:$RoboLog /NC".Split(" ")
    
    # Launch Robocopy Processes
    write-verbose ("Robocopy Scan:`n" + ($ScanArgs -join " "))
    write-verbose ("Robocopy Full:`n" + ($RoboArgs -join " "))
    $ScanRun = start-process robocopy -PassThru -WindowStyle Hidden -ArgumentList $ScanArgs
    $RoboRun = start-process robocopy -PassThru -WindowStyle Hidden -ArgumentList $RoboArgs
    
    # Parse Robocopy "Scan" pass
    $ScanRun.WaitForExit()
    $LogData = get-content $ScanLog
    if ($ScanRun.ExitCode -ge 8)
    {
    	$LogData|out-string|Write-Error
    	throw "Robocopy $($ScanRun.ExitCode)"
    }
    $FileSize = [regex]::Match($LogData[-4],".+:\s+(\d+)\s+(\d+)").Groups[2].Value
    write-verbose ("Robocopy Bytes: $FileSize `n" +($LogData -join "`n"))
    
    # Monitor Full RoboCopy
    while (!$RoboRun.HasExited)
    {
    	$LogData = get-content $RoboLog
    	$Files = $LogData -match "^\s*(\d+)\s+(\S+)"
        if ($Files -ne $Null )
        {
    	    $copied = ($Files[0..($Files.Length-2)] | %{$_.Split("`t")[-2]} | Measure -sum).Sum
    	    if ($LogData[-1] -match "(100|\d?\d\.\d)\%")
    	    {
    		    write-progress Copy -ParentID $RoboRun.ID -percentComplete $LogData[-1].Trim("% `t") $LogData[-1]
    		    $Copied += $Files[-1].Split("`t")[-2] /100 * ($LogData[-1].Trim("% `t"))
    	    }
    	    else
    	    {
    		    write-progress Copy -ParentID $RoboRun.ID -Complete
    	    }
    	    write-progress ROBOCOPY -ID $RoboRun.ID -PercentComplete ($Copied/$FileSize*100) $Files[-1].Split("`t")[-1]
        }
    }
    
    # Parse full RoboCopy pass results, and cleanup
    (get-content $RoboLog)[-11..-2] | out-string | Write-Verbose
    [PSCustomObject]@{ ExitCode = $RoboRun.ExitCode }
    remove-item $RoboLog, $ScanLog
    
    

    Code Walkthrough

    Parameters

    [CmdletBinding()]
    param(
    	[Parameter(Mandatory = $true,ValueFromRemainingArguments=$true)] 
    	[string[]] $RobocopyArgs
    )
    

    The script only has one parameter, and that’s the set of arguments passed directly to RoboCopy. I also added the ValueFromRemainingArguments flag so that all arguments passed on the command line (other than CommonParameters) are gathered into $RobocopyArgs.

    Hidden

    $ScanRun = start-process robocopy -PassThru -WindowStyle Hidden -ArgumentList $ScanArgs
    

    When calling Robocopy.exe, I launch the process with the Hidden WindowStyle. This means the RoboCopy window is not displayed during execution. This is important to me, since I may wish to call RoboCopy from within a non-console PowerShell host.

    Error Handling

    if ($ScanRun.ExitCode -ge 8)
    {
    	$LogData|out-string|Write-Error
    	throw "Robocopy $($ScanRun.ExitCode)"
    }
    

    I did add one check to the script for obvious errors, such as calling RoboCopy with an incorrect number of parameters (say, only one). This detects the error from the 1st RoboCopy pass and halts the script, with the errors passed to the calling program.

    Parsing Logs

    $LogData = get-content $RoboLog
    $Files = $LogData -match "^\s*(\d+)\s+(\S+)"
    if ($Files -ne $Null)
    {
        $copied = ($Files[0..($Files.Length-2)] | %{$_.Split("`t")[-2]} | Measure -sum).Sum
        if ($LogData[-1] -match "(100|\d?\d\.\d)\%")
        {
            write-progress Copy -ParentID $RoboRun.ID -percentComplete $LogData[-1].Trim("% `t") $LogData[-1]
            $Copied += $Files[-1].Split("`t")[-2] /100 * ($LogData[-1].Trim("% `t"))
        }
        else
        {
            write-progress Copy -ParentID $RoboRun.ID -Complete
        }
        write-progress ROBOCOPY -ID $RoboRun.ID -PercentComplete ($Copied/$FileSize*100) $Files[-1].Split("`t")[-1]
    }
    

    I’ll admit there is some ungraceful code here when parsing the main RoboCopy log file. We are looking for two things: A) the list of files copied and their sizes, and B) the percentage of the current file being copied.

    When parsing the sub-status of large files, I only look for files large enough to merit displaying a tenth of a percent, rather than a whole percent (42.4% would be parsed; 42% would not).
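
    You can check the percentage regex from the script against sample log lines to see which forms match:

    ' 42.4%' -match "(100|\d?\d\.\d)\%"   # True  - tenths digit present, parsed
    ' 42%'   -match "(100|\d?\d\.\d)\%"   # False - whole percents are skipped
    '100%'   -match "(100|\d?\d\.\d)\%"   # True  - the completion line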

    Testing and Feedback

    I did some testing within a console, and within a PowerShell host (in this case the PowerShell ISE).

    [Screenshot: progress within the PowerShell ISE host]

    [Screenshot: progress within the console]

    I have only done some limited testing on my Windows 8.1 dev box; please let me know if you have any feedback.

    Files

    Copy-ItemWithProgress.ps1

    Image Factory Automation

    One area where MDT LiteTouch excels is image creation. I know of several groups within Microsoft (including Microsoft Consulting) who recommend using MDT LiteTouch to create images, even if those images are eventually used within SCCM OSD (Operating System Deployment).

    Background

    Back in the Windows XP days, the OS came with its own proprietary installation system. Say you spent some time getting XP updated to the latest Service Pack, along with all the necessary security updates and the latest version of Office. You might want to take a snapshot (or checkpoint) of this reference disk (image) to reload on other machines. That’s where Sysprep and 3rd-party products like Ghost came into the picture.

    Starting with Windows Vista, Microsoft began distributing the OS using a new file archival format: the Windows Imaging Format (*.wim files). *.wim files are compressed archives with space to store extra metadata. The format is also smart enough to hold multiple image sets within a single *.wim file while keeping only a single instance of each duplicate file, so a single wim file can hold Windows Starter, Home Premium, and Ultimate on the same disk!

    The main difference between Ghost (*.gho) files and WIM (*.wim) files is that Ghost files store the contents of a hard disk in block format (partitions and all), whereas WIM files store content as files. That means when you apply a *.gho file to a disk of a different size, Ghost itself needs to resize the partitions to make it fit, whereas a *.wim file only holds the files and streams it knows about (boot sectors and deleted files are ignored).

    One of the coolest features of *.wim files is that Microsoft gives customers the ability to capture the contents of a drive (volume) into their own *.wim file. In fact, for some versions of Windows, you can replace the install.wim file on the install DVD with your own captured *.wim file, and the installation process continues as if it came from Microsoft.

    Imaging

    So how do you create your own image for use?

    • First off, you should setup a machine *just* the way you want it.
      Install the OS, Install Apps, Configure Settings, Add drivers if necessary.
    • Next, run Sysprep on the machine. This will prepare the machine to re-run OOBE Setup.
    • Finally, boot into WinPE and capture the image using imagex.exe or dism.exe into a *.wim file.
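
    The capture step, for example, is a single dism.exe command from WinPE (a sketch; the drive letters and paths here are placeholders):

    dism /Capture-Image /ImageFile:D:\Captures\Reference.wim /CaptureDir:C:\ /Name:"Reference Build"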

    Of course, this is a major oversimplification of a complex process. For example, adding drivers to the image depends on the scenario. If you *know* for certain that this image will only be applied to a single kind of computer system, you could perform the capture on that reference system so that the image contains all the necessary drivers, ready to go. Otherwise, if you are targeting several kinds of hardware, I would strongly recommend using Hyper-V virtual machines to create your images, since the OS won’t load any extra drivers into the image.

    Enter MDT LiteTouch

    The MDT LiteTouch Client and Server deploy task sequences were designed from the start to handle the full deployment installation process: OS installation, application installation, Sysprep, and Capture, all from the default Client and Server Task Sequence templates.

    One of the cool things to do is to make the LiteTouch process into a fully automated No-Touch process (we reserve the ZeroTouch name for SCCM with MDT extensions :^).

    Let’s start off with a Deployment Share set up specifically for image creation in our lab. To automate the process, I have created an account on the local machine that has read/write permissions on the Deployment Share but is *not* a member of the local Users group. I have also given it a random password.

    In our Bootstrap.ini file, we add four lines to the bottom:

    [Settings]
    Priority=Default
    
    [Default]
    DeployRoot=\\PICKETTV\Create$
    
    ;  # NEW LINES FOR AUTOMATION:
    SkipBDDWelcome=YES
    UserID=MDTUser
    UserDomain=PICKETTV
    UserPassword=de36c86a4340#
    

    This will allow us to skip over the MDT LiteTouch Welcome Wizard, and connect directly to the Deployment Share.

    I have created a virtual machine used to capture Windows 8.1 x64 images. I booted up the machine and found its BIOS GUID (check the bdd.log file); in this case the GUID is {29c80ff5-4dc4-4497-a035-472118542fd7}. Some people use the MAC address of the virtual machine instead.
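
    As an aside, you can also read the BIOS GUID directly with WMI rather than digging through bdd.log:

    (Get-WmiObject -Class Win32_ComputerSystemProduct).UUID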

    In our CustomSettings.ini file, in addition to the standard settings used by our regular deployments, I have added the following entries:

    [Settings]
    Priority=UUID,Default
    [...]
    [29c80ff5-4dc4-4497-a035-472118542fd7]
    TaskSequenceID=Win81Ux64
    SkipWizard=YES
    AdminPassword=P@ssw0rd
    FinishAction=shutdown
    SkipFinalSummary=YES
    DoCapture = YES
    ComputerBackupLocation = %DeployRoot%\Captures
    BackupFile = %TaskSequenceID%.wim
    MandatoryApplications001={524d3d3d-bc51-4624-b014-4777aa75a99b}
    

    First off, I have changed the [Settings] Priority to add UUID. This means the first thing processed in the CS.ini file will be the matching GUID section found in the file (if any). Within my GUID section, I reference an application bundle to install my preferred application set, define the settings to capture the machine back to the imaging server, and set everything else to full automation.

    As one last trick, I take a snapshot/checkpoint of my virtual machine so that I can roll back the machine and restart this *automated* imaging process from scratch. This is great for Patch Tuesday: just roll back and re-image. The only work on my part is to kick off the imaging and review the logs when finished.

    SCCM

    What about SCCM you ask? SCCM OSD (with MDT integration) has the same ability to install an OS, Applications, Sysprep, and Capture. Why not use that system?

    Well, if you have a fully functional SCCM OSD deployment system ready, along with all the applications pre-packaged, then yes, it may be a good idea to continue using SCCM OSD to create your images. However, if you do not have a fully functional system ready with all of your applications packaged (fully automated), I would recommend starting with LiteTouch instead.

    Running in the Administrator context in MDT LiteTouch gives us more leeway when building our images with unproven systems and components. We can see what’s being installed, see error messages on the screen, and debug in real time on the console. There is just no need to take on the overhead of SCCM for a small, contained process like image creation if you have not already automated everything in SCCM.

    Automation

    Now there is an important point to make here. If we automate as much as possible in the imaging process, typically the installation of our applications, we can rebuild our core images over and over again with little effort.

    Of course, there are some scenarios where component “X” is difficult to install in a fully automated fashion (or we don’t know how). Sometimes we can *repackage* the application using a 3rd-party tool, or we can push the installation of the application to another process, perhaps during OS deployment rather than during image creation. MDT LiteTouch also has a “Manual” step that can be added to the Task Sequence to allow an imaging team to perform non-automated steps.

    However, for the most part, my recommendation (and the recommendation from many at Microsoft) is that if you can automate the installation of applications, you should, as you can then leverage no-touch image building.

    Image Factories

    Once you have some of the basic settings defined for creating your image, the next step would be to automate the whole thing with PowerShell.

    We can use PowerShell to create user accounts, create virtual machines, assign network switches, attach our LiteTouchPE_x86.iso, and start the VMs. We can also use PowerShell to inject our No-Touch settings from above dynamically into the MDT process.
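
    Here is a minimal sketch of that loop using the Hyper-V PowerShell module (the VM names, switch name, VHD path, and ISO path are placeholders):

    $ISO = '\\PICKETTV\Create$\Boot\LiteTouchPE_x86.iso'
    foreach ($Name in 'Win81Ux64','Win2012R2U')
    {
        # Create the VM, attach the LiteTouch boot ISO, and let MDT do the rest
        New-VM -Name $Name -MemoryStartupBytes 2GB -SwitchName 'Lab' `
               -NewVHDPath "D:\VMs\$Name.vhdx" -NewVHDSizeBytes 60GB
        Set-VMDvdDrive -VMName $Name -Path $ISO
        Start-VM -Name $Name
    }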

    While working for Microsoft’s own IT department, we would create multiple images at once, each for a different use (Windows 7, Windows 8, Windows Server 2008 R2, Windows Server 2012, with *and* without Office). Why would we provide an image *without* Office? Well, there are groups within Microsoft who don’t want Office; they are developing and dogfooding the *next* version of Office. :^)

    We call this whole system an “Image Factory”. There are a lot of moving parts, but when done properly, rebuilding your image set for Patch Tuesday is no problem.