Windows 1709 In Place Upgrade Bug

Thanks to Johan Arwidmark and Dan Vega for pointing me to this bug. It took me a while to set up a scenario where I could reproduce it, but it's a good bug. In-place upgrade is an important new feature of Windows 10, and it's good to have MDT support it.

The Bug

When upgrading to Windows 10 version 1709 using the built-in MDT "Standard Client Upgrade Task Sequence", you will get an error during the OS upgrade within an MDT wizard page that shows something like:

A VBScript Runtime Error has occurred:
Error: 500 = Variable is undefined

VBScript Code:

At this point, the OS *has* been upgraded, but the Task Sequence can no longer continue.

The Background

During OS upgrades, MDT needs a way to hook into the OS installation process when it completes and continue installation tasks in the new OS.

Before in-place upgrades, MDT would perform this by adding a custom step to the unattend.xml file. Perhaps you have seen this segment of code before:

 <SynchronousCommand wcm:action="add">
  <CommandLine>wscript.exe %SystemDrive%\LTIBootstrap.vbs</CommandLine>
  <Description>Lite Touch new OS</Description>
 </SynchronousCommand>

Windows would run the LTIBootstrap.vbs script, which would call the LiteTouch.wsf script, which would find the existing task sequence environment and kick off the remaining steps within the "State Restore" phase of the task sequence.

For the Windows 10 in-place upgrade process, instead of processing the unattend.xml file, Windows Setup has its own method of calling back into the LiteTouch environment: a SetupComplete.cmd file. This SetupComplete.cmd is responsible for finding the LiteTouch script and calling it.

The Analysis

It took me a while to set up a repro scenario (my test lab is configured for new-computer scenarios with the Windows 10 eval bits, which can't be used in an in-place upgrade scenario), but I was able to reproduce the issue and capture the bdd.log file for analysis.

The challenge here is that during an in-place upgrade, I can't open a cmd.exe window for debugging using F8 or Shift-F10. Instead, I hard-coded a "start cmd.exe" line into the SetupComplete.cmd file.

What I observed is that the window being displayed was the ZTIGather.wsf progress screen. LiteTouch.wsf kicks off the ZTIGather.wsf script early in the process and shows a customized version of the LiteTouch wizard as a progress dialog. Clearly the wizard wasn't working properly, but on closer analysis, ZTIGather.wsf shouldn't have been running at all. For some reason, LiteTouch.wsf didn't recall that it was in the *middle* of a task sequence and should have gone directly back to the task sequence in progress.

MDT has two methods for storing variables. When running within the SMS stand-alone task sequencing engine, MDT LiteTouch scripts read the SMS variables through the Microsoft.SMS.TSEnvironment COM object. But if ZTIUtility.vbs can't open that variable store, it stores variables locally in the Variables.dat file.

Finally, after getting a powershell.exe window, creating the Microsoft.SMS.TSEnvironment object, and making a couple of test calls to verify the contents of the variable store, a surprise: all variable reads returned empty values *successfully* (which is bad), but writes caused an exception (which is correct). Since we were not running under the SMS stand-alone task sequencing engine, all calls should have caused an exception.

Why is the Microsoft.SMS.TSEnvironment COM object registered but not working? Well, that's still under investigation, but now we can work on a fix/workaround for MDT!

The Fix

The fix (not written by me) is to force the Microsoft.SMS.TSEnvironment COM object to unregister before calling LiteTouch.wsf. This can be done with a simple call at the start of the SetupComplete.cmd script:

:: Workaround for incorrectly-registered TS environment
reg delete HKCR\Microsoft.SMS.TSEnvironment /f

The issue has been acknowledged by Microsoft, and this fix is currently targeted for the next release of MDT.





Create a Remote Desktop Conn. Mgr. file for Azure

Every now and then, I will spin up a test environment in Azure with a couple of virtual machines. And as much as I enjoy the user interface of the portal, there have been times that I longed for the ease of use of the Remote Desktop Connection Manager.

One of the frustrations is that the *.rdp files downloaded from Azure run at full screen, which I find awkward, so the first thing I do is bring down the resolution, which is a pain when I have several machines.

So I decided to create a tool to auto-generate a Remote Desktop Connection Manager configuration file.

This also gave me the opportunity to play around with the way *.rdg stores passwords.

RDG security

First a word about security.

The Remote Desktop Connection Manager uses the Data Protection API (DPAPI) to encrypt passwords stored within the *.rdg file.


The great thing about this API is that if another user were to open this file on another machine, the password can't be read; only the same user running on the same machine can extract it.

Note that any program or script running under your user's context can read this password as plaintext. That works great for this script, but my one design change to the Remote Desktop Connection Manager would be to add a PIN or other layer of security abstraction to prevent other rogue processes or users from gaining access to the passwords stored locally on my machine.


This script will log into your Azure account, enumerate all the service groups, and get a list of all the virtual machines within these groups, creating entries within the Remote Desktop Connection Manager file for each Azure virtual machine.


<#
.SYNOPSIS
Auto Generate a RDG file for Azure.
.DESCRIPTION
Will create a Microsoft Remote Desktop Connection Manager *.RDG file
from the Virtual Machines within your Azure Tenant.
.PARAMETER Path
Location of the target *.RDG file.
The default is "My Azure Machines.rdg" placed on the desktop.
.PARAMETER Force
Will create the RDG file *even* if the file already exists (force it).
.PARAMETER Credential
An array of [PSCredential] objects to be placed in the RDG file.
.PARAMETER AzureCred
Credentials for logging into Azure.
.EXAMPLE
C:\PS> .\RDGGen.ps1
Generate the RDG file with no built-in credentials.
.EXAMPLE
C:\PS> $cred = Get-Credential
C:\PS> .\RDGGen.ps1 -Credential $cred
Generate an RDG file with credentials from the prompt.
.NOTES
Please be aware that although credentials are stored within the *.RDG file
"encrypted", any program running within the user's context can extract the
password as plain text. YMMV.
Copyright Keith Garner, All rights reserved.
Apache License
#>
param(
    [pscredential[]] $Credential,
    [string] $Path = ([Environment]::GetFolderPath("Desktop") + "\My Azure Machines.rdg" ),
    [switch] $Force,
    [pscredential] $AzureCred
)
#region Support Routines
function Get-CredentialBlob {
param( [pscredential[]] $Credential )
process {
foreach ( $cred in $Credential ) {
$PasswordBytes = [System.Text.Encoding]::Unicode.GetBytes($cred.GetNetworkCredential().password)
$SecurePassword = [Security.Cryptography.ProtectedData]::Protect($PasswordBytes, $null, [Security.Cryptography.DataProtectionScope]::LocalMachine)
$Base64Password = [System.Convert]::ToBase64String($SecurePassword)
<credentialsProfile inherit="None">
<profileName scope="Local">$($cred.UserName)</profileName>
function Get-MyAzureServices {
param ( $Services )
foreach ( $Service in $Services ) {
foreach ( $VM in Get-AzureVM -ServiceName $service.label ) {
$Port = $VM | Get-AzureEndpoint | ? Name -eq RemoteDesktop | % Port
# Connect to Azure and get the server list..
Import-Module Azure -Force -ErrorAction SilentlyContinue
$Services = Get-AzureService -ErrorAction SilentlyContinue
if ( -not $Services ) {
if ( $AzureCred ) {
Add-AzureAccount -Credential $AzureCred
else {
$Services = Get-AzureService -ErrorAction SilentlyContinue
<?xml version="1.0" encoding="utf-8"?>
<RDCMan programVersion="2.7" schemaVersion="3">
$( get-CredentialBlob $Credential )
if ( $Credential ) {
<logonCredentials inherit="None">
<profileName scope="File">$($Credential | Select-Object -First 1 | % UserName )</profileName>
<remoteDesktop inherit="None">
$( Get-MyAzureServices $Services )
<connected />
<favorites />
<recentlyUsed />
"@ | Out-File -FilePath $path -Encoding utf8 -Force:$Force
if ( test-path $path ) {
& 'C:\Program Files (x86)\Microsoft\Remote Desktop Connection Manager\RDCMan.exe' $path



Make Dismount-DiskImage work

TL;DR – Dismount-DiskImage doesn't work the same way it did in Windows Server 2012 R2; here is how to make it work in Windows 10 and Server 2016.

Dirty Boy

OK, with the release of Windows 10 1709, I've been downloading all sorts of *.ISO images from MSDN to try out the latest features, kits, and support utilities. As I develop scripts to auto-mount and extract the contents, sometimes I leave the ISO images mounted, so I need to clear everything off before I begin a new test run.

I developed a new PowerShell function to do all of this for me: Dismount-Everything. I had the VHDX part working, but somehow the DVD part wasn't working well.

I specifically wanted to dismount ISO images even though I might not recall the path where they came from, although the drive letter should be easily visible.


I went to a couple of blog sites to find out how to dismount ISO images, and got some hits.

But on the Super User site, the author suggests that you should be able to take the output from Get-Volume and pipe it into Get-DiskImage, but that was not working for me.

Get-Volume [Drive Letter] | Get-DiskImage | Dismount-DiskImage

No matter what, Get-DiskImage didn’t like the volume Path.

Get-Volume would output:

\\?\Volume{1f8dfd40-b7ae-11e7-bf06-9c2a70836dd4}\

but Get-DiskImage would expect:

\\?\Volume{1f8dfd40-b7ae-11e7-bf06-9c2a70836dd4}
Well, Windows NT will create associations between long volume GUID paths like \\?\Volume{...} and something more readable like \\.\CDROM1, so I assumed there was a mapping there.

Well after testing I found out that this command didn’t work:

get-diskimage -devicepath \\?\Volume{1f8dfd40-b7ae-11e7-bf06-9c2a70836dd4}\

But this command did:

get-diskimage -devicepath \\?\Volume{1f8dfd40-b7ae-11e7-bf06-9c2a70836dd4}

Turns out that I just needed to strip out the trailing \.



Get-Volume |
  Where-Object DriveType -eq 'CD-ROM' |
  ForEach-Object {
    Get-DiskImage -DevicePath $_.Path.TrimEnd('\') -ErrorAction SilentlyContinue
  } |
  Dismount-DiskImage

Notes on Microsoft ADV170012 – TPM Madness.

Hidden within the latest Microsoft Security Advisory is a whopper: ADV170012

The summary is that some of the Infineon TPM chip implementations have a bug, and it appears that someone has produced a proof-of-concept exploit. Wow.

Microsoft and Infineon have arguably done the right thing here: they have announced the issue, produced a hotfix to help customers better identify the issue, and developed tools to update affected chips through firmware.

What's not clear to me is exactly what the issue is and what the hotfix does. Unfortunately, it may be a while before Microsoft releases more information, while they give companies a head start on applying the hotfixes.

A link in the article above suggests that the exploit is not easy:

A successful attack depends on conditions beyond the attacker’s control. That is, a successful attack cannot be accomplished at will, but requires the attacker to invest in some measurable amount of effort in preparation or execution against the vulnerable component before a successful attack can be expected.

Which leads me to believe that any exploit is hard, requiring a highly skilled attacker, not someone who is going to steal my laptop from the local Starbucks in hopes of getting my credit card number saved somewhere on the machine.

Stay tuned…


In the meantime, I decided to rewrite the PowerShell script in the article above. The latest version works great when issuing commands remotely and collecting the data in a centralized location.

For example I could run the command:

icm { iwr '' | iex } -Cred $cred -Computer Pickett1,Pickett2,Pickett3,Pickett4 | ft *

And see the output:


Lucky for me, I have four machines to test against the bad TPM module.

  • One machine is my work machine (Version 6.41)
  • Two machines don't have bad Infineon version numbers (Version 3.19), but may need to be cleared anyway. Easy to do.
  • One machine has the bad Infineon version (Version 4.32), but the TPM Module is on a replacement Riser card, and I can purchase a new one for $50.

Now to figure out how to address this at work.
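The vulnerable-range check behind those version numbers is just a major/minor comparison per Infineon firmware family. Here is that logic as a standalone sketch (same ranges as the script below), handy for spot-checking a ManufacturerVersion string:

```powershell
# Standalone sketch of the Infineon vulnerable-firmware range check.
function Test-VulnerableFirmware ([string] $Version) {
    # Split "major.minor" (e.g. "6.41") into integers for numeric comparison.
    $major, $minor = ($Version -split '\.')[0..1] | ForEach-Object { [int]$_ }
    switch ($major) {
        4       { return ($minor -le 33) -or ($minor -ge 40 -and $minor -le 42) }
        5       { return $minor -le 61 }
        6       { return $minor -le 42 }
        7       { return $minor -le 61 }
        133     { return $minor -le 32 }
        default { return $false }
    }
}

Test-VulnerableFirmware '6.41'   # True  (the work machine above)
Test-VulnerableFirmware '4.32'   # True  (the riser-card TPM)
Test-VulnerableFirmware '3.19'   # False (not an affected family)
```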


#Requires -Version 3
#requires -RunAsAdministrator
<#
.SYNOPSIS
TPM Infineon Riemann Check
.DESCRIPTION
Checks the status of TPM on the local machine and returns status as a PowerShell object.
Must be run at elevated permissions.
.OUTPUTS
PSCustomObject with several properties.
.EXAMPLE
C:\PS> .\Test-TPMReimann.ps1
hasTPM : True
ManufacturerId : 0x53544d20
ManufacturerVersion : 13.12
FirmwareVersionAtLastProvision :
NeedsRemediation : False
Reason : This non-Infineon TPM is not affected by the Riemann issue. 0x53544d20
.EXAMPLE
C:\PS> icm -ScriptBlock { iwr '' | iex } -ComputerName PC1,PC2
Given the URL path to this script ( to get the script, click on the raw link above ), will run the command on the machines and collect the results locally.
#>
If (-NOT ([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator")){
    Throw "Not Administrator"
}

$TPM = try { Get-Tpm } catch { $Null }
$FirmwareVersionAtLastProvision = Get-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\TPM\WMI" -Name "FirmwareVersionAtLastProvision" -ErrorAction SilentlyContinue | % FirmwareVersionAtLastProvision
#region Infineon version test routines

function Test-RiemannVersion ( [string[]] $version ) {
    # Returns True if not safe
    $major = [int]$version[0]
    $minor = [int]$version[1]
    switch ( $major ) {
        4 { return ($minor -le 33) -or ($minor -ge 40 -and $minor -le 42) }
        5 { return $minor -le 61 }
        6 { return $minor -le 42 }
        7 { return $minor -le 61 }
        133 { return $minor -le 32 }
        default { return $False }
    }
}
#region Test Logic
if ( !$TPM ) {
    $Reason = "No TPM found on this system, so the Riemann issue does not apply here."
    $NeedsRemediation = $False
} elseif ( $TPM.ManufacturerId -ne 0x49465800 ) {
    $Reason = "This non-Infineon TPM is not affected by the Riemann issue. 0x$([convert]::ToString($TPM.ManufacturerId,16))"
    $NeedsRemediation = $False
} elseif ( $TPM.ManufacturerVersion.IndexOf('.') -eq -1 ) {
    $Reason = "Could not get TPM firmware version from this TPM. $($TPM.ManufacturerVersion)"
    $NeedsRemediation = $False
} elseif ( Test-RiemannVersion ( $Tpm.ManufacturerVersion -split '\.' ) ) {
    $reason = "This Infineon firmware version TPM is not safe. $($Tpm.ManufacturerVersion)"
    $NeedsRemediation = $true
} elseif (!$FirmwareVersionAtLastProvision) {
    $Reason = "We cannot determine what the firmware version was when the TPM was last cleared. Please clear your TPM now that the firmware is safe."
    $NeedsRemediation = $true
} elseif ($FirmwareVersion -ne $FirmwareVersionAtLastProvision) {
    $Reason = "The firmware version when the TPM was last cleared was different from the current firmware version. Please clear your TPM now that the firmware is safe."
    $NeedsRemediation = $true
} else {
    $reason = 'OK'
    $NeedsRemediation = $False
}
#region Output Object
[PSCustomObject] @{
    # Basic TPM Information
    hasTPM = $TPM -ne $null
    ManufacturerId = "0x" + [convert]::ToString($TPM.ManufacturerId,16)
    ManufacturerVersion = $Tpm.ManufacturerVersion
    FWVersionAtLastProv = $FirmwareVersionAtLastProvision
    # Does the machine need Remediation for Riemann issue?
    NeedsRemediation = $NeedsRemediation
    # Reason String
    Reason = $Reason
}


Don't recall why I named it Reimann; I think I saw that as a code word in an article somewhere, and it stuck. Is it the name of the researcher who found the issue, or just an arbitrary code name?

Not sure why you need to know when the last provision time was on machines *without* the issue. Either the TPM chip works or not!?!?



ZTISelectBootDisk.wsf new with BusType

Several years ago I wrote a script to help select which disk to deploy Windows to during your MDT LiteTouch or ZeroTouch task sequence.

Well, based on a request from my latest client, I have created a similar script that supports BusType.


My client is trying to install Windows Server 2016 on a server with a SAN. When the machine boots to WinPE, one of the SAN drives appears *first* as Disk 0 (zero), and by default MDT task sequences will deploy to disk zero! My ZTISelectBootDisk.wsf already shows how to override this; all we need is a way to tell MDT which disk to choose based on the correct WMI query.

Turns out it was harder than I thought.

What we wanted was the BusType that appears in the "Type" field when you type "select disk X" and then "detail disk" in DiskPart.exe. When we ran "detail disk" in DiskPart.exe we could see the bus type: Fibre, as compared to regular disks like SCSI or SAS.

The challenge was that the regular Win32_DiskDrive WMI query wasn't returning the BusType value, and we couldn't figure out how to get that data through other queries.

I tried running some PowerShell queries like 'Get-Disk' and noticed that the output type was MSFT_Disk, from a weird WMI namespace: root\Microsoft\Windows\Storage. And adding that query to the script works! Yea!!!


What kind of BusTypes are there?

Name                 Value  Meaning
Unknown              0      The bus type is unknown.
1394                 4      IEEE 1394
Fibre Channel        6      Fibre Channel
iSCSI                9      iSCSI
SAS                  10     Serial Attached SCSI (SAS)
SATA                 11     Serial ATA (SATA)
SD                   12     Secure Digital (SD)
MMC                  13     Multimedia Card (MMC)
Virtual              14     This value is reserved for system use.
File Backed Virtual  15     File-Backed Virtual
Storage Spaces       16     Storage Spaces
NVMe                 17     NVMe

For this script we are *excluding* the following devices:

Name            Value  Meaning
Fibre Channel   6      Fibre Channel
iSCSI           9      iSCSI
Storage Spaces  16     Storage Spaces
NVMe            17     NVMe

Meaning that the *FIRST* fixed device not in this list will become the new *target* OS disk. Run this query on your machine to see which disk will become the target:

gwmi -Namespace root\Microsoft\Windows\Storage -Query 'select Number,Size,BusType,Model from MSFT_Disk where BusType <> 6 and BusType <> 9 and BusType <> 16 and BusType <> 17' | Select-Object -First 1
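The selection logic behind that query is simply: walk the disks in order and take the first one whose BusType is not in the exclusion set. A standalone sketch of the same filter, using hypothetical mock objects in place of the MSFT_Disk results:

```powershell
# Mock disk objects standing in for the MSFT_Disk WMI results (hypothetical data).
$disks = @(
    [pscustomobject]@{ Number = 0; BusType = 6;  Model = 'SAN LUN'  }   # Fibre Channel
    [pscustomobject]@{ Number = 1; BusType = 17; Model = 'NVMe SSD' }   # NVMe
    [pscustomobject]@{ Number = 2; BusType = 10; Model = 'SAS HDD'  }   # SAS
)

# Excluded bus types: Fibre Channel (6), iSCSI (9), Storage Spaces (16), NVMe (17)
$excluded = 6, 9, 16, 17

# The first remaining disk becomes the target OS disk (disk 2 in this mock data).
$target = $disks | Where-Object { $excluded -notcontains $_.BusType } | Select-Object -First 1
$target.Number
```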


A reminder that this script requires the latest MDT, and the script should be placed in the %DeploymentShare%\Scripts folder. Additionally, you should install all the storage packages for WinPE; sorry, I don't recall *which* packages I selected when I did my testing.


<job id="ZTISelectBootDisk">
<script language="VBScript" src="ZTIUtility.vbs"/>
<script language="VBScript" src="ZTIDiskUtility.vbs"/>
<script language="VBScript">
' // ***************************************************************************
' //
' // Copyright (c) Microsoft Corporation. All rights reserved.
' //
' // Microsoft Deployment Toolkit Solution Accelerator
' //
' // File: ZTISelectBootDisk.wsf
' //
' // Version: <VERSION>
' //
' // Purpose: Given a collection of Storage Devices on a machine,
' // this program will assist in finding the correct
' // device to be processed by "ZTIDiskPart.wsf"
' //
' // Currently hard coded to select the *FIRST* drive that is
' // Not iSCSI, Fibre Channel, Storage Spaces, nor NVMe.
' //
' // REQUIRES that you install the correct WinPE Storage Components!
' //
' //
' // WARNING: If there are any *other* disks that need to be Cleaned
' // and formatted, they should be processed first.
' // And this the global Variable OSDDiskIndex should be
' // set to <blank> when done being processed by ZTIDiskPart.wsf.
' //
' // Variables:
' // OSDDiskIndex [ Output ] – Disk Index
' //
' // Usage:
' // cscript.exe [//nologo] ZTISelectBootDisk.wsf [/debug:true]
' // cscript.exe [//nologo] ZTIDiskPart.wsf [/debug:true]
' // cscript.exe [//nologo] ZTISetVariable.wsf [/debug:true] /OSDDiskIndex:""
' //
' // ***************************************************************************
Option Explicit
RunNewInstance
'// Main Class
Class ZTISelectBootDisk
'// Main routine
Function Main
Dim oWMIDisk
Dim bFound
Dim oDiskPartBoot
Dim oContext, oLocator, objQuery, objStorageWMI, objStorage
oLogging.CreateEntry "—————- Initialization —————-", LogTypeInfo
IF oEnvironment.Item("DEPLOYMENTTYPE") <> "NEWCOMPUTER" Then
oLogging.ReportFailure "Not a new computer scenario, exiting Select Boot Disk.", 7700
End If
bFound = FAILURE
' 1st Pass – Find any disk that matches the Query "Select * From Win32_diskPartition %OSBootDiskOverrideWQL%"
Set oContext = CreateObject("WbemScripting.SWbemNamedValueSet")
oContext.Add "__ProviderArchitecture", 64
Set oLocator = CreateObject("Wbemscripting.SWbemLocator")
set objStorageWMI = oLocator.ConnectServer("","root\Microsoft\Windows\Storage","","",,,,oContext)
set objQuery = objStorageWMI.ExecQuery("select Number,Size,BusType,Model from MSFT_Disk where BusType <> 6 and BusType <> 9 and BusType <> 16 and BusType <> 17")
If objQuery.Count = 0 then
oLogging.CreateEntry "No Disk Drives Found!?!?! Dude, did you install the right storage drivers into WinPE 0x7b.",LogTypeError
exit function
End if
For each objStorage in objQuery
oLogging.CreateEntry "Found Device: N:" & ObjStorage.Number & " S:" & ObjStorage.Size & " M:" & ObjStorage.Model & " T:" & ObjStorage.BusType & " " , LogTypeInfo
oEnvironment.Item("OSDDiskIndex") = ObjStorage.Number
bFound = SUCCESS
exit for
Next
' 2nd pass – Use the 1st Partition larger than 15GB on the first disk with a bootable partition.
If bFound = FAILURE then
oLogging.CreateEntry "No drive was found using search parameters, Use the 1st \Windows Partition found.", LogTypeInfo
set oDiskPartBoot = GetBootDriveEx( false, oEnvironment.Item("ImageBuild"), false )
If not oDiskPartBoot is nothing then
oEnvironment.Item("OSDDiskIndex") = oDiskPartBoot.Disk
bFound = SUCCESS
End if
End if
TestAndLog bFound = SUCCESS, "Verify OSDDiskIndex was found and set: " & oEnvironment.Item("OSDDiskIndex")
Main = bFound
End Function
End class

</script>
</job>




PowerShell Switch type never $null

Tales from the code review…

How do you test for a switch type in a PowerShell script?

How do you test for the *absence* of a switch in a PowerShell Script?

Came across this recently, and decided to dig into it further.


Function Test-Switch ( [switch] $Test ) {
    # Correct use of a switch test (True case)
    if ( $test ) {
        "Do Something"
    }
    # Bad use of a switch test (False case)
    if ( $test -eq $null ) {
        "Never going to do it!"
    }
    # Better use of a switch test (False case)
    if ( -not $test ) {
        "Don't do it"
    }
}

Test-Switch
Test-Switch -Test:$False
Test-Switch -Test
Test-Switch -Test:$True

In the example above, we have a function with a single switch argument. We then test against that argument, displaying "Do Something" if it's set and "Don't do it" if not set.

Example Output:

PS C:\Users\Keith> C:\Users\Keith\Source\Example\test-switches.ps1
Don't do it
Don't do it
Do Something
Do Something

Cool!  Um… where did the "Never going to do it!" go? Well, it turns out that even when we don't specify -Test as an argument to the function, it's still a switch defined with IsPresent = $false. So testing to see whether it's equal to $null will always fail, because it's never $null.
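If you genuinely need to distinguish "the caller never typed -Test" from "the caller passed -Test:$false", $PSBoundParameters can tell you whether the switch was explicitly bound:

```powershell
function Test-SwitchPresence ( [switch] $Test ) {
    if ( $PSBoundParameters.ContainsKey('Test') ) {
        "explicit: $Test"    # caller typed -Test or -Test:$false
    }
    else {
        "omitted"            # caller never mentioned -Test at all
    }
}

Test-SwitchPresence                # omitted
Test-SwitchPresence -Test:$False   # explicit: False
Test-SwitchPresence -Test          # explicit: True
```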


Microsoft Groove RIP – Export your Playlist

OK… I'm using Groove. Don't know why I paid for the annual subscription; perhaps I had grand plans to sync my music lists to a single platform and decided to give it a chance. Oh well… Microsoft just killed it.

Anyway, I've been collecting some songs over the past couple of years, and before I forget what they are, I thought I would export the list, only to find out that Groove only supports exporting to Spotify. I don't know which music service I'm planning on moving to, but it *might* not be Spotify, so I need to figure out how to export my list now.

I tried getting a Groove Music API key, but Microsoft shut down the service. I also tried another online service, but they wanted to charge a monthly fee. I did figure out that I can download my playlist locally to my machine. The files will be DRM protected, but I can use the file names to generate a playlist. How? PowerShell to the rescue!

If you would like to create a list, open up a powershell.exe command prompt and run the following command (single line):

iwr | % Content | IEX | export-csv -NoTypeInformation -path $env:USERPROFILE\desktop\myGrooveList.csv

This command will download the PowerShell script from GitHub, execute it, and export the results to a file called MyGrooveList.csv on your desktop (or replace desktop with downloads, whatever).


Then you can open the MyGrooveList.csv file in Excel and import it later.

Here is the full script:

<#
.SYNOPSIS
Export Groove Playlist
.DESCRIPTION
Export Groove Music playlist (tested on Groove Music version 9/25/2017)
* Open Groove Music
* Click on "My Music"
* Select all tracks ( Ctrl-A )
* Click on "Download"
This should download encrypted (protected) music files locally.
You won't be able to download these tracks, but you can now get a manifest of the tracks with this script.
.PARAMETER Path
Specifies the file name.
.PARAMETER Extension
Specifies the extension. "Txt" is the default.
.EXAMPLE
C:\PS> .\Export-GroovePlaylist.ps1
Will export your Groove playlist to a formatted text file
.EXAMPLE
C:\PS> .\Export-GroovePlaylist.ps1 | export-csv -NoTypeInformation -Path $env:USERPROFILE\Desktop\myGrooveList.csv
Will export your Groove playlist to a CSV (comma separated value) file that can be opened in Excel
#>
$path = "$env:userprofile\Music\Music Cache\Subscription Cache"
Get-ChildItem -Recurse $path -File |
foreach-object {
    $Song = $_.FullName.replace("$path\",'').replace('.wma','') -split '\\'
    if ( $Song[2] -match '^[0-9][0-9] ' ) {
        [pscustomobject] @{ Artist = $Song[0]; Album = $Song[1]; Track = $song[2].Substring(0,2) -as [int]; Song = $Song[2].Substring(3) } | Write-Output
    }
    else {
        [pscustomobject] @{ Artist = $Song[0]; Album = $Song[1]; Song = $Song[2] } | Write-Output
    }
}
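To see what that parsing does, here is the same split run against a single hypothetical cached file path (the artist\album\track folder layout is the assumption):

```powershell
# Hypothetical track path, relative to the subscription cache folder.
$relative = 'Daft Punk\Discovery\03 Digital Love.wma'

# Same steps as the script: drop the extension, split on backslashes.
$song = $relative.Replace('.wma','') -split '\\'

# A leading two-digit prefix on the file name is treated as the track number.
$entry = if ( $song[2] -match '^[0-9][0-9] ' ) {
    [pscustomobject] @{ Artist = $song[0]; Album = $song[1]; Track = $song[2].Substring(0,2) -as [int]; Song = $song[2].Substring(3) }
} else {
    [pscustomobject] @{ Artist = $song[0]; Album = $song[1]; Song = $song[2] }
}
$entry   # Artist: Daft Punk, Album: Discovery, Track: 3, Song: Digital Love
```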

Download Ignite 2017 videos locally

Thanks to Michel de Rooij on the TechNet Gallery for posting this slick script where you can download TechNet content locally to your machine.

I wanted to select which videos to download, so I wrote this PowerShell script that uses Out-GridView to pick the content. It calls the script above.


You can run the command directly from PowerShell; just cut and paste this command:

iwr | % Content | Iex 


  • Will download and cache the content locally, so you can re-run the script repeatedly without having to wait for the website to be parsed.
  • Will then display all the sessions in the PowerShell Out-GridView. Out-GridView is powerful.
  • Will then ask if you want to save the list to an *.html file for online viewing later.
  • Will also ask if you want to save the offline content to a local file.

The script:

<#
.SYNOPSIS
Script to assist finding content to download from MS Ignite 2017
.DESCRIPTION
Will use the script Get-IgniteSession.ps1 from:
#>
param(
    [string] $URI = '',
    [string] $DownloadFolder = "$ENV:SystemDrive\Ignite",
    [string[]] $IgniteProperties = @( 'sessionCode', 'title', 'abstract', 'dayOfTheWeek', 'topics', 'format', 'audience', 'levels', 'personas', 'durationInMinutes', 'products', 'tags', 'speakerNames', 'speakerIds', 'slideDeck', 'onDemand', 'downloadVideoLink' )
)
Function Remove-InvalidFileNameChars {
    param( [string] $Name )
    $invalidChars = [IO.Path]::GetInvalidFileNameChars() -join ''
    $re = "[{0}]" -f [RegEx]::Escape($invalidChars)
    return ($Name -replace $re)
}
if ( -not ( test-path $DownloadFolder ) ) {
    new-item -ItemType directory -Path $DownloadFolder -ErrorAction SilentlyContinue -Force | out-null
}
#region Download and Cache session list
$IgniteCache = join-path $DownloadFolder 'IgniteSessionCache.xml'
if ( -not ( test-path $IgniteCache ) ) {
    Write-Verbose "Get meta-data on sessions"
    $LocalScript = Join-Path $DownloadFolder (split-path $URI -Leaf)
    if ( -not ( test-path $LocalScript ) ) {
        Write-Verbose "download Script $URI"
        Invoke-WebRequest -UseBasicParsing -Uri $URI -OutFile $LocalScript
    }
    $IgniteList = & $LocalScript -InfoOnly
    remove-item -Path $LocalScript -ErrorAction SilentlyContinue | out-null
    $IgniteList | Export-Clixml -Path $IgniteCache
}
else {
    $IgniteList = Import-Clixml -Path $IgniteCache
}
#region Parse the list and display in out-gridview
$MyList = $igniteList |
    # We are only interested in presentations with content.
    where { $_.SlideDeck -or $_.DownloadVideoLink -or $_.OnDemand } |
    # Don't display any repeat sessions
    where { ! $_.SessionCode.EndsWith('R') } |
    select -Property $IgniteProperties |
    Out-GridView -Title 'Select Items to download (use ctrl to select more than one)' -OutputMode Multiple

$count = $MyList | measure-object | % Count
write-host "Selected $count Items"
if ( $Count -eq 0 ) { exit }
#region Display OnDemand content
if ( [System.Windows.MessageBox]::Show("Found $Count Items`r`n`r`nOK to generate OnDemand link?",'Ignite Download','YesNo','Info') -eq 'yes' ) {
    write-host "List of OnDemand Content: "
    $MyList | ? { $_.OnDemand } | ft -Property Title,onDemand | out-string -Width 200 | write-host
    $MyList | ? { ! $_.OnDemand -and $_.slideDeck } | ft -Property Title,slideDeck | out-string -Width 200 | write-host
    "<html><head><title>OnDemand Ignite Session List</title></head><body>" > $DownloadFolder\OnDemandList.html
    $MyList | ? { $_.OnDemand } | % { "<a href=""$($_.onDemand)"">$($_.SessionCode)$($_.Title)</a></br>" } >> $DownloadFolder\OnDemandList.html
    "</br></br>" >> $DownloadFolder\OnDemandList.html
    $MyList | ? { ! $_.OnDemand -and $_.slideDeck } | % { "<a href=""https:$($_.slideDeck)"">$($_.SessionCode)$($_.Title)</a></br>" } >> $DownloadFolder\OnDemandList.html
    "</body></html>" >> $DownloadFolder\OnDemandList.html
    start $DownloadFolder\OnDemandList.html
}
#region Download content
if ( [System.Windows.MessageBox]::Show("Found $Count Items`r`n`r`nOK to Download to: $DownloadFolder",'Ignite Download','YesNo','Info') -eq 'yes' ) {
    $MyList |
    foreach-object {
        if ( $_.DownloadVideoLink ) {
            [PSCustomObject] @{ Source = $_.DownloadVideoLink; Destination = join-path $DownloadFolder ( Remove-InvalidFileNameChars "$($_.SessionCode)$($_.Title).mp4" ) }
        }
        elseif ( $_.SlideDeck ) {
            [PSCustomObject] @{ Source = [System.Web.HttpUtility]::UrlDecode( $_.Slidedeck.replace('//','') ) ;
                Destination = join-path $DownloadFolder ( Remove-InvalidFileNameChars "$($_.SessionCode)$($_.Title).pptx" ) }
        }
    } |
    where-object { -not ( test-path $_.Destination ) } |
    Start-BitsTransfer
}
#remove-item -Path $IgniteCache -ErrorAction SilentlyContinue | out-null


Update CustomSettings.ini file remotely!

Got into a discussion this week with someone about how to use PowerShell to update an MDT CustomSettings.ini file over the network. Well, a *lot* of CS.ini files… 🙂

My manager is the Global Ops Manager and now he is asking me to find a way to run [update of customsettings.ini] on about 50 servers worldwide so the other MDT admins don’t have to log onto each server just to add one line.

The example given was to update the AdminPassword in CS.ini. I hope this company is following best practices and disabling the local Administrator account and/or changing the password once machines are joined to the domain or connected to SCCM.

Anywho, INI files are a tad bit difficult to modify in PowerShell because there are no native PowerShell or .NET functions to perform the action. So instead we need to make some ugly P/Invoke calls to the appropriate Win32 API.
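The script below takes the Win32 WritePrivateProfileString route. As a point of comparison, the same section-aware update can be sketched in plain PowerShell string handling; this Update-IniValue helper is hypothetical, and the minimal sketch assumes the section and key already exist:

```powershell
# Minimal sketch: rewrite Name=Value inside one section of INI-style text.
function Update-IniValue {
    param([string[]] $Lines, [string] $Section, [string] $Name, [string] $Value)
    $inSection = $false
    foreach ($line in $Lines) {
        if ($line -match '^\s*\[(.+)\]\s*$') {
            # Track which [Section] we are currently inside.
            $inSection = ($Matches[1] -eq $Section)
        }
        elseif ($inSection -and $line -match "^\s*$([regex]::Escape($Name))\s*=") {
            $line = "$Name=$Value"
        }
        $line
    }
}

$ini = '[Settings]', 'Priority=Default', '', '[Default]', 'AdminPassword=OldPassword'
Update-IniValue -Lines $ini -Section 'Default' -Name 'AdminPassword' -Value 'P@ssw0rd'
```

That said, WritePrivateProfileString remains the safer choice for real CustomSettings.ini files, since it preserves the INI semantics Windows itself uses.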


<#
.SYNOPSIS
Update CustomSettings.ini file.
.DESCRIPTION
Updates one or more CustomSettings.ini files with a common value.
The calling powershell.exe instance must have read/write privileges to the share.
.PARAMETER DeployShares
The full path to the share. Can be input from the pipeline.
.PARAMETER Section
The section name to update.
.PARAMETER Name
The name to update.
.PARAMETER Value
The value to write.
.EXAMPLE
C:\PS> .\Update-INIFiles -DeployShares c:\DeploymentShare -Section Default -Name AdminPassword -value 'P@ssw0rd'
Set a new password in an MDT deployment share.
.EXAMPLE
C:\PS> "\\localhost\DeploymentShare$" | .\Update-INIFiles -Section Default -Name AdminPassword -value 'P@ssw0rd'
Set a new password in an MDT deployment share; get the file from the pipeline.
.EXAMPLE
C:\PS> type .\MyMDTServerList.txt | .\Update-INIFiles -Section Default -Name AdminPassword -value 'P@ssw0rd'
Set a new password in an MDT deployment share; get the list of files from a list of servers passed in through the command line.
.EXAMPLE
C:\PS> [Reflection.Assembly]::LoadWithPartialName("System.Web") | out-null
C:\PS> $NewPassword = [System.Web.Security.Membership]::GeneratePassword(10,2)
C:\PS> "The new password will be: $NewPassword"
The new password will be: F{nK:*[L}H
C:\PS> type .\MyMDTServerList.txt | .\Update-INIFiles -Section Default -Name AdminPassword -value $NewPassword
Generate a new random password with PowerShell, then update all CS.ini files from a list of servers passed in through the command line.
#>
[CmdletBinding(SupportsShouldProcess=$true)]
param (
    [parameter(Mandatory=$true, ValueFromPipeline=$true)]
    [string[]] $DeployShares,
    [string] $Section,
    [string] $Name,
    [string] $Value
)

begin {
    ## The signature of the Windows API function that writes INI settings
    $signature = @'
[DllImport("kernel32.dll", CharSet=CharSet.Unicode, SetLastError=true)]
public static extern bool WritePrivateProfileString(
    string lpAppName,
    string lpKeyName,
    string lpString,
    string lpFileName);

[DllImport("kernel32.dll")]
public static extern uint GetLastError();
'@

    ## Create a new type that lets us access the Windows API function
    $type = Add-Type -MemberDefinition $signature -Name API -Namespace Win32 -PassThru
}

process {
    foreach ( $DPShare in $DeployShares ) {
        if ($pscmdlet.ShouldProcess("$DPShare", "CustomSettings.ini write")) {
            $result = [Win32.API]::WritePrivateProfileString($Section, $Name, $Value, "$DPShare\control\customsettings.ini")
            if ( -not $result ) {
                $err = [Win32.API]::GetLastError()
                throw ( New-Object ComponentModel.Win32Exception ($err -as [int]) )
            }
        }
    }
}

New script – Import Machine Objects from Hyper-V into ConfigMgr

Quick post: I've been doing a lot of ConfigMgr OSD deployments lately, with a lot of Hyper-V test hosts.

For my test hosts, I've been creating Machine Objects in ConfigMgr by manually entering them one at a time (yuck). So I was wondering what the process is for creating Machine Objects via PowerShell.

Additionally, I was curious how to inject variables into the Machine Object that could be used later in the deployment process; in this case, a Role.

Up next, how to extract this information from VMWare <meh>.
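One small trick worth calling out from the script below: Hyper-V returns a MAC address as a bare 12-character hex string, while ConfigMgr expects colon-delimited byte pairs. The script converts between the two with a regex lookahead; a quick sketch of just that piece (the sample value here is made up, not from a real VM):

```powershell
$raw = '00155D010203'   # sample value; a real one would come from Get-VMNetworkAdapter

# '..' matches each two-character pair, '(?!$)' skips the final pair so there is
# no trailing colon, and '$&:' re-emits the matched pair followed by a colon.
$raw -replace '..(?!$)', '$&:'   # 00:15:5D:01:02:03
```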

<#
.SYNOPSIS
Generate a computer list from Hyper-V ready to import into Configuration Manager.
.DESCRIPTION
Given a Hyper-V server and a set of Hyper-V Virtual Machines, this script will
extract out the necessary information required to create the associated Machine Object in
Configuration Manager.
.PARAMETER Path
Name of the CSV file to be created.
.PARAMETER SourceComputer
Optional parameter used to pre-populate the SourceComputer field in the CSV output.
.PARAMETER Role
Optional parameter used to pre-populate the Role001 field in the CSV output.
.NOTES
If you modify this file in Excel, you should save the file in "CSV (MS-DOS) *.csv" format to ensure there are no extra double-quotes present.
.EXAMPLE
C:\PS> .\Get-VMListForCM.ps1 | ft
Get all virtual machines and display in a table.
.EXAMPLE
C:\PS> .\Get-VMListForCM.ps1 | ConvertTo-Csv -NoTypeInformation | % { $_.replace('"','') }
Find all virtual machines and convert to a CSV format without any double quotes.
.EXAMPLE
C:\PS> .\Get-VMListForCM.ps1 -Verbose -Name hyd-cli* -Role ROLE_Test1 -Path .\test.csv
Find all virtual machines that start with the name HYD-CLI, and export to .\test.csv.
#>
[CmdletBinding()]
param (
    [string[]] $Name,
    [string] $computerName,
    [pscredential] $Credential,
    [string] $path,
    [string] $SourceComputer = '',
    [string] $Role = ''
)

$GetVMProp = @{}
if ( $computerName ) { $GetVMProp.Add( 'ComputerName', $computerName ) }
if ( $Credential ) { $GetVMProp.Add( 'Credential', $Credential ) }
$VitSetData = Get-WmiObject -Namespace 'Root\virtualization\v2' -Class Msvm_VirtualSystemSettingData @GetVMProp
if ( $Name ) { $GetVMProp.Add( 'Name', $Name ) }
write-verbose "Extract data from Hyper-V"
$Results = Get-VM @GetVMProp |
    ForEach-Object {
        [PSCustomObject] @{
            ComputerName = $_.Name
            'SMBIOS GUID' = $VitSetData |
                Where-Object ConfigurationID -eq $_.VMId.Guid |
                ForEach-Object { $_.BIOSGUID.Trim('{}') }
            'MAC Address' = $_ | Get-VMNetworkAdapter | Select-Object -First 1 | ForEach-Object { $_.MacAddress -replace '..(?!$)', '$&:' }
            'Source Computer' = $SourceComputer
            'Role001' = $Role
        }
    }

$Results | Out-String -Width 200 | Write-Verbose
write-verbose "write out to CSV file"
if ( $path ) {
    $Results |
        ConvertTo-Csv -NoTypeInformation |
        ForEach-Object { $_.replace('"','') } |
        Out-File -FilePath $path -Encoding ascii

    write-verbose @"
You can now import this list into CM:

# Sample Script to import into Config Manager
import-module 'C:\Program Files (x86)\Microsoft Configuration Manager\AdminConsole\bin\ConfigurationManager.psd1'
get-psdrive | Where-Object { `$_.Provider.Name -eq 'CMSite' } | Select-Object -First 1 | ForEach-Object { Set-Location "`$(`$_.Name):" }
Import-CMComputerInformation -CollectionName "All Systems" -FileName "$Path" -VariableName Role001
"@
}
else {