Archives for the Month of November, 2007

Guitar Hero 3 … for Drums

I'm not particularly a drum fan, but this is an amazing feat of skill and technology.

Since I injured my left middle finger playing Guitar Hero 2 way too much when it came out, I've been dying to get back to playing it... so I thought, "Wouldn't it be great to be able to play Guitar Hero on the drums?" So I thought about how that might be accomplished -- researched, implemented, borrowed -- and here I outline the finished product.

Even just listening to the drum kit during the song sounds great. For more information, Egyokeo has also been kind enough to outline the full build.

PowerShell Cookbook vs PowerShell in Action

The folks at the ACoupleOfAdmins blog recently posted a book review of the PowerShell Cookbook (and were kind enough to also write an Amazon review). They bring up some excellent points. Mainly:

The Windows PowerShell Cookbook will stay on my shelf as a reference book (for the code samples), but I would look to other resources first (e.g. Windows PowerShell In Action by Bruce Payette), if you need a resource to help learn PowerShell.

I thought long and hard about the depth and breadth of the book. One theme they're picking up on is that the PowerShell Cookbook is not a language-focused book, and does not go into gritty detail about each language feature. This is intentional, as I wanted the book to have a very clear and unique value. We already have one PowerShell in Action, so there's really no need for another.

One thing the review misses is the implicit false dichotomy -- that you should own only one book on PowerShell. The PowerShell Cookbook is intended to be a reference book (for its code samples and pre-packaged solutions), while PowerShell in Action is intended to be a guided PowerShell tutorial. There is very little overlap between Bruce's book and the PowerShell Cookbook, and both provide significant value.

Since both Bruce and I wrote books with the intention to benefit the PowerShell community, it would not be in the best interest of anybody to have competing books!

3rd Parties and PowerShell Execution Policies

A question came up on the newsgroup recently about why Exchange changes PowerShell’s execution policy from “Restricted” to “RemoteSigned.” Doesn’t that lower PowerShell’s security?

The "Restricted" execution policy isn't intended to be something that PowerShell users live with forever. It's a safe default that protects non-PowerShell users from being impacted by PowerShell-based malware.

For example, many home users had never used VBScript, but still got bitten by the flurry of WSH-based viruses that got mailed to them. PowerShell's Restricted execution policy solves this. To an attacker, a computer that has never used PowerShell is the same as a computer that doesn't have PowerShell installed at all. And since attackers look to affect the largest number of computers possible, PowerShell becomes a less attractive vehicle for their attack. That, in turn, makes PowerShell users safer, too. Given all of PowerShell's security barbs, an attacker is better off using executables or other popular scripting languages as an attack vehicle.

So in light of all this, the Exchange team updates the execution policy to RemoteSigned at our recommendation. If you ship an application that builds on PowerShell, you probably fit into one of these situations:

You expect the vast majority of your users to run your scripts

If you are a vendor that ships PowerShell scripts as an integral part of your offering (not just cmdlets), you should absolutely expect that your users will want to run them. It would be a poor user experience if they had to install your product, and then change their execution policy when they try to use it. There's also the very real threat of them just trying to make the message go away and picking a needlessly lax execution policy (such as Unrestricted).

In this situation, we recommend that:

  • You sign your scripts with a real Authenticode code signing certificate
  • During installation, you change the execution policy to AllSigned if it is currently Restricted. If it is anything but Restricted, leave it alone.
  • DO NOT modify the user's Certificate Store. They will (and should) be prompted the first time they run your scripts to ensure they trust your signing certificate.
  • If your installer supports silent installation, offer an option to not modify the execution policy.

To clarify the "leave it alone" point -- a non-default execution policy has been put there for a reason. While changing the execution policy to AllSigned might increase security, it is more likely that you will inadvertently break other installed products, or parts of the user's IT infrastructure that depend on unsigned scripts.

If you do not have a code signing certificate, we recommend getting one! If you cannot, then your installer should prompt the user to change the execution policy to RemoteSigned if it is currently Restricted. If it is anything but Restricted, leave it alone.

“<This product> includes scripts that help you <manage Active Directory, etc.> Your current PowerShell script execution policy will prevent these scripts from running. Would you like to update the execution policy to allow these scripts to run?” ([Yes – Default] / No / Cancel)
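Sketched in PowerShell, the installation-time logic above looks something like this. The structure is illustrative (your installer may well be native code calling the engine APIs), but Get-ExecutionPolicy and Set-ExecutionPolicy are the real cmdlets:

```powershell
## Illustrative sketch of the recommended installer logic: only tighten a
## default "Restricted" policy. Any other value was configured deliberately
## by an administrator, so leave it alone.
$currentPolicy = Get-ExecutionPolicy

if($currentPolicy -eq "Restricted")
{
    ## AllSigned if you ship signed scripts; if you cannot sign them,
    ## prompt the user first and use RemoteSigned instead.
    Set-ExecutionPolicy AllSigned
}
```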

Your product itself depends on PowerShell scripts

In this case, PowerShell scripts are simply an implementation language, just like C#, C++, or assembly language. Since you are only running scripts that you wrote, use the PSExecutionPolicyPreference environment variable to change the execution policy for the scope of your application only. RemoteSigned is the best option in this situation, so that your application doesn't need to prompt the user. If your application offers the ability to run arbitrary scripts (or launch a PowerShell console), ensure that you clear your PSExecutionPolicyPreference variable before doing so.
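For example (a sketch; the application path and script name are hypothetical):

```powershell
## Child PowerShell processes read PSExecutionPolicyPreference from the
## environment at startup and use it as their execution policy -- the
## machine-wide setting is never touched.
$env:PSExecutionPolicyPreference = "RemoteSigned"

## Run a script that ships with your application (hypothetical path)
powershell.exe -Command "& 'C:\Program Files\MyApp\Initialize-MyApp.ps1'"

## Clear the variable before running anything on the user's behalf,
## such as launching an interactive PowerShell console
Remove-Item Env:\PSExecutionPolicyPreference
```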

Now what about Exchange? Surely they can afford a code signing certificate!

From their design and user research, they know that the vast majority of their user base will write their own scripts, or run scripts from the community. When they do, they will ultimately hit our execution policy warning and be forced to make a decision. The most secure decision they can make is the RemoteSigned execution policy, so the Exchange installer makes that decision on their behalf.


Laughing babies are contagious

Here's a nice light-hearted video if you need a laugh 🙂

Video: William laughing


Syntax Highlighting in PowerShell

Since we just released a CTP of PowerShell V2, I thought I'd share a handy little script to demonstrate one of the new APIs we introduced: Show-ColorizedContent.ps1.

This CTP introduces a new tokenizer API that lets you work with PowerShell script content the same way that our parser does -- as a collection of items (tokens) that represent the underlying structure of that script. Until now, any tool that works with the content of a PowerShell script has had to parse the script on its own -- usually with fragile regular expressions or other means.

This often works, but usually falls apart on complex scripts. Consider this test script:

Write-Host "Write-Host"
Write-Host Write-Host

"Write-Host Write-Host"

$testContent = @"
Write-Host Hello World
"@

In the first line, "Write-Host" is an argument to the Write-Host cmdlet, but gets parsed as a string. Fair enough, but the second line does not treat the argument the same way. In fact, since it matches a cmdlet name, the argument gets parsed as another cmdlet call. In the here-string that follows, the Write-Host cmdlet name gets highlighted again, even though it is really just part of a string.

This is absolutely not a slam on the authors of existing highlighters -- it's just that we've now introduced something that makes life so much easier.

$content = [IO.File]::ReadAllText("c:\temp\ContentTest.ps1")
$errors = [System.Management.Automation.PSParseError[]] @()
[System.Management.Automation.PsParser]::Tokenize($content, [ref] $errors)

This API generates a collection of PSToken objects that give all the information you need to properly dissect a PowerShell script:

PS C:\Temp> [System.Management.Automation.PsParser]::Tokenize($content, [ref] $errors) | ft -auto

Content                           Type Start Length StartLine StartColumn EndLine EndColumn
-------                           ---- ----- ------ --------- ----------- ------- ---------
Write-Host                     Command     0     10         1           1       1        11
Write-Host                      String    11     12         1          12       1        24
...                            NewLine    23      2         1          24       2         1
Write-Host                     Command    25     10         2           1       2        11
Write-Host             CommandArgument    36     10         2          12       2        22
...                            NewLine    46      2         2          22       3         1
...                            NewLine    48      2         3           1       4         1
Write-Host Write-Host           String    50     23         4           1       4        24
...                            NewLine    73      2         4          24       5         1
...                            NewLine    75      2         5           1       6         1
testContent                   Variable    77     12         6           1       6        13
=                             Operator    90      1         6          14       6        15
Write-Host Hello World          String    92     30         6          16       8         3
...                            NewLine   122      2         8           3       9         1

This adds a whole new dimension to the way you can interact with PowerShell. Some natural outcomes are:

  • syntax highlighting
  • preparing a script for production (replacing all aliased commands with their expanded equivalent, etc)
  • script refactoring
  • FxCop / Style guideline checks
  • PowerTab 🙂

As a starter example, I've attached Show-ColorizedContent.ps1 -- a script to colorize PowerShell scripts in a console window. Its primary goal is to support demonstrations of PowerShell snippets. For that, it adds line numbers to let you easily refer to portions of your script. It also includes a -HighlightRanges parameter to let you highlight specific ranges of the script. The -HighlightRanges parameter is an array of line numbers, which you can easily create using PowerShell's standard array range syntax:

PS C:\Temp> .\Show-ColorizedContent.ps1 Show-ColorizedContent.ps1 `
>> -HighlightRanges (4..5+1+30..33)
>>
001 > #requires -version 2.0
002 |
003 | param(
004 >     $filename = $(throw "Please specify a filename."),
005 >     $highlightRanges = @(),
006 |     [System.Management.Automation.SwitchParameter] $excludeLineNumbers)
007 |
008 | # [Enum]::GetValues($host.UI.RawUI.ForegroundColor.GetType()) | % { Write-Host -Fore $_ "$_" }
(...)
026 | $highlightColor = "Green"
027 | $highlightCharacter = ">"
028 |
029 | ## Read the text of the file, and parse it
030 > $file = (Resolve-Path $filename).Path
031 > $content = [IO.File]::ReadAllText($file)
032 > $parsed = [System.Management.Automation.PsParser]::Tokenize($content, [ref] $null) |
033 >     Sort StartLine,StartColumn
034 |
035 | function WriteFormattedLine($formatString, [int] $line)
036 | {
037 |     if($excludeLineNumbers) { return }
038 |
039 |     $hColor = "Gray"
040 |     $separator = "|"

Enjoy -- you can download it here.

Will it Pipe? Brevity and Readability

Scott Hanselman and I recently chatted about using PowerShell for a bit of log analysis. The majority of his solution ended up being quite elegant, flowing into a pipeline nearly as easily as you might speak it:

PS C:\> $re = [regex]"\d{2}(?=[_.])"
PS C:\> Import-Csv file.csv |
    Select File, Hits, @{Name="Show"; Expression={ $re.Matches($_.File)[0] }} |
    Sort Show -Desc |
    Group Show |
    Select Name, { ($_.Group | Measure-Object -Sum Hits).Sum }

(Of course, one should never pronounce a regex aloud in polite company.)

The final expression (the summed Hits) took a little hammering on, but eventually produced the desired results. At this point, I mentioned that I didn't think the final solution was a good demonstration of PowerShell's pipeline power. 80% of it, absolutely. But since the last bit took some fussing, the solution stopped being an example of how easy the pipeline makes everything, and instead became an example of how you could write a pipeline to do anything.

If I was to blog it with the intent of education, I probably would have written a more scripty function:

function Get-ShowHits
{
    $regex = '/hanselminutes_(\d+).*'
    $shows = Import-Csv File.csv | Select File,Hits |
        Group { $_.File -replace $regex,'$1' }

    foreach($show in $shows)
    {
        $showOutput = New-Object System.Management.Automation.PsObject
        $showOutput | Add-Member NoteProperty Name $show.Name
        $showOutput | Add-Member NoteProperty Hits ($show.Group | Measure-Object -Sum Hits).Sum
        $showOutput
    }
}

Get-ShowHits | Sort -Desc Hits

This example illustrates a couple great points about PowerShell:

  • You can write functions and scripts to wrap complex functionality into a more usable form
  • You can write pipelines to easily express powerful object flows (in the $shows line)
  • You can create your own objects with their own properties -- and manipulate them just as easily

But the most important point about these two examples is how easy they are to modify and extend.

Jon Udell keyed in on Scott's post, and in the comments of the two blogs, language comparisons quickly blossomed. Hey, we've been here before!


These types of problems are like mosquito bites for me - I can’t stop itching at them. A better Ruby one-liner. Doesn’t print a header, but not a big deal…

"test.csv").inject( {|h,row| h[row[0][/\d{4}/].to_i] += row[1].to_i;h}.sort.each {|i| puts("#{i[0]}\t#{i[1]}")}

Once you have a solution that works, a natural scripter's passion is to tinker it down to one line. It's no longer educational, intelligible, or extendable, but it's fun. You can do that in PowerShell, too:

$foo = @{}; ipcsv test.csv | % { $foo[0+($_.File -replace '.*?(\d+).*','$1')] += (0+$_.Hits) }; $foo.GetEnumerator() | Sort Value

Mmmm. Pipeline smoke.