jq Is the grep, sed, and awk for JSON

The only problem with jq is that it's not installed by default in Ubuntu or Ubuntu Server. It's not in the default ubuntu-cloudimg either. One must apt-get install jq.

https://stedolan.github.io/jq/manual/ says, “jq is a lightweight and flexible command-line JSON processor.”

In working with juju, we work with JSON-formatted cookies in a ~/.go-cookies file. Sometimes we need to inspect these cookies to develop, verify, and debug our services.

An unexpired cookie value might be as good as a password or authentication token, so for debugging purposes, everything but the value is often good enough. The jq filter '.[]|del(.Value)' strips the .Value property from every object in the input array. This results in:

{
  "Name": "macaroon-a40e7abc65a78faf130dc652d45052c1c8b5b4aeff8181f44a15175b6525558f",
  "Domain": "api.staging.example.com",
  "Path": "/identity/",
  "Secure": false,
  "HttpOnly": false,
  "Persistent": true,
  "HostOnly": true,
  "Expires": "2016-05-09T19:52:21Z",
  "Creation": "2016-04-11T15:52:21.466266522-04:00",
  "LastAccess": "2016-04-11T15:52:21.928768825-04:00",
  "Updated": "2016-04-11T15:52:21.928768825-04:00",
  "CanonicalHost": "api.staging.example.com"
}
{
  "Name": "macaroon-a605d07b7a95ba7e57a267ed507f673bce1188d0de7f544074f1c33ec4a8ff2a",
  "Domain": "www.example.org",
  "Path": "/identity/",
  "Secure": false,
  "HttpOnly": false,
  "Persistent": true,
  "HostOnly": true,
  "Expires": "2016-05-03T21:46:35Z",
  "Creation": "2016-04-05T17:46:36.351842179-04:00",
  "LastAccess": "2016-05-02T15:11:10.525298848-04:00",
  "Updated": "2016-04-05T17:46:36.351842179-04:00",
  "CanonicalHost": "www.example.org"
}
{
  "Name": "macaroon-authn",
  "Domain": "www.example.org",
  "Path": "/NEENR/",
  "Secure": false,
  "HttpOnly": false,
  "Persistent": true,
  "HostOnly": true,
  "Expires": "2016-05-03T19:11:09.794240373Z",
  "Creation": "2016-05-02T15:11:10.592852105-04:00",
  "LastAccess": "2016-05-02T15:23:26.813664654-04:00",
  "Updated": "2016-05-02T15:11:10.592852105-04:00",
  "CanonicalHost": "www.example.org"
}

Now let's say you want to remove the cookie with the Path value "/NEENR/".

The jq filter '.[] | select(.Path!="/NEENR/")' does that job.
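
For reference, run against the cookie file, these invocations look something like the following (assuming, as the examples above imply, that ~/.go-cookies holds a single JSON array of cookies):

jq '.[]|del(.Value)' ~/.go-cookies
jq '.[] | select(.Path!="/NEENR/")' ~/.go-cookies

The second command only prints the cookies that survive the filter; to write them back out as an array, wrap the filter in [] and redirect the output to a new file.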

These examples show filter and map, but what about reduce?

Min, max, min_by and max_by are nice default reducers.

  • min_by(.Expires) shows the next expiring cookie.
  • max_by(.Creation) shows the most recently created cookie.
  • [.[]|.Expires]|max if you don’t care about the rest of the cookie and just want the max date.
  • [.[]|.Expires]|min if you just want the min date.

See the Array Construction section of the manual for the details on the syntax. I like to think of it this way: .[]|.NAME returns elements, and if I want them in an array, I wrap the whole thing in [] for array construction.
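
As full invocations, those reducers look roughly like this against the same file:

jq 'min_by(.Expires)' ~/.go-cookies
jq '[.[]|.Expires]|max' ~/.go-cookies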

jq is a sweet tool that I’m glad to have in my toolbox.

Version from debian/changelog

Almost two years ago I did some scripting to update debian/changelog and build a package, to enable a CI environment for some software. I wanted to parse the changelog correctly, so I copied and modified some Perl from the source of dpkg-buildpackage. This turned out to be the wrong solution.

There is a nice tool called dpkg-parsechangelog. You can get just the version for use in scripts with this simple awk:

dpkg-parsechangelog | awk '/Version/ { print $2 }'
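
If your dpkg is new enough, I believe dpkg-parsechangelog can also print a single field by itself, which skips the awk entirely:

dpkg-parsechangelog --show-field Version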

I didn’t even think to write about it until I ran across someone else who’d written some perl to do exactly the same thing. Dear world, we need to stop reinventing this wheel.

MacGyvering Windows 8.1 Remote Assistance

My Mother called me up rather frazzled this evening.

This isn’t too surprising. Since her stroke 16 years ago she can sometimes become confused or forget simple things, things she once knew.

Tonight, the cause of her frazzled state was her computer.

After listening to her rant and ramble about her computer, I quickly realized that she had a web browser phishing pop-up telling her she had a virus. Partly because of who she is, and partly because of brain damage from the stroke, she called the phone number that the pop-up displayed. When they told her they could fix it for $199, and that Best Buy would charge her $350-$400, it only fueled her worry.

After some calming I finally had her start the Windows Remote Assistance application, but unfortunately she has forgotten what saving files actually means and she has no email configured. So she was unable to save the remote assist file, and she couldn't use Windows Remote Assistance to automatically email the request to me. It was at this point that I suggested she mail the laptop to me. I also may have said, “never again!” when I agreed to support a laptop that someone else gifted her.

But, I couldn’t let it go. This was a challenge and I love a challenge.

I searched around a bit and tried my hand at the msra.exe command line. After a bit of trial and error, I realized I could have her open a PowerShell and type

msra /saveasfile helpme 12345678

Yes, I’m ok with the 12345678 password in this case. Trying some other password over the phone and having her type it was error prone.

“Did you say bee?”

“No I said pee, like Paul.”

“Bee like ball?”

“No…”

I still needed a way to get a file to me. I've had an aversion to PowerShell ever since it launched, despite tech reviewing a very fine PowerShell book. I knew it was probably my best bet at getting a file to me. After a bit of poking I found the invoke-webrequest helper thingy. I don't know PowerShell terminology. It looks like a function to me.

I have my home server on the internet. It's running Ubuntu Linux, and I've had 4-line PHP upload scripts with HTML forms that let people send me files for years. Could I use this?

The shoelace was there. The paperclip was there. Did I also have some bubble gum?

All I really needed was an index.php in a /mom/ directory that looked like this:

<?php
// Dump the raw HTTP request body into err.out.
file_put_contents('err.out', file_get_contents('php://input'));
?>

Wow that is some trivial stuff. Bland bubble gum, I guess.

Why an index.php and a /mom/? Well, because that will be easy for me to relay over the telephone.

I did some testing and found that invoke-webrequest works nicely coupled with this request-body-dumping PHP.

invoke-webrequest -uri jrwren.xmtp/mom/ -infile .\helpme.msrcincident -method post

I was able to call my mom back and tell her to press Windows key-R, reminding her that the Windows key is usually between the Ctrl and Alt keys, and to type powershell and press Enter.

“Powershell, P-O-W-E-R-S-H-E-L-L, no spaces?”

“Yup”

On the first try, I had her use the password 1234, but msra.exe complained that it was too short. Working through this mistake, I tried to have her use the up arrow to edit the previously executed command line in PowerShell.

“What is the up arrow?”

This honestly dumbfounded me and I had absolutely no idea what to do for a minute or so.

“The up arrow on my keyboard is on the right. There is an inverted tee of arrows, left right up down to the left of my left control key.”

Whew, I got lucky and she found it.

Once we had msra.exe create the helpme file, I had her type out the invoke-webrequest command, prompting her to press Tab after typing helpme so it would autocomplete the file extension.

The multiline color output of running the command shocked and surprised her. It maybe even scared her a little bit, but as she was reading it aloud, I heard her say, “200 OK”

“200 OK is great”, I said.

I checked my server and there was an err.out file alongside the index.php, the only two files in the mom directory.

My home server always has samba setup. I used Windows Explorer to navigate to H:\public_html\mom and I renamed err.out to helpme.msrcincident. I double clicked it.

Mom said, “Oh what is this? jrwren wants to share your computer.”

I rejoiced inside.

The hard part being done, I was able to connect and control her computer. Microsoft has done a very nice job with Windows Remote Assistance ever since Windows 7. I'm impressed that my Windows 7 machine can connect flawlessly to her Windows 8.1. I'm thankful that PowerShell is out of the box in all versions of Windows. I do not think I'd have been able to walk her through this over the phone with so few keystrokes without PowerShell.

To the evil con artists who extort money from poor little old disabled ladies who work two jobs: please stop.

Testing Out Apache All By Yourself

By “all by yourself,” I mean without root.

This is on my Mac running OSX 10.10.

  1. Get yourself an httpd.conf – cp /private/etc/apache2/httpd.conf .
  2. Edit it to use a port >1024 and to run as you – Listen 8081 & User jrwren & Group staff
  3. Log to a place you can write – ErrorLog /home/jrwren/errorlog & CustomLog /home/jrwren/access_log combined
  4. Use a different pidfile – PidFile /home/jrwren/httpd.pid. Do this after the Include /private/etc/apache2/extra/httpd-mpm.conf
  5. Set the accept mutex to a place you can write – Mutex file:/home/jrwren
  6. Edit whatever else you want – ProxyPass / http://localhost:8080 & SetOutputFilter DEFLATE to see that the Apache proxy does gzip for you
  7. Start httpd – httpd -d . -f httpd.conf -X
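
Pulled together, the handful of overrides in the copied httpd.conf end up looking roughly like this (a sketch using the same example port, user, and paths as the list above):

Listen 8081
User jrwren
Group staff
ErrorLog /home/jrwren/errorlog
CustomLog /home/jrwren/access_log combined
PidFile /home/jrwren/httpd.pid
Mutex file:/home/jrwren
ProxyPass / http://localhost:8080
SetOutputFilter DEFLATE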

Outlook 2007 autodiscover for the rest of us

Outlook 2007 has a new autodiscover feature which, by default, is provided to it only by Exchange 2007.

I recently thought of giving Outlook a try instead of Windows Live Mail. Of course, the geek in me wanted my Outlook 2007 first run experience to be awesome. I wanted autodiscover to work for me.

It turns out that if you read a KB article and a TechNet article, you can figure out that for 99% of POP/IMAP/SMTP installations, placing an autodiscover.xml file at http://myemail.com/autodiscover/autodiscover.xml or at http://autodiscover.myemail.com/autodiscover/autodiscover.xml is all that is needed. The format of the XML is very simple.

<?xml version="1.0" encoding="utf-8"?>
<Autodiscover xmlns="http://schemas.microsoft.com/exchange/autodiscover/responseschema/2006">
  <Response xmlns="http://schemas.microsoft.com/exchange/autodiscover/outlook/responseschema/2006a">
    <Account>
      <AccountType>email</AccountType>
      <Action>settings</Action>
      <Protocol>
        <Type>IMAP</Type>
        <Server>mail.xmtp.net</Server>
        <Port>993</Port>
        <DomainRequired>off</DomainRequired>
        <SPA>off</SPA>
        <SSL>on</SSL>
        <AuthRequired>on</AuthRequired>
      </Protocol>
      <Protocol>
        <Type>SMTP</Type>
        <Server>mail.xmtp.net</Server>
        <Port>25</Port>
        <DomainRequired>off</DomainRequired>
        <SPA>off</SPA>
        <SSL>on</SSL>
        <AuthRequired>on</AuthRequired>
        <UsePOPAuth>on</UsePOPAuth>
        <SMTPLast>on</SMTPLast>
      </Protocol>
      <Protocol>
        <Type>POP3</Type>
        <Server>mail.xmtp.net</Server>
        <Port>995</Port>
        <DomainRequired>off</DomainRequired>
        <SPA>off</SPA>
        <SSL>on</SSL>
        <AuthRequired>on</AuthRequired>
      </Protocol>
    </Account>
  </Response>
</Autodiscover>


Do that, and the next time an Outlook 2007 client tries to configure itself, it will.
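
A quick way to sanity-check the placement is to fetch the file yourself; something along these lines should return the XML (substitute your own domain):

curl -i http://autodiscover.myemail.com/autodiscover/autodiscover.xml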

Using the MVPS.ORG Hosts file with ISC Bind

I run my own DNS in our home. You may think this is crazy, but every test I have run shows that nearly all ISPs provide substandard DNS to their customers. Even the best DNS servers I have tested are only responsive about 95% of the time. With the number of DNS lookups you do, you could be losing seconds or minutes per day waiting for timeouts and re-requests.

MVPS.org maintains a list of “known bad domains”. While it is certainly not a replacement for other security measures, it's another line of defense, another tool in the bag. For more reasons, read their site.

http://www.mvps.org/winhelp2002/hosts.htm

I don't want to maintain hosts files on all of my home systems and all of the VMs too; I'd like to just tell my DNS server about these hosts and have it do the right thing.

By downloading the hosts file and using this little boo script to map it into bind configuration, I have done just that. I use include files with bind; I've added a line like this to my /etc/bind/named.conf.local on my Ubuntu server:

include "/etc/bind/named.conf.mvps";

Then, I’ve added the output of this boo script to the /etc/bind/named.conf.mvps file. Reload bind and everything is done.

import System.IO
for line as string in [line for line in  @/\n|\r\n/.Split( File.OpenText("HOSTS").ReadToEnd() ) if (not line.StartsWith("#") and line!=string.Empty and not line.Contains("localhost"))]:
    fields = @/ +/.Split(line)
    if (fields.Length > 1):
      host = fields[1]
      print "zone \"${host}\" { type master; file \"/etc/bind/db.local\"; };"
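
If boo isn't your thing, I believe an awk one-liner along these lines produces the same zone stanzas (a rough sketch, not tested against every quirk of the HOSTS file format):

awk '!/^#/ && NF > 1 && $2 != "localhost" { printf "zone \"%s\" { type master; file \"/etc/bind/db.local\"; };\n", $2 }' HOSTS

Redirect that output to /etc/bind/named.conf.mvps and reload bind as described above.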

Beforehand, host resolution looked like this:

$ host ad.a8.net
ad.a8.net has address 203.190.224.60

After reloading bind, it looks like this:

$ host ad.a8.net
ad.a8.net has address 127.0.0.1
ad.a8.net has IPv6 address ::1

RegexReplace in SQL Server

My last post was about SQL Server. Even 10+ years later, I'm continually amazed by features that are daily-use features in MySQL and PostgreSQL yet missing from MS SQL Server.

Regular expression matching and replacing is a severely lacking feature.

SQL Server 2005 introduced a means to write user-defined functions in .NET code, so this method is compiled, uploaded to SQL Server, and exposed as a function.

using System.Data.SqlTypes;

[Microsoft.SqlServer.Server.SqlFunction]
public static SqlString RegexReplace(SqlString input, SqlString pattern, SqlString replacement)
{
    // Regex.Replace does the work; SqlString wraps the nvarchar parameters and return value.
    var result = System.Text.RegularExpressions.Regex.Replace(input.Value, pattern.Value, replacement.Value);
    return new SqlString(result);
}
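
For completeness, registering the compiled assembly and exposing the method looks roughly like this; the assembly name, path, and class name are placeholders for whatever you actually built:

CREATE ASSEMBLY RegexHelpers FROM 'C:\path\to\RegexHelpers.dll' WITH PERMISSION_SET = SAFE;
GO
CREATE FUNCTION dbo.RegexReplace(@input nvarchar(max), @pattern nvarchar(max), @replacement nvarchar(max))
RETURNS nvarchar(max)
AS EXTERNAL NAME RegexHelpers.[UserDefinedFunctions].RegexReplace;
GO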


Now you can use it in select statements.

SELECT Name, dbo.RegexReplace(Name, N'(?:\d+-)?\d+PPM', N'') AS e
FROM Table2

Or use it to update tables

update table2 set name=dbo.RegexReplace(Name, N'(?:\d+-)?\d+PPM', N'')

I have absolutely no idea how any database developer could live without this kind of functionality.

SQL Server Command Line Administration

I don’t have SQL Management Studio installed. Perhaps I should install it, it would make my life easier.

Here are the commands I used to create a new database and add myself to it as administrator. I want to own the database too. I don’t want to have to be sa (root for you mysql folk) just to be able to create tables. I want to delegate the ownership of this database instance (not the whole sql server) to a non-admin user. This is just for me, for dev on my laptop.

C:\>osql -E -S .\SQLEXPRESS -V 2 -Q "create database test2"
C:\>osql -E -S .\SQLEXPRESS -V 2 -Q "use test2;exec sp_changedbowner [theknife\jrwren]"
C:\>osql -E -S .\SQLEXPRESS -V 2 -Q "exec sp_configure 'clr enabled', 1"
Configuration option 'clr enabled' changed from 0 to 1. Run the RECONFIGURE
statement to install.
C:\>osql -E -S .\SQLEXPRESS -V 2 -Q "reconfigure"

Finding this set of commands was the most frustrating experience that I have had in a VERY long time. It made me long for PostgreSQL or MySQL.

Mono trunk on Ubuntu Jaunty

With the release of Ubuntu 9.04 aka Jaunty later this month, I thought I’d share how to get a current version of Mono on the latest Ubuntu release.

For a number of reasons, Ubuntu always seems to be just a little behind current with its Mono packages. The largest reason, IMO, is the difficulty of properly packaging Mono for Debian/Ubuntu. I've tried, and it is not easy.

  1. Make our install a tiny developer environment.
    sudo apt-get install build-essential subversion autoconf libtool bison gettext pkg-config libglib2.0-dev
  2. Install the old Mono 2.0 C# compiler from jaunty so that we can bootstrap the trunk compiler.
    sudo apt-get install mono-mcs
    sudo ln -s /usr/bin/mcs1 /usr/bin/mcs
    This installs enough mono to let the C# compiler run. We will remove this and all its dependencies later.
  3. Fetch the source from SVN. I use a mono-dev-update script to checkout the first time and keep me updated. The script pulls and installs mono, mcs, xsp to /opt/mono on a default ubuntu server install once step 1 above has been performed. To build more parts of mono such as GTK# and MonoDevelop you will need more gtk+ library dependencies.
    sudo mkdir -p /opt/mono/src
    sudo chown $USER /opt/mono /opt/mono/src
    ./mono-dev-update
  4. Remove the bootstrap mcs
    sudo apt-get remove binfmt-support cli-common libmono-corlib1.0-cil libmono-corlib2.0-cil libmono-i18n1.0-cil libmono-i18n2.0-cil libmono-security2.0-cil libmono-system1.0-cil libmono-system2.0-cil libmono0 mono-2.0-gac mono-2.0-runtime mono-common mono-gac mono-jit mono-mcs mono-runtime

The real work is done by the mono-dev-update script. It's a mess of bash script which has served me well for a couple of years now.

I’ll paste the current version of it here, but I’ll also try to keep it up to date for download here.

#!/bin/bash
#mono-dev-update
#Jay R. Wren <jrwren@xmtp.net>
#usage:
#   -s      skip svn operations. don't try to pull updates.
#   -f      force autogen.sh or configure to run.
#   -l n    set build level to value of 'n'. This controls optional package 
#           builds. Not used much
#   -n      no auto retry. if make fails, do not try to ./configure and 
#           make again.

MONO_PREFIX=/opt/mono
#GNOME_PREFIX=/opt/gnome
GNOME_PREFIX=/usr
export LD_LIBRARY_PATH=$MONO_PREFIX/lib:$LD_LIBRARY_PATH
export C_INCLUDE_PATH=$MONO_PREFIX/include:$GNOME_PREFIX/include
export ACLOCAL_PATH=$MONO_PREFIX/share/aclocal
export PKG_CONFIG_PATH=$MONO_PREFIX/lib/pkgconfig:$GNOME_PREFIX/lib/pkgconfig
PATH=$MONO_PREFIX/bin:$PATH
export MANPATH=$MANPATH:$MONO_PREFIX/share/man
if [[ ! -d $MONO_PREFIX/src ]];then mkdir $MONO_PREFIX/src; fi
pushd $MONO_PREFIX/src

SVNDIRS="mcs mono xsp monodevelop gtk-sharp gtkmozembed-sharp type-reflector debugger banshee-sample-plugin olive mono-tools" 
CVSDIRS="banshee"
MAKEDIRS="mono xsp debugger mono-tools monodevelop gtkmozembed-sharp "
BANSCHEEPLUGINS="banshee-sample-plugin" 

level=0
skipsvn=0

optCount=0;
while getopts :xvfsnd:t:cl:m: OPTS ;do
	if [[ $OPTS == "x" ]]; then outputStyle=xml ; optCount=$((optCount+1)) ; fi
	if [[ $OPTS == "v" ]]; then action=view ; optCount=$((optCount+1)) ; fi
	if [[ $OPTS == "f" ]]; then forceAGen="true" ; optCount=$((optCount+1)) ; fi
	if [[ $OPTS == "s" ]]; then skipsvn=1 ; optCount=$((optCount+1)) ; fi
	if [[ $OPTS == "n" ]]; then noautoretry="true" ; optCount=$((optCount+1)) ; fi
	if [[ $OPTS == "d" ]]; then DEBUG="$OPTARG" ; optCount=$((optCount+2)) ; fi
	if [[ $OPTS == "t" ]]; then transform="$OPTARG" ; optCount=$((optCount+2)) ; fi
	if [[ $OPTS == "c" ]]; then for i in $MAKEDIRS;do pushd $i ; make clean ; popd ; done ;exit ; optCount=$((optCount+1)) ; fi
	if [[ $OPTS == "l" ]]; then level="$OPTARG" ; optCount=$((optCount+2)) ; fi
	if [[ $OPTS == "m" ]]; then MAKEDIRS="$OPTARG" ; optCount=$((optCount+2));fi
done
if [[ $optCount != $((OPTIND-1)) ]] ; then echo "optcount($optCount) not equal to OPTIND($OPTIND)";fi

echo "using makedirs $MAKEDIRS"

echo "using prefix $MONO_PREFIX"

if [[ 1 > $skipsvn ]]; then
for i in $SVNDIRS
do
	echo -e "\e[1;31m pushd $i \e[m"
	if [[ -d $i ]];then
		pushd $i 
        if [[ ! -d .svn ]];then pushd .. ; svn co http://anonsvn.mono-project.com/source/trunk/$i ;popd; fi
		echo -e "\e[1;31m svn info"
		svn info
		echo 'svn log -r `svn info|grep Revision|cut -f2 -d' '`:HEAD'
		echo -e "\e[m"
		svn log -r `svn info|grep Revision|cut -f2 -d' '`:HEAD
		echo svn up
		nice -n 20 svn up 
		if [ $? != 0 ]; then echo "ERROR: $?" ;popd ; break; fi
		pwd
		popd
	fi
done
else
	echo "-s detected, skipping svn update"
fi 


#mono, xsp, MD, GTK# etc
function autogenAndMake () {
	echo "running for $*"
	for i in $*
	do
		if [[ -d $i ]]; then
			echo -e "\e[1;31m pushd $i \e[m"
			pushd $i 
			if [[ -f "autogen.sh" ]];then
				PROG=./autogen.sh
			else
				PROG=./configure
			fi
			if [[ "true" != $forceAGen ]]; then
				nice -n 20 make && nice -n 20 make install  
			fi

			if [[ "true" != $noautoretry && $? != 0 || "true" == $forceAGen ]]; then 
				echo -e "\e[1;31m "
				echo 'make clean ; $PROG --prefix=$MONO_PREFIX --enable-aspnet --enable-aspnetedit --with-preview --with-moonlight && nice -n 20 make && nice -n 20 make install'
				echo -e "\e[m"
				if [[ -f Makefile ]]; then make clean; fi
				$PROG --prefix=$MONO_PREFIX --enable-aspnet --enable-aspnetedit --with-preview --with-moonlight && nice -n 20 make && nice -n 20 make install || break
			fi
			popd 
		fi
	done 
}

#-l not  speced or even means build all - odd means build only banshee & olive
if [[ $((level % 2)) == 0 ]]; then 
autogenAndMake $MAKEDIRS
fi

if [[ $level > 1 ]]; then
	i=gtk-sharp
			echo -e "\e[1;31m pushd $i \e[m"
			pushd $i
			echo -e "\e[1;31m "
			echo 'make clean ; ./bootstrap-2.12 --prefix=$MONO_PREFIX  && nice -n 20 make && nice -n 20 make install'
			echo -e "\e[m"
			if [[ -f Makefile ]]; then make clean ; fi
			./bootstrap-2.12 --prefix=$MONO_PREFIX  && nice -n 20 make && nice -n 20 make install || break
	popd
fi

if [[ $level > 1 ]];then
	#olive
	for i in olive 
	do 
		if [[ -d $i ]] ; then
			echo -e "\e[1;31m pushd $i \e[m"
			pushd $i
			echo -e "\e[1;31m "
			echo 'make clean ; ./configure  --prefix=$MONO_PREFIX  && nice -n 20 make && nice -n 20 make install'
			echo -e "\e[m"
			if [[ -f Makefile ]]; then make clean ; fi
			./configure  --prefix=$MONO_PREFIX  && nice -n 20 make && nice -n 20 make install || break
			popd
		fi
	done
fi

if [[ $level > 1 ]];then
	#banshee
	if [[ -d banshee ]];then
	pushd banshee
		cvs up
		./configure --prefix=$MONO_PREFIX --disable-helix && nice -n 20 make && nice -n 20 make install || break
	popd
	autogenAndMake $BANSCHEEPLUGINS
	fi
fi

popd

Mix09 Downloadable Content – All Days

I've seen it posted a couple of times now, but each post just shows a series of links which I am required to click. I'm not savvy enough to figure out plugins like “DownloadThemAll”, so I wrote this script.

The script throws all of the day one content into a directory called day1, but for days two and three, I thought it would add to the feeling of being there if I broke them out into time slots. Within the day2 and day3 directory there are directories for the start time of each session.

#these first sessions are snagged from: http://blogs.msdn.com/nigel/archive/2009/03/20/mix09-day-1-sessions-available-now.aspx
#day1
mkdir day1
pushd day1
#add key01 to this list if you want the keynote
for i in t66m t71m t25f t40f t19f c10f c24f t04f t41f t72m t65m c28m t12f t61f t79m t09f t05f b02m t46f t45f t24f b04m t43f c02f c12f c27m t38f t07f t14f t26f ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd

#the rest are snatched from : https://content.visitmix.com/2009/Sessions/
#missing from the above originally were t10f, c19f, t07f, t37f

mkdir day2
pushd day2
mkdir 10.30am
pushd 10.30am
for i in t17f c09f t52f c14f c04f c18f c08f t09f t84m t75m t76m
do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd
mkdir 1pm;pushd 1pm;
for i in t33f t51f t20f t56f t23f t31f t54f t01f b05m t85m b01m  ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd
mkdir 2.30pm;pushd 2.30pm;
for i in c05f t12f c01f c03f t28f t49f c22f t02f b03m t82m t74m ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd
mkdir 4.15pm;pushd 4.15pm;
for i in t48f c21f t50f t13f c13f t60f c23f t03f t69m t86m t68m ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd
popd

mkdir day3
pushd day3
mkdir 9am;pushd 9am;
for i in t39f t59f t44f t42f t18f c17f t11f t06f t67m t78m t81m ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd
mkdir 10.45am;pushd 10.45am;
for i in c20f t27f t62f t58f t29f c07f t35f c16f t63m t64m t80m ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd
mkdir 12.30pm;pushd 12.30pm;
for i in t32f t55f t22f c26f t34f t36f t30f t16f c29m t77m c30m ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd
mkdir 2pm;pushd 2pm;
for i in c15f t57f t47f t15f c06f c11f t87f t53f t83m t70m t73m ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd

popd

You can grab this script here: http://workwith.net/scripts/mix09fetch.sh

It requires only a Cygwin install with bash and curl.
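
One note: the script assumes a curlget helper is on your PATH. If you don't have one, a thin wrapper along these lines should do the job (my guess at the minimum it needs: fetch a URL, keep the remote filename, and resume if interrupted):

curlget () { curl -C - -L -O "$1"; }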