Windows 7 DhcpNodeType Is Still a Bummer: How I returned from TechEd09 to a broken network

https://connect.microsoft.com/windows7/feedback/ViewFeedback.aspx?FeedbackID=452703

I use file sharing on my home network quite a bit. I don’t have a large hard drive in my laptop. I use network shares at home to watch video downloads (all Mix08, PDC08, Mix09 content!) and store some personal files.

I don’t use Active Directory. I don’t use Windows Servers. Why would I do that at home? I’m a home user after all.

Enter Windows Behavior Fail. I connected to exactly three Wi-Fi networks while I was away at TechEd09: TechEd, Comfort Inn, and T-Mobile @ LAX. One of them used DHCP to force my node into NetBIOS name resolution peer-to-peer (p-node) mode, rather than its normal hybrid mode. This means that instead of broadcasting for a NetBIOS name, my client host would talk only to the specified WINS server on the network.

Except I don’t have a WINS server on my network, which effectively disables NetBIOS name resolution. I came back home, connected to my home network (wired or Wi-Fi, it makes no difference), and I had no access to my file shares.

I’ll bet a Mac would never do this.

So, if you ever use Windows (not just if you call yourself a Windows user, mind you), please click the link at the beginning of this post and beg Microsoft to change this behavior. I can understand disabling WINS while I am on public Wi-Fi. Heck, I might even recommend it to some customers who run public Wi-Fi. But for the setting not to revert once I am back on a different network, one which does not set this DHCP option, is simply unacceptable.

Of course, the network administrator’s workaround is to forcefully set this DHCP option to hybrid. That is fine for savvy network administrators, but it is NOT fine for the rest of us with Linksys routers doing the job of DHCP.

Incidentally, if you happen to run ISC DHCP (which I highly recommend), you can send this setting to your DHCP clients with this setting in your subnet block:

option netbios-node-type 8;
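In context, a full subnet block might look like the sketch below. The addresses are purely illustrative; only the netbios-node-type line is the point here (1 = broadcast, 2 = point-to-point, 4 = mixed, 8 = hybrid):

```
subnet 192.168.1.0 netmask 255.255.255.0 {
    range 192.168.1.100 192.168.1.200;
    option routers 192.168.1.1;
    option domain-name-servers 192.168.1.1;
    # 8 = hybrid (h-node): consult WINS first, then fall back to broadcast
    option netbios-node-type 8;
}
```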

I hate to rant like this immediately after TechEd09 (which was an awesome event), but it was the most pressing issue on my return home.

Mono trunk on Ubuntu Jaunty

With the release of Ubuntu 9.04 aka Jaunty later this month, I thought I’d share how to get a current version of Mono on the latest Ubuntu release.

For a number of reasons, Ubuntu always seems to be just a little behind current with its Mono packages. The largest reason, IMO, is the difficulty of properly packaging Mono for Debian/Ubuntu. I’ve tried, and it is not easy.

  1. Turn our install into a minimal developer environment.
    sudo apt-get install build-essential subversion autoconf libtool bison gettext pkg-config libglib2.0-dev
  2. Install the old Mono 2.0 C# compiler from Jaunty so that we can bootstrap the trunk compiler.
    sudo apt-get install mono-mcs
    sudo ln -s /usr/bin/mcs1 /usr/bin/mcs
    This installs enough mono to let the C# compiler run. We will remove this and all its dependencies later.
  3. Fetch the source from SVN. I use a mono-dev-update script to check out the first time and keep me updated. The script pulls and installs mono, mcs, and xsp to /opt/mono on a default Ubuntu server install, once step 1 above has been performed. To build more parts of Mono, such as GTK# and MonoDevelop, you will need more gtk+ library dependencies.
    sudo mkdir -p /opt/mono/src
    sudo chown $USER /opt/mono /opt/mono/src
    ./mono-dev-update
  4. Remove the bootstrap mcs
    sudo apt-get remove binfmt-support cli-common libmono-corlib1.0-cil libmono-corlib2.0-cil libmono-i18n1.0-cil libmono-i18n2.0-cil libmono-security2.0-cil libmono-system1.0-cil libmono-system2.0-cil libmono0 mono-2.0-gac mono-2.0-runtime mono-common mono-gac mono-jit mono-mcs mono-runtime

The real work is done by the mono-dev-update script. It’s a mess of bash that has served me well for a couple of years now.

I’ll paste the current version of it here, but I’ll also try to keep it up to date for download here.

#!/bin/bash
#mono-dev-update
#Jay R. Wren <jrwren@xmtp.net>
#usage:
#   -s      skip svn operations. don't try to pull updates.
#   -f      force autogen.sh or configure to run.
#   -l n    set build level to value of 'n'. This controls optional package 
#           builds. Not used much
#   -n      no auto retry. if make fails, do not try to ./configure and 
#           make again.

MONO_PREFIX=/opt/mono
#GNOME_PREFIX=/opt/gnome
GNOME_PREFIX=/usr
export LD_LIBRARY_PATH=$MONO_PREFIX/lib:$LD_LIBRARY_PATH
export C_INCLUDE_PATH=$MONO_PREFIX/include:$GNOME_PREFIX/include
export ACLOCAL_PATH=$MONO_PREFIX/share/aclocal
export PKG_CONFIG_PATH=$MONO_PREFIX/lib/pkgconfig:$GNOME_PREFIX/lib/pkgconfig
PATH=$MONO_PREFIX/bin:$PATH
export MANPATH=$MANPATH:$MONO_PREFIX/share/man
if [[ ! -d $MONO_PREFIX/src ]];then mkdir $MONO_PREFIX/src; fi
pushd $MONO_PREFIX/src

SVNDIRS="mcs mono xsp monodevelop gtk-sharp gtkmozembed-sharp type-reflector debugger banshee-sample-plugin olive mono-tools" 
CVSDIRS="banshee"
MAKEDIRS="mono xsp debugger mono-tools monodevelop gtkmozembed-sharp "
BANSCHEEPLUGINS="banshee-sample-plugin" 

level=0
skipsvn=0

optCount=0;
while getopts :xvfsnd:t:cl:m: OPTS ;do
	if [[ $OPTS == "x" ]]; then outputStyle=xml ; optCount=$((optCount+1)) ; fi
	if [[ $OPTS == "v" ]]; then action=view ; optCount=$((optCount+1)) ; fi
	if [[ $OPTS == "f" ]]; then forceAGen="true" ; optCount=$((optCount+1)) ; fi
	if [[ $OPTS == "s" ]]; then skipsvn=1 ; optCount=$((optCount+1)) ; fi
	if [[ $OPTS == "n" ]]; then noautoretry="true" ; optCount=$((optCount+1)) ; fi
	if [[ $OPTS == "d" ]]; then DEBUG="$OPTARG" ; optCount=$((optCount+2)) ; fi
	if [[ $OPTS == "t" ]]; then transform="$OPTARG" ; optCount=$((optCount+2)) ; fi
	if [[ $OPTS == "c" ]]; then for i in $MAKEDIRS;do pushd $i ; make clean ; popd ; done ;exit ; optCount=$((optCount+1)) ; fi
	if [[ $OPTS == "l" ]]; then level="$OPTARG" ; optCount=$((optCount+2)) ; fi
	if [[ $OPTS == "m" ]]; then MAKEDIRS="$OPTARG" ; optCount=$((optCount+2));fi
done
if [[ $optCount != $((OPTIND-1)) ]] ; then echo "optcount($optCount) not equal to OPTIND($OPTIND)";fi

echo "using makedirs $MAKEDIRS"

echo "using prefix $MONO_PREFIX"

if [[ 1 > $skipsvn ]]; then
for i in $SVNDIRS
do
	echo -e "\e[1;31m pushd $i \e[m"
	if [[ -d $i ]];then
		pushd $i 
        if [[ ! -d .svn ]];then pushd .. ; svn co http://anonsvn.mono-project.com/source/trunk/$i ;popd; fi
		echo -e "\e[1;31m svn info"
		svn info
		echo 'svn log -r `svn info|grep Revision|cut -f2 -d' '`:HEAD'
		echo -e "\e[m"
		svn log -r `svn info|grep Revision|cut -f2 -d' '`:HEAD
		echo svn up
		nice -n 20 svn up 
		if [ $? != 0 ]; then echo "ERROR: $?" ;popd ; break; fi
		pwd
		popd
	fi
done
else
	echo "-s detected, skipping svn update"
fi 


#mono, xsp, MD, GTK# etc
function autogenAndMake () {
	echo "running for $*"
	for i in $*
	do
		if [[ -d $i ]]; then
			echo -e "\e[1;31m pushd $i \e[m"
			pushd $i 
			if [[ -f "autogen.sh" ]];then
				PROG=./autogen.sh
			else
				PROG=./configure
			fi
			if [[ "true" != $forceAGen ]]; then
				nice -n 20 make && nice -n 20 make install  
			fi

			if [[ "true" != $noautoretry && $? != 0 || "true" == $forceAGen ]]; then 
				echo -e "\e[1;31m "
				echo 'make clean ; $PROG --prefix=$MONO_PREFIX --enable-aspnet --enable-aspnetedit --with-preview --with-moonlight && nice -n 20 make && nice -n 20 make install'
				echo -e "\e[m"
				if [[ -f Makefile ]]; then make clean; fi
				$PROG --prefix=$MONO_PREFIX --enable-aspnet --enable-aspnetedit --with-preview --with-moonlight && nice -n 20 make && nice -n 20 make install || break
			fi
			popd 
		fi
	done 
}

#-l not specified or even means build all; odd means build only banshee & olive
if [[ $((level % 2)) == 0 ]]; then 
autogenAndMake $MAKEDIRS
fi

if [[ $level > 1 ]]; then
	i=gtk-sharp
			echo -e "\e[1;31m pushd $i \e[m"
			pushd $i
			echo -e "\e[1;31m "
			echo 'make clean ; ./bootstrap-2.12 --prefix=$MONO_PREFIX  && nice -n 20 make && nice -n 20 make install'
			echo -e "\e[m"
			if [[ -f Makefile ]]; then make clean ; fi
			./bootstrap-2.12 --prefix=$MONO_PREFIX  && nice -n 20 make && nice -n 20 make install || exit 1
	popd
fi

if [[ $level > 1 ]];then
	#olive
	for i in olive 
	do 
		if [[ -d $i ]] ; then
			echo -e "\e[1;31m pushd $i \e[m"
			pushd $i
			echo -e "\e[1;31m "
			echo 'make clean ; ./configure  --prefix=$MONO_PREFIX  && nice -n 20 make && nice -n 20 make install'
			echo -e "\e[m"
			if [[ -f Makefile ]]; then make clean ; fi
			./configure  --prefix=$MONO_PREFIX  && nice -n 20 make && nice -n 20 make install || break
			popd
		fi
	done
fi

if [[ $level > 1 ]];then
	#banshee
	if [[ -d banshee ]];then
	pushd banshee
		cvs up
		./configure --prefix=$MONO_PREFIX --disable-helix && nice -n 20 make && nice -n 20 make install || exit 1
	popd
	autogenAndMake $BANSCHEEPLUGINS
	fi
fi

popd

Don’t mix Any CPU and x86 builds and forget about it

You will get a very stupid error like this:

System.BadImageFormatException : Could not load file or assembly or one of its dependencies. An attempt was made to load a program with an incorrect format.

In my case, I was getting it in a TestFixtureSetUp method.

DUH. I had set the class library project to target x86, because I’m on x64 but I wanted to use edit and continue. I never changed the test project, so it was still running as x64. It failed with a BadImageFormatException when it loaded the referenced x86 project.

Don’t be stupid like me.
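One cheap way to catch this kind of mismatch before the test runner does is to inspect the PE headers of your build output. On Cygwin or Linux, the `file` utility reports PE32+ for 64-bit images and PE32 for everything else; this little sketch (a helper of my own, not part of any toolchain) classifies that output. Note that it cannot tell an x86-forced assembly from an Any CPU one; for that you need corflags.exe from the Windows SDK.

```shell
# classify: bucket a `file` description of a .NET/PE binary by bitness.
# PE32+ is a 64-bit image; plain PE32 is 32-bit or Any CPU.
classify() {
    case "$1" in
        *PE32+*) echo x64 ;;
        *PE32*)  echo x86-or-anycpu ;;
        *)       echo unknown ;;
    esac
}

# Usage sketch: flag every assembly in the build output.
# for f in bin/Debug/*.dll bin/Debug/*.exe; do
#     echo "$f: $(classify "$(file -b "$f")")"
# done
```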

Mix09 Downloadable Content – All Days

I’ve seen it posted a couple of times now, but each post just shows a series of links which I am required to click. I’m not savvy enough to figure out plugins like “DownloadThemAll”, so I wrote this script.

The script throws all of the day-one content into a directory called day1, but for days two and three, I thought it would add to the feeling of being there if I broke them out into time slots. Within the day2 and day3 directories there are directories for the start time of each session.
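The script uses the same `curlget` helper as the day-one post below. If you don’t already have it on your path, a shell function along these lines will do (the cookies.txt jar comes from that post; it may not even be required):

```shell
# curlget: fetch one URL, following redirects (-L), resuming any
# partial download (-C -), and saving under the remote name (-O).
curlget() {
    curl -L -g -C - -b cookies.txt -O "$1"
}
```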

#these first sessions are snagged from: http://blogs.msdn.com/nigel/archive/2009/03/20/mix09-day-1-sessions-available-now.aspx
#day1
mkdir day1
pushd day1
#add key01 to this list if you want the keynote
for i in t66m t71m t25f t40f t19f c10f c24f t04f t41f t72m t65m c28m t12f t61f t79m t09f t05f b02m t46f t45f t24f b04m t43f c02f c12f c27m t38f t07f t14f t26f ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd

#the rest are snatched from : https://content.visitmix.com/2009/Sessions/
#missing from the above originally were t10f, c19f, t07f, t37f

mkdir day2
pushd day2
mkdir 10.30am
pushd 10.30am
for i in t17f c09f t52f c14f c04f c18f c08f t09f t84m t75m t76m
do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd
mkdir 1pm;pushd 1pm;
for i in t33f t51f t20f t56f t23f t31f t54f t01f b05m t85m b01m  ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd
mkdir 2.30pm;pushd 2.30pm;
for i in c05f t12f c01f c03f t28f t49f c22f t02f b03m t82m t74m ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd
mkdir 4.15pm;pushd 4.15pm;
for i in t48f c21f t50f t13f c13f t60f c23f t03f t69m t86m t68m ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd
popd

mkdir day3
pushd day3
mkdir 9am;pushd 9am;
for i in t39f t59f t44f t42f t18f c17f t11f t06f t67m t78m t81m ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd
mkdir 10.45am;pushd 10.45am;
for i in c20f t27f t62f t58f t29f c07f t35f c16f t63m t64m t80m ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd
mkdir 12.30pm;pushd 12.30pm;
for i in t32f t55f t22f c26f t34f t36f t30f t16f c29m t77m c30m ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd
mkdir 2pm;pushd 2pm;
for i in c15f t57f t47f t15f c06f c11f t87f t53f t83m t70m t73m ; do
        curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd

popd

You can grab this script here: http://workwith.net/scripts/mix09fetch.sh

It requires only a Cygwin install with bash and curl.
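Since the URL scheme is uniform, you can also generate the URLs yourself and feed them to any downloader you like. This hypothetical helper (mix09url is my name for it, nothing official) maps a session code to its download URL, and the commented pipeline shows one way to fetch a few files in parallel:

```shell
# mix09url: map a Mix09 session code (e.g. t66m) to its WMV-HQ URL.
mix09url() {
    echo "http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$1.wmv"
}

# e.g. four parallel downloads via xargs:
# printf '%s\n' t66m t71m t25f t40f \
#   | while read s; do mix09url "$s"; done \
#   | xargs -n1 -P4 curl -L -C - -O
```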

Mix09 Downloadable Content – Day1

Nigel Parker put together an awesome list of all the day-one Mix content that is downloadable.

http://blogs.msdn.com/nigel/archive/2009/03/20/mix09-day-1-sessions-available-now.aspx

I want to watch it all over the coming weeks (or months), but I don’t like clicking, so I put together this little shell script to pull it for me.

curlget="curl -L -g -C - -b cookies.txt -O"
#day1
mkdir day1
pushd day1
#add key01 to this list if you want the keynote
for i in t66m t71m t25f t40f t19f c10f c24f t04f t41f t72m t65m c28m t12f t61f t79m t09f t05f b02m t46f t45f t24f b04m t43f c02f c12f c27m t38f t07f t14f t26f ; do
    $curlget http://mschannel9.vo.msecnd.net/o9/mix/09/wmv-hq/$i.wmv
done
popd

So if you have a Linux system, or Cygwin, or even just bash and curl installed, this script pulls them all for you.

Windows 7

I received an email from a local Unix/Linux user group email list this morning with a link to this “review” of Windows 7.

http://jhansonxi.blogspot.com/2009/02/linux-users-review-of-windows-7-beta.html 

These are my thoughts.

Having upgraded from Vista to Win7 beta on 1.5-year-old crappy hardware, I can assure you that many of the points in this "review" are absolute lies.

"Windows 7 Setup …Like in XP it is single-task based so that every partition edit is immediately applied while most Linux installers queue up a series of operations and then perform them in a batch"
is an absolute lie. It’s as if the reviewer paid no attention to what he was doing.

"The installation and updates took a long time with several reboots"
Exactly the opposite of my experience. After answering all of the questions up front (see previous lies about the installer) on a Friday night, I went to bed and awoke to a rebooted, ready to use Windows 7. There was no login/reboot experience in the install.

In the 2nd paragraph of section 2, Installation, the reviewer babbles on about his awesome Linux setup. And while I agree all of those things are cool, they have nothing to do with a Windows 7 beta review. This gem hidden in that paragraph pretty much sums up the section: "I haven’t tried any of them but …". Yeah, I can tell.

In section 3. Storage, he does point out some VERY good differences between LVM and RAID setup at install time in Windows 7 versus Linux.

In section 4, Encryption, he continues by trashing the difficulty of installing BitLocker. Honestly, this is a great detailed comparison of a feature that I’ve never known anyone to use.

In section 5, he fairly says, "I would have to use the superbar a lot to know if I like it better."

In section 6, he makes a bunch of totally irrelevant points, like the lack of an office suite. He calls Windows Media Player 12 "bloated fatware" compared to foobar2000, Media Player Classic, and VLC (on Ubuntu I use Rhythmbox). That is cute, except that it is a total lie. WMP12 is awesome. It starts fast. It is still the only media player that does play-speed adjustment well. Its only real comparisons on Windows are iTunes, which it smokes, and Winamp. Yes, I still use foobar2000 and Media Player Classic and VLC; depending on which type of media I am playing back, I use all of the above. But one thing that WMP12 is NOT is bloated. It is damn fast.

He continues by pointing out that a DivX MPEG-4 codec doesn’t ship with Windows 7. It’s a fair, true point. WMP doesn’t do a good job of resolving codec problems. Yes, that sucks. Man up and go grab ffdshow from SourceForge: http://ffdshow-tryout.sourceforge.net/ Do not use the K-Lite codec pack like he suggests. It actually replaces better codecs in the DirectShow codec configuration.

Finally he ends section 6 by saying "There are several similar PVRs for Linux" when pointing out Windows Media Center. Seriously??? I think it would be better to just say "I’m stupid and I don’t know what I am talking about." If you have ever set up and run MythTV and compared it to Windows Media Center, you are seriously on drugs. There is no comparison. The ONLY valid comparison is the cost, and the hours you would spend on MythTV are easily worth the dollars spent on WMC. At least make the freetard "I am free to modify it" argument here, because as far as functionality goes, there is no comparison.

In section 7 he confuses WOW64 with 32-bit and 64-bit. I think he thought he was in the 32-bit ODBC setup. It is easy to get confused here, because in 64-bit Windows there is still a system32 directory, and worse, system32 contains 64-bit binaries. There is also a SysWOW64 directory, which contains all the 32-bit compatibility binaries. But again, it shows the ignorance of the reviewer. http://msdn.microsoft.com/en-us/magazine/cc300794.aspx

Same section, next paragraph, there is more ignorance, "The Jet engine is the abandoned offspring of the SQL Server team". Anyone even remotely familiar with the history of product groups at Microsoft knows that this is false.

Section 8 talks about UAC, PC Safeguard and parental controls, but does not really say anything.

I have no idea what Section 9 is even doing in a review article. It tries to make some security point: it talks about an XDG security concern on Linux and tries to convince you that it is a concern on Windows too, with lnk files. Read the paragraph and see if you can find the attack vector. I can’t. It’s definitely not email, because Windows 7 doesn’t ship with an email client, and the suggested download email client, Windows Live Mail, doesn’t let you execute any attachments.

Section 10 starts off saying "I wanted to try recreating the common Linux practice of separating user files from the rest of the OS with /home as a mount point for a separate partition." Guess what? Windows isn’t Linux. If you want to run Linux, run Linux, and do things the Linux way. If you want to run Windows, do things the Windows way. Any Windows administrator will tell you: don’t do what you are trying to do! There is this neat thing called the registry, part of which lives in your home directory. It’s not going to be portable across Windows installs, so don’t bother with that separate home partition. If you want backups, use backups. A separate partition is not getting you anything here. I have no idea what the next three paragraphs have to do with a Windows 7 review.

Section 11 starts off talking about HomeGroups but doesn’t explain them. See the PDC keynote, then read these two paragraphs and tell me what is wrong here. The 3rd paragraph attempts to describe Libraries; again, see the PDC video example and compare it to this paragraph. The remainder of this section tries to declare these new features useless. OK; I know I already use the Library feature. If you don’t like it, don’t use it. I also found another gem: "I think that when a file transfer occurs it’s done under the AlphaUser account and then the ownership is changed on arrival." You think, do you? Shouldn’t you KNOW?

Still in section 11, I just had to address this comment, "A default install of Ubuntu doesn’t include SMB file sharing although it’s easily enabled by adding Samba." Yes, so true, and resetting every password for all of your users so that they can access the file share. That is fun to do every time, isn’t it? Again, the rest of this paragraph talks all about Ubuntu features, which is strange for a review of Windows 7.

Section 12 is the epilogue which I wish I had read first so that I didn’t waste my time feeling compelled to write this commentary. "Windows 7 beta seemed relatively stable but I wasn’t really installing much or putting it under continuous use." Yes, I can tell. You haven’t used it at all.

"I had a lot more problems with the initial release of Vista." Fair enough, it just happens to be the opposite experience I had. In fact, I have had almost no issues, at all. On what many would consider a more unstable configuration: I upgraded from Vista.

"I did manage to crash Explorer a few times without trying but this is a "beta" which is the equivalent to an "alpha" for most other software projects." WOW, did you use this thing at all? This "beta" is more like an RC3 for most other software projects. I installed Win7 a month ago on my ONLY computer, not in a VM like the one in which you tested it. I use it every day, and it is the ONLY computer I use. I do 100% of my work on it.

"This is the reason why many system administrators wait until the first Windows service pack before mass deployments." No, they won’t. They will flock to it, because it’s far better than Vista.

"Some readers may get the impression that I’m against commercial software or closed-source." And he proceeds to give reasons why he is objective. I’ll never consider you objective. But I’ll definitely consider you looney and nuts and wacko.

Finally, my own comments: "reviews" like this aren’t reviews at all. They are comparisons, and they are not valid comparisons. They provide absolutely no value to anyone. They portray Linux as better. I love Linux. I use Linux daily, in a server environment. Comparisons like this suggest that Linux doesn’t need to improve or play catch-up to Windows. I run Windows on my primary-use computer, the laptop I spend 10+ hours a day at, because it is just plain better. Linux is better in so many ways, ways that this article touts. But Windows is also better in many ways. How you use your computer determines which of those ways are most important to you and which is the better choice. I think I just defined pragmatic in there.

Eric Sink doesn’t understand DVCS

Sorry Eric, but that is what I read when you trash Git’s index.

Eric says “One of the best practices I suggest in Source Control HOWTO is to never use a version control feature which encourages you to checkin code which you have never been allowed to compile and test.”

I agree 100% with that “best practice” when using a centralized version control system. But Git, Bazaar and friends are not centralized version control systems. They are decentralized.

With a decentralized version control system, your commits are decoupled from your compiles and tests. Your commits are decoupled from your build server. Your commits are decoupled from any team process or build process system. Remember that “check in” or “commit” in a distributed version control system just means “hey, I might want to come back to this at some point in time”. The key word there being “I”. It’s about you as a single developer. It’s about being selfish. It’s about being able to work without barriers. It’s about being able to make large sweeping changes and then roll back all but some of them, and then merge only those changes which you want into your team’s branch.

Ok, so Eric ends his post saying “That doesn’t mean Git or its index are bad. I’ll agree that "git add -p" is a very powerful feature that has its place. But in this respect, Git is a bit like C.”

Fair enough Eric. Eric knows that this is a powerful tool to be used carefully by a skilled craftsperson. That is what we are as programmers.

If C’s killer feature is casting an int to a pointer, then Ruby’s killer feature is open classes. Sure, it can be a confusing tool to the uninitiated, but ask a well-versed programmer not to use that tool and I would encourage them to call you crazy.

Don’t fear your tools. Master them. They will serve you longer than any fear will.

I’m very glad that Eric says that “no book on this topic can be truly credible these days without covering distributed version control tools”. He says these tools aren’t mainstream yet, but with RubyForge, Linux, and GitHub using Git, and with Ubuntu, MySQL, Mailman, and Squid using Bazaar, DVCS is mainstream.

Using Decentralized Version Control for Your Presentation Demos

Occasionally, I present some slides and demo code at a local user group or event. Sometimes these demos include some code that I write throughout the presentation. By the end of the presentation, there is a bunch of code there that I did not start with.

Problem: next time I give the presentation, invariably, I have forgotten to go back and reset things so that my code files don’t have the code to be written in them.

Solution: use a decentralized version control system for presentation demo code.

It doesn’t matter if it is Bazaar (the best), Git or Mercurial (the others). Use one of them!

Use looks like this:

  1. Prep your demo project with the files to be handled when you start your presentation
  2. bzr init ; bzr add ; bzr commit -m 'initial check in'
  3. Give your presentation. When you are done, run bzr revert. You need not even commit the changes.

Sure, it’s pretty much the same as making a backup, but when you invariably forget to remove the post-presentation written code, you can simply type bzr revert. Typing one command that is not prone to error is a lot easier than doing the “remove folder, rename folder” dance.

For longer presentations like half-day or day-long tutorials, or multi-day classroom training, you can break your code into stages. Just commit at every stage. Then you can bzr revert -r 2 to get to your 2nd-stage code, bzr revert -r 4 to get to the tail end of the day, etc.

You can even use loggerhead to show your class participants differences between revisions, or even just bzr diff output. You will be able to show people a different view of the evolution of your project, which hopefully will give them new understanding and more clarity. This is why we use these tools, right?

Precious Screen Real Estate

Your value has not fallen like the housing market.

Jennifer Marsman retweeted a link to 50 seriously useful Windows 7 tips. I liked number 16 “recover screen space” which suggests using small icons on your taskbar.

This inspired me to dig through the new display control panel and find how to do my favorite “recover screen space” tweak, shrinking window borders.

  1. Right click on your desktop and select personalize
  2. From the bottom row, select the second item “Window Color”
  3. Click the “Advanced appearance settings…” link
  4. The 3rd item in the Item: drop-down (combo box) is Active Window Border, but its size is already 1. Instead, select the 5th item, “Border Padding”, and reduce it to 2 or 1. I like 1.
    (This dialog should look familiar to users of Windows 3.0, it hasn’t changed much since then.)

Real estate is always precious, especially when it is on my screen. Yes, a Border Padding of 1 is still usable on a 15.4” 148 DPI display (that’s 1920×1200).

Updated Life Goals

I was tagged regarding annual goals. I’ll just call them life goals, since I am not going to tie them to a specific year. If I do them right, they will change my life permanently. Of course a goal needs a timetable, so knowing how hectic my Decembers are, I’ll say these should be done 10.5 months from now.

  1. Create More
    I consume a lot of content. Some things will remain strictly consumable for me. Music, television and movies are not something I’m willing to create. Books, articles, blog posts and other forms of writing are something I should create more of. I consume them greatly. I probably read too many blog posts, but I’m ok with that. Code is something I should definitely be creating. Sending patches to my favorite open source projects or trying to port them to Win32 are my goals.
  2. Watch Less Television
    I’m a Law and Order addict. I never watched it when it was new. I still don’t watch the original. Criminal Intent and Special Victims Unit are always on, and I’ve never seen them. It’s 10+ seasons of episodes that always waste my time. I’ll probably watch less sports, too.
  3. Trash ten pounds
    I lost the twenty pounds I gained while my wife was pregnant, but the twenty I gained throughout my 20s are still with me. I’d like to keep some of the weight as muscle and shed the fat, but I think I’d be better at 165 pounds instead of 175 pounds.
  4. Get Out of Debt
    I hate debt. I’ve got the debt-free bug. I want out. I want to owe no one. My family religion even calls debt slavery. "The rich rule over the poor, and the borrower is slave to the lender." (Proverbs 22:7)
    Last year my wife and I paid off the last of our non-housing debt. The ambitious goal this year is to pay off our mortgage. It’s a huge goal, but it is one that I am passionate about. Think your house is an investment? Maybe yours is. Mine will never be. Think a house is a good investment? Well, suppose you bought in 2003. How is that “good investment” performing for you?

I’m not calling anybody else out. If you read this, consider yourself called. What areas of your life are you currently working to improve?