WCF Service Host implementers beware

Jon Flanders has a great and very detailed post about implementing your own ServiceHost by extending ServiceHostBase instead of ServiceHost when using WCF.

He points out what I believe to be a serious flaw, not with the design of WCF and ServiceHost, but rather with the implementation of the two. It turns out the base class, ServiceHostBase, relies on specific behavior of its derived class, ServiceHost. This is a classic OO no-no. Jon describes it very well and calls it a “magic method”. I just call it a mistake.
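To make the extension point concrete, here is a minimal sketch of my own (not Jon's code, and simplified to the point of uselessness) of what extending ServiceHostBase looks like; CreateDescription is the abstract method a custom host must supply:

using System.Collections.Generic;
using System.ServiceModel;
using System.ServiceModel.Description;

public class MyServiceHost : ServiceHostBase {
    protected override ServiceDescription CreateDescription(
            out IDictionary<string, ContractDescription> implementedContracts) {
        // A real host reflects over the service type here. The trouble Jon
        // describes is that ServiceHostBase's initialization path quietly
        // assumes behavior that only ServiceHost provides.
        implementedContracts = new Dictionary<string, ContractDescription>();
        return new ServiceDescription();
    }
}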

It is disappointing to think that there was not a CanInheritFromServiceHostBase test case out there in WCF-land somewhere. <snarky>I guess it was too difficult to write with mstest, and they weren’t allowed to use NUnit or MbUnit</snarky> 🙂

update: A link to the post would be helpful. 🙂

Ben Scheirman hates MSTest too!

I’m not the only one!

http://feeds.feedburner.com/~r/flux88/~3/141244484/DearVSTSTestIHateYou.aspx

Ben, since MSTest‘s functionality is dwarfed by NUnit’s, I’ve found it relatively easy to have most of my tests run in both MSTest and NUnit. Of course, where I used NUnit’s more advanced functionality, I just disable the test in MSTest. Yes, it’s kind of a hack, but it works.

Each of my test files has a using section like this:

#if NUNIT
// Map the MSTest attribute names onto their NUnit equivalents.
using TestClassAttribute = NUnit.Framework.TestFixtureAttribute;
using TestMethodAttribute = NUnit.Framework.TestAttribute;
using TestInitializeAttribute = NUnit.Framework.SetUpAttribute;
using NUnit.Framework;
#else
// Map the NUnit attribute names onto their MSTest equivalents.
using TestFixtureAttribute = Microsoft.VisualStudio.TestTools.UnitTesting.TestClassAttribute;
using TestAttribute = Microsoft.VisualStudio.TestTools.UnitTesting.TestMethodAttribute;
using SetUpAttribute = Microsoft.VisualStudio.TestTools.UnitTesting.TestInitializeAttribute;
using Microsoft.VisualStudio.TestTools.UnitTesting;
#endif

Then I maintain two project files. One is an MSTest project, and only opens in the Team System edition. The other is a standard Class Library project – the typical NUnit test project. The latter defines NUNIT in its project settings. By using type aliases like this, I can use either NUnit or MSTest attributes in my test fixtures (a sketch of such a fixture follows below). Of course, any tests using NUnit’s constraint model assertions won’t work in MSTest, so I either #if-def out those tests or leave the entire fixture out of the MSTest project.
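Here is a minimal sketch (a hypothetical fixture, not from a real project) of a test file that compiles under both frameworks with those aliases in place:

[TestClass]  // resolves to NUnit's TestFixtureAttribute when NUNIT is defined
public class CalculatorTests {
    private int result;

    [TestInitialize]  // resolves to NUnit's SetUpAttribute when NUNIT is defined
    public void SetUp() {
        result = 2 + 2;
    }

    [TestMethod]  // resolves to NUnit's TestAttribute when NUNIT is defined
    public void TwoPlusTwoIsFour() {
        Assert.AreEqual(4, result);  // classic-model assert: works in both
    }

#if NUNIT
    [Test]  // NUnit-only: the constraint model has no MSTest equivalent
    public void TwoPlusTwoIsExactlyFour() {
        Assert.That(result, Is.EqualTo(4));
    }
#endif
}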

Wunda’s Changing Community

Wendy Friedlander, aka “wunda”, has an awesome post over at her site. She says:

The development environment of many institutions is molded to the vision of software provided by people with no insight.

Pushing irrelevant solutions and deadlines.

Traveling a rote technology line.

Where does that leave the rest of us?

We go with the flow.

We find a place that embraces agile.

We work to change our environment.

It takes a great deal of energy to foster change in your environment.

It takes time, effort, trust and many other qualities that make going with the flow look better and better as time goes by.

You are not alone.

Your company software policies were not created by your company.

They are the practices of the community.

A community that conforms to business and technology providers’ desires and not the interest of creating the best solutions.

I love the sentiment, and I love that she is talking about ALT.NET.

Visual Studio 2008 beta 2 high-dpi dialog oops.

Every development shop that ships software should try its software on a 1920×1200 15.4″ display with the DPI set to its real-world value of 148. Ideally, things would be tested at all DPI settings. A lot of stuff doesn’t display properly.

[Screenshot: the “Debugging Not Enabled” dialog, mangled at high DPI]

Oops. This is the default dialog any new ASP.NET Web Application Project will get in Visual Studio 2008 beta 2. I think I am supposed to click OK. 🙂

It isn’t about the tools. It is about what the tools get you.

“It’s not the tools, it’s the solution.”

This is the phrase that David Laribee used in his ALT.NET post back on April 10th.

I agree, it isn’t the tools. I agree, it is the maintainability of the solution. After “does it work?”, maintainability is the number-one concern when I write software.

That said, sometimes “the better way” IS about the tools. I won’t go so far as some others, or as far as I did previously, BUT (see, it is upper case; that means it is a BIG BUT) there are a couple of items on the list that deserve to be upper case, or bold, or H1, or Heading 1, or screamed at the top of your lungs while standing on the highest table in the auditorium.

I won’t call it HOT or NOT, I’ll call it Acceptable and Inexcusable.

There are two often-used tools which, in my opinion, have absolutely no place on a well-meaning developer’s project. I’ve given it a bit of thought and I cannot come up with any use case where these tools are the right solution. The answer “they are good enough for our team” is heard by my ears as “we aren’t interested in doing a better job.”

VSS. Yes, Visual SourceSafe. Lock-Modify-Unlock? What??? At the Visual Studio 2005 launch event I nearly fell out of my chair when the Team System demo showed off its Copy-Modify-Merge system. Not because I was impressed. No, I was dumbfounded that the CVS I had been using for 10 years and the SVN I’d been using for a few were so far ahead of what many development groups were using. What wasn’t made clear was that VSS 2005, new version number or not, did NOT change the versioning model.

VSS is inexcusable. If your development organization is using VSS, don’t just consider switching; considering alone is not enough. DO IT. Switch to ANYTHING! The popular move seems to be Subversion, and don’t let the “open source” label fool you: you CAN buy the product and support from CollabNet. It isn’t JUST open source. You get MORE from CollabNet than you do from MS with VSS, because if your SVN crashes, they will help you recover and help find the bug that caused the crash. VSS crashes, is full of known bugs, and the official answer is to back up your repositories regularly. I’d go on here, but a quick Google search for others citing these VSS flaws shows that Jeff Atwood has already covered this ground. It has been a year, so I guess it is time someone else mentioned it.

I will also mention that you might consider changing your SCM paradigm and looking at one of the distributed tools now available. Bazaar runs great on Windows, Git runs under Cygwin, and Mercurial has a Windows installer.

MSTest is inexcusable. Yes, the Visual Studio integration is very pretty, but after using NUnit, trying to move to MSTest is painful! It hurts. It hurts bad. When you pile on NUnit’s latest constraint-based testing model, there are MANY things that NUnit can test in only one line of code that MSTest would leave you writing many, many lines for.
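A quick sketch of the kind of thing I mean (tolerance-aware array comparison; the variable names are illustrative):

// NUnit 2.4's constraint model: one line.
Assert.That(actual, Is.EqualTo(expected).Within(0.0001));

// MSTest has no tolerance-aware collection assert, so you roll it yourself.
Assert.AreEqual(expected.Length, actual.Length);
for (int i = 0; i < expected.Length; i++) {
    Assert.AreEqual(expected[i], actual[i], 0.0001);  // per-element delta overload
}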

I won’t rant on MSTest nearly as long as I did VSS because I haven’t suffered under its tyranny for nearly as long. Maybe (although I doubt it), with a little more time under MSTest and I’ll be able to bear it, but I stand by my original assertion, it is the wrong choice.

LinqDataSource Select N+1

David Hayden over at CodeBetter has a post on performance and LinqDataSource.

The problem that David Hayden describes so well is known as Select N+1. Those of us using NHibernate have been dealing with this since we started using our favorite ORM. There has been much written on this problem.

It is my belief that this is going to be the biggest problem developers face when using any of the LINQ-to-database technologies. LINQ to Entities, LINQ to DataSets, or LINQ to SQL: it doesn’t matter which flavor you choose, you will have to understand what is going on under the hood a little. David is using the design-time features of the LinqDataSource control, and he solves the N+1 problem by limiting his functionality. The sketch below shows the shape of the problem.
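To make it concrete, here is a minimal LINQ to SQL sketch of the N+1 pattern and the eager-loading fix (Northwind-style names, purely illustrative):

// One query for the orders, plus one lazy-loaded query PER order
// for each Customer touched: that is Select N+1.
var db = new NorthwindDataContext();
foreach (var order in db.Orders) {
    Console.WriteLine(order.Customer.CompanyName);
}

// The fix: a fresh context that declares the shape up front, so each
// Order's Customer comes back in the same round trip.
var db2 = new NorthwindDataContext();
var options = new System.Data.Linq.DataLoadOptions();
options.LoadWith<Order>(o => o.Customer);
db2.LoadOptions = options;
foreach (var order in db2.Orders) {
    Console.WriteLine(order.Customer.CompanyName);  // no extra queries now
}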

Not that it solves any of these problems, but MonoRail just keeps looking better. MR doesn’t have a LinqDataSource; enabling edits and deletes is no harder, it is just handled at a different layer in your typical MonoRail with ActiveRecord application. These are different concerns, and MonoRail with ActiveRecord does the right thing by treating them separately.

Thank you Castle Project!

NUnit tolerance on double arrays

Yesterday I wanted to compare two arrays of doubles in a unit test to make sure I hadn’t changed the results of some calculations while refactoring.

var expected = new double[] { 1.00000000000000006, 2.00000000000000005,
                              3.00000000000000008, 4.00000000000000007,
                              5.0000000000000005 };
Assert.AreEqual(expected, process.GetResults());

It turns out that when I printed my expected results to get these constants, my print statement rounded them to the 15 decimal places displayed. Rather than go print longer constants to use in my test, I wanted the ability to test within a given tolerance.
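(As an aside, the round-trip format specifier would have sidestepped the truncation entirely; a quick sketch:)

double d = 0.1 + 0.2;
Console.WriteLine(d);                // "0.3" (default formatting shows at most 15 significant digits)
Console.WriteLine(d.ToString("R"));  // "0.30000000000000004" (round-trippable precision)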

It wasn’t until today, when I decided to dive in and implement the needed methods myself, that I browsed the NUnit source and discovered this functionality already exists. The key is using the correct method overload: the call above doesn’t match any tolerance-aware overload. Indeed, any arrays passed end up binding to the Assert.AreEqual(object, object, …) style overloads.

The answer is simple: use NUnit’s Constraint Model assertions.


Assert.That(process.GetResults(), Is.EqualTo(expected).Within(0.0000000001)); // actual value first, then the expected-value constraint

Works great! Thanks, NUnit!

I wish it were obsolete

Sometimes when dealing with legacy code (as in code with no unit tests, which cannot be refactored easily), you run into cases that you wish were obsolete. The .NET Framework has a nice attribute called ObsoleteAttribute. You can put it on any member of your types, or even on entire types, and the compiler will warn you whenever the “obsolete” member is used. It’s like a usage finder for people who don’t have CodeRush’s awesome Find All References feature.

Well, I recently ran across a case where I had to remove an ObsoleteAttribute: while I really wanted the member to go away, removing it wasn’t so easy. I’m just not quite ready to put in the work to refactor to the point where I can call this property obsolete. So I’m proposing an IWishItWereObsoleteAttribute.

public class IWishItWereObsoleteAttribute : ObsoleteAttribute {
    public IWishItWereObsoleteAttribute(string reason) : base(reason) {
    }
}

But alas, my hopes and dreams are crushed by the creators of the framework:

Error 1 'IWishItWereObsoleteAttribute': cannot derive from sealed type 'System.ObsoleteAttribute'

Major bummer.
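The best I can do, I suppose, is an independent attribute that doesn’t derive from the sealed type; a sketch (you lose the compiler warning, which was half the point, but the intent is at least recorded):

using System;

[AttributeUsage(AttributeTargets.All, Inherited = false)]
public sealed class WishItWereObsoleteAttribute : Attribute {
    private readonly string reason;

    public WishItWereObsoleteAttribute(string reason) {
        this.reason = reason;
    }

    // No compiler warning on usage like ObsoleteAttribute gives you, but
    // tools and code reviews can still surface the reason.
    public string Reason {
        get { return reason; }
    }
}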

C# has duck typing

C# 2.0 has duck typing.

No really, I swear!

Krzysztof Cwalina has a post on Duck Notation which I find very interesting.

Normally in .NET, and thus in C#, to iterate you would use IEnumerable. You can call GetEnumerator yourself and call MoveNext on the IEnumerator it returns. Or, 99 times out of 100, you use foreach.

The foreach statement in C# doesn’t actually require IEnumerable. I was so curious I just HAD to try it.

using System.Collections.Generic;

public class Program {
    public static void Main() {
        List<string> list = new List<string>();
        // C neither declares nor implements IEnumerable, yet foreach compiles.
        foreach (string item in new C()) {
            list.Add(item);
        }
    }
}
public class C {
    // foreach binds to GetEnumerator by pattern, not through an interface.
    public E GetEnumerator() {
        return new E();
    }
}
public class E {
    int x = 0;
    string[] values = { "hello", "world", "foo", "bar", "baz" };
    public bool MoveNext() {
        return x < values.Length;
    }
    public string Current {
        // Advances as a side effect; foreach calls MoveNext, then Current.
        get { return values[x++]; }
    }
}
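For reference, the compiler expands that foreach into roughly this pattern-based loop (my sketch of the expansion, not decompiled output):

E e = new C().GetEnumerator();
while (e.MoveNext()) {
    string item = e.Current;  // statically typed: Current returns string
    list.Add(item);
}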

Neither IEnumerable nor IEnumerator is used here; in fact, IEnumerator would require the Reset method to be implemented, and it is not present here. (Even though Reset is only there for COM interop and often just throws NotSupportedException, it is at least part of the interface.)

Also interesting is that there IS some type checking here. If you make the Current property of type object, you get no compile-time checking: the compiler inserts a cast to the foreach variable’s type, so a mismatch surfaces at runtime as an InvalidCastException instead of a compile error.

Is this useful? I don’t know. Maybe in edge cases. I still like IEnumerable<T>, and with the extension methods coming in .NET 3.5, I REALLY want IEnumerable<T> to be in a lot of places it might not be if the above were used extensively.

The biggest problem with C# 3.0 is that it isn’t out yet. The second biggest problem with C# 3.0 is that its language features enable all the awesome things in .NET 3.5, which isn’t out yet either. The third biggest problem with C# 3.0 is that all those awesome .NET 3.5 things need to be distributed, so the rollout of Framework 3.5 to servers and desktops might discourage your average organization from using an awesome piece of technology.

CI Factory with Subversion in a custom path

The CI Factory docs are pretty good, but being a first-timer (a CI Factory noob!), I struggled a bit with this.

I found this FAQ entry, which says to modify files in an already-installed CI Factory project directory. It was easy enough to find CCNetServer.bat in my Default directory that came in the CI Factory zip file, but the run.bat script was still failing.

It turns out that NAnt needs the path to your svn.exe (well, of course it does!), and that CI Factory defaults to finding everything in c:\Program Files\Subversion\bin. Well, I don’t keep my SVN there. I keep it in c:\devtools\svn-win32-1.4.3\bin. Just add a PATH statement to the top of your run.bat script.

I gave my run.bat a nice little header that looks like this:

PATH=%PATH%;c:\devtools\svn-win32-1.4.3\bin
cd "c:\devtools\CI Factory"

The cd is there because CI Factory’s run.bat (the main install script) likes to set up IIS stuff. That is neato in a foolish Windows environment where you run as admin, but not so great when you aren’t running as admin and thus can’t modify the IIS configuration. Not to mention I don’t even have IIS installed!

It was at this point that I browsed my SVN tree again and noticed that CI Factory ACTUALLY WRITES BACK TO MY SCM!!! This is completely unacceptable, and what is worse is that I didn’t read it ANYWHERE in the Installation or Introduction documentation. This is the kind of behavior that deserves HUGE, all-upper-case, red warning text!

So much for CI Factory. I guess it’s CC.NET, configured by hand, for me.