Alan wrote about profiling MonoTorrent.
He has some specific cases where GC is causing him issues, and he gives some examples in the form of “I want to GC here”. I hate to be a naysayer, but… well… NAY! 🙂 I’m a big fan of letting the GC do its job. If you think you can outsmart the GC, you are wrong. If you think you need to, then there is something wrong with your model. I say: don’t call GC.Collect().
Now in Alan’s first example, ignoring memoization as perhaps a better approach, if the issue is that Example() is going to be called a large number of times, allocating many MyObject instances, then perhaps an advanced C# feature is in order. C# does allow for stack-based allocation. It requires the code to be marked unsafe, but it is likely that this use case is EXACTLY why the feature is included in the language. This isn’t Java we’re dealing with here! (small jab-hehe)
public unsafe static void Example() {
    // Allocate one MyObject on the stack and work through the pointer,
    // so the stack-allocated memory is what actually gets written
    // (copying it into a local with *pa would defeat the point).
    MyObject* pa = stackalloc MyObject[1];
    pa->x = 15;
    pa->DoACalculation();
}
Of course, this requires that MyObject be a value type and not a reference type (a struct, not a class). And in my testing on Mono, I could not get this to perform better than simply using new with a struct instead of a class. So maybe refactoring a class to a struct would help performance; it may be a good first step to see if that helps before going unsafe.
I must admit that I’m not sure I’m benchmarking this properly. I’m really just using this program. If anyone can shed some light on best practice here, I would appreciate it.
using System;

public class StackAllocExample {
    public static void Main() {
        Console.WriteLine("stackalloc");
        DateTime start = DateTime.Now;
        Console.WriteLine(start);
        for (int i = 0; i < 100000000; i++) {
            Example();
        }
        DateTime stop = DateTime.Now;
        Console.WriteLine(stop);
        Console.WriteLine(stop - start);

        Console.WriteLine("heap alloc class");
        start = DateTime.Now;
        Console.WriteLine(start);
        for (int i = 0; i < 100000000; i++) {
            SafeExample();
        }
        GC.Collect();
        stop = DateTime.Now;
        Console.WriteLine(stop);
        Console.WriteLine(stop - start);

        Console.WriteLine("heap alloc struct");
        start = DateTime.Now;
        Console.WriteLine(start);
        for (int i = 0; i < 100000000; i++) {
            SafeStructExample();
        }
        GC.Collect();
        stop = DateTime.Now;
        Console.WriteLine(stop);
        Console.WriteLine(stop - start);
    }

    public unsafe static void Example() {
        // One MyObject on the stack; work through the pointer so the
        // stack-allocated memory is what actually gets written.
        MyObject* pa = stackalloc MyObject[1];
        pa->x = 15;
        pa->DoACalculation();
    }

    public static void SafeExample() {
        MyRObject a = new MyRObject(); // reference type: heap allocation
        a.x = 15;
        a.DoACalculation();
    }

    public static void SafeStructExample() {
        MyObject a = new MyObject(); // value type: no heap allocation
        a.x = 15;
        a.DoACalculation();
    }
}
struct MyObject {
    public int x;
    public int ResultFromCalculation;
    public void DoACalculation() {
        ResultFromCalculation = x + 1;
    }
}

// The reference-type twin of MyObject, so SafeExample() really does
// allocate on the heap (declared as a struct it would be identical
// to MyObject and measure the same thing twice).
class MyRObject {
    public int x;
    public int ResultFromCalculation;
    public void DoACalculation() {
        ResultFromCalculation = x + 1;
    }
}
There are one or two flaws in your argument there (though bear in mind I agree 100% that GC.Collect() should rarely, if ever, be called).
Firstly, I think the original complaint was that the biggest allocations came from methods internal to the framework, so there’s no chance of changing those methods to decrease allocations. With sockets, you pretty much have to use async IO for high-performance IO involving a lot of active connections. It lets you use IOCP under Windows, which is a pretty damn good reason not to use the classic Socket.Select way of doing things.
Secondly, stackalloc can’t be used on classes, as you correctly pointed out; it can only be used on primitive types *and* on structs. Now, you say you noticed no difference between stackalloc’ing a single struct and just declaring one. That’s right: there is no difference. Structs are allocated on the stack, not the heap; classes are allocated on the heap. However, an *array* of structs is allocated on the heap (all arrays derive from System.Array). If you want to see the benefit of stackalloc, change the code above to MyObject *pa = stackalloc MyObject[100]; and compare it against MyObject[] pa = new MyObject[100]; with each version called 100,000 times.
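A sketch of the comparison suggested above might look like the following (the helper names, iteration counts, and per-element work are mine, purely illustrative). One wrinkle: stackalloc’d memory is only reclaimed when its method returns, so the stackalloc goes in a helper called once per iteration rather than directly inside the loop, or the stack would just keep growing. Compile with -unsafe:

```csharp
using System;
using System.Diagnostics;

struct MyObject {
    public int x;
    public int ResultFromCalculation;
    public void DoACalculation() { ResultFromCalculation = x + 1; }
}

public class ArrayAllocCompare {
    const int Iterations = 100000;

    // 100 structs on the stack; the buffer is reclaimed when this
    // helper returns, so each call reuses the same stack space.
    unsafe static int StackIteration() {
        MyObject* pa = stackalloc MyObject[100];
        for (int j = 0; j < 100; j++) { pa[j].x = j; pa[j].DoACalculation(); }
        return pa[99].ResultFromCalculation;
    }

    // 100 structs in a heap array; each call produces a System.Array
    // the GC eventually has to collect.
    static int HeapIteration() {
        MyObject[] pa = new MyObject[100];
        for (int j = 0; j < 100; j++) { pa[j].x = j; pa[j].DoACalculation(); }
        return pa[99].ResultFromCalculation;
    }

    public static int StackVersion() {
        int total = 0;
        for (int i = 0; i < Iterations; i++) total += StackIteration();
        return total;
    }

    public static int HeapVersion() {
        int total = 0;
        for (int i = 0; i < Iterations; i++) total += HeapIteration();
        return total;
    }

    public static void Main() {
        Stopwatch sw = Stopwatch.StartNew();
        int a = StackVersion();
        sw.Stop();
        Console.WriteLine("stackalloc array: {0} ms", sw.ElapsedMilliseconds);

        sw = Stopwatch.StartNew();
        int b = HeapVersion();
        sw.Stop();
        Console.WriteLine("heap array:       {0} ms", sw.ElapsedMilliseconds);
    }
}
```

Both versions compute the same totals, so any timing difference comes down to allocation and GC pressure rather than the arithmetic.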