I don’t get it. <strong>Anatoly</strong> discusses how he (along with other people he quotes) doesn’t use the debugger in C# or any other managed code, instead letting the runtime catch the exception and “deal” with it.
Now, I don’t know these people, and they may well be correct in the context of the applications they write, but I don’t agree with their viewpoint. Sure, there are applications that don’t warrant the use of a debugger (like HelloWorld *grin*), but most of the applications I design and work on are fairly complex, and the debugger is an invaluable tool! Exceptions are very expensive (as discussed <strong>here</strong>) and should be used wisely! Of all the features of Whidbey, the one I am most excited about is the huge improvement in the debugger, including the ability to extend it with a GUI to inspect your complex data types. The point also holds for any managed language, not just .NET. What do you think?
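To illustrate why “exceptions are expensive” matters in any managed runtime, here is a minimal sketch (in Java, since the point applies beyond .NET; the helper names are mine, not from the original post). Constructing an exception captures a stack trace, so using exceptions as routine control flow for expected failures costs far more than a cheap up-front check:

```java
public class ParseDemo {
    // Anti-pattern: using an exception as control flow. Every bad input
    // pays the cost of constructing a NumberFormatException and
    // capturing its stack trace in the managed runtime.
    static Integer parseViaException(String s) {
        try {
            return Integer.parseInt(s);
        } catch (NumberFormatException e) {
            return null; // expected failure handled via exception -- costly
        }
    }

    // Preferred for *expected* failures: validate cheaply first, and
    // reserve exceptions for genuinely exceptional conditions.
    // (Sketch only: the regex check ignores overflow edge cases.)
    static Integer parseViaCheck(String s) {
        if (s == null || !s.matches("-?\\d+")) {
            return null;
        }
        return Integer.parseInt(s);
    }

    public static void main(String[] args) {
        System.out.println(parseViaException("42"));   // 42
        System.out.println(parseViaException("oops")); // null
        System.out.println(parseViaCheck("42"));       // 42
        System.out.println(parseViaCheck("oops"));     // null
    }
}
```

Both paths return the same results; the difference is that the second one never pays exception-construction cost on the common “bad input” path, which is the kind of judicious use the post argues for.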