Archive for the ‘.NET’ Category

Language Integrated Query (LINQ)

Wednesday, September 26th, 2007

Last night I attended a San Diego .NET Users Group meeting where the topic was LINQ. The presentation was done by Julie Lerman (http://www.thedatafarm.com/blog and http://blogs.devsource.com/devlife).

Since I have an interest in ORM (see here) I've done some reading on LINQ in the past. It's always amazing to me how much more you seem to learn from a presentation, especially when the speaker is well organized, knowledgeable, and provides an engaging delivery. Great job Julie!

LINQ is very impressive. The new .NET Framework Orcas (VS 2008) language features include:

  • Automatic Properties
  • Extension Methods
  • Lambda Expressions
  • Query Syntax
  • Anonymous Types

These not only provide a foundation for ORM work, but are also powerful .NET language and tool additions for just about any programming task.
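As a quick illustration (my own sketch, not from the presentation; the Person class and sample data are invented), all five of those features can appear together in a few lines of C# 3.0:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Person
{
    // Automatic properties: the compiler generates the backing fields.
    public string Name { get; set; }
    public int Age { get; set; }
}

static class PersonExtensions
{
    // Extension method: adds IsAdult() to Person without modifying the class.
    public static bool IsAdult(this Person p) { return p.Age >= 18; }
}

class Demo
{
    static void Main()
    {
        var people = new List<Person>
        {
            new Person { Name = "Ann", Age = 34 },
            new Person { Name = "Bob", Age = 12 }
        };

        // Query syntax, calling the extension method, projecting into an
        // anonymous type.
        var adults = from p in people
                     where p.IsAdult()
                     select new { p.Name };

        // Lambda expression with the method-call syntax.
        int adultCount = people.Count(p => p.Age >= 18);

        foreach (var a in adults)
            Console.WriteLine(a.Name);
    }
}
```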

A .NET code sample: Real-time data streaming and control.

Monday, September 24th, 2007

This is a follow-up to the Developing a real-time data flow and control model with WCF post. My original plan was to write a full-fledged article on this. I've gotten some requests for the code, but it does not appear that I'm going to have time to complete the article in the near future. So I thought I'd just give a brief description here of what I've done so far and provide the code as is.

Please Note: The description provided is very brief and only meant as an overview. None of the implementation details are included here. It's not a very complicated project. If you have some VS2005 development experience and are willing to dig into the code, you shouldn't have a problem figuring it all out. Working through this code would provide a good first tutorial on developing with WCF. Some set-up is required (service installation) and is described below.

Motivation:

I originally conceived of this project because of some questions I'd heard from real-time C/C++ developers. They wanted to know about migrating from MFC/VC++ to .NET managed code. The primary concern was about the use of legacy device level code and how to manage future mixed development projects.

So my first thought was to demonstrate how straightforward it is to incorporate COM components into managed code with .NET wrappers. There are already many good articles on integrating old Win32 projects into .NET, e.g. Do you COM? Dealing with Legacy Projects. This project is a concrete example of how that can be done.

It also illustrates a model of the type of real-time data streaming and control typically required by a physiological monitor.

To extend that model, I wanted to show how WCF could be used as a network transport for that same data stream. Hence the previous post. The addition of a WCF client application that provided a real-time display of the data stream was only logical.

There are a number of directions that I had planned on taking this project, but that will have to wait for another day. I'm sure that you'll come up with your own ideas along with numerous improvements.

The Code:

The download (below) is a Visual Studio 2005 solution, called RealTimeTemplate, with 6 projects (one is C++, all the rest are C#). Here is a diagram of the projects and their relationship. The horizontal arrows show the data flow as described above.

Real-time Template Components

The projects are:

  • SineWaveGenerator: This is a C++ COM component that generates buffers of multi-channel sine waves.
  • SineWaveGenerator.Test: The NUnit test functions for SineWaveGenerator.
  • SineWaveWCFLib: This is the WCF server component.
  • SineWaveWCFServer: This is the Windows service that hosts the WCF service (SineWaveWCFLib).
  • SineWaveWCFService.Test: The NUnit test functions for SineWaveWCFService.
  • RealTimeDisplayWinForm: This is the Windows Form class, and WCF client, that controls and displays the sine wave data provided by the service. The graphical display is done using the ZedGraph library.

Here is what the Windows Form looks like when the application is running.

Real-time Template WinForm

Service Installation:

In order to run the sine wave display application, you'll first have to install and start the SineWaveWCFService.

  1. Build the solution in Debug configuration.
  2. Execute the InstallSineWaveWCFService.bat in SineWaveWCFService\bin\Debug. The service can be un-installed with the UninstallSineWaveWCFService.bat script.
  3. Run the Windows Services Manager (services.msc) and Start SineWaveWCFService.
  4. Run the RealTimeDisplayWinForm project. Use the Add button to add sine wave displays.

The source code can be downloaded here:

RealTimeTemplate_20070923.zip (210K)

Enjoy!

More on using a Named Mutex in Vista

Friday, September 14th, 2007

This is a follow-up to the Kernel Object Namespace and Vista post. Those previous findings were made using an administrative user in Vista. When I tried creating a 'Session\AppName' Mutex as a non-administrative user though, the application hung!

Just to be clear, here is how (simplified) I'm creating the Mutex:
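The snippet the post refers to didn't survive in this archive, so here is a minimal reconstruction of the simplified Mutex creation being described; the name 'AppName' and the class layout are placeholders, not the post's actual code:

```csharp
// Minimal sketch, reconstructed from the description (the original
// snippet is missing from this archive). "AppName" is a placeholder.
using System.Threading;

class SingleInstance
{
    static Mutex appMutex;

    static bool CreateAppMutex()
    {
        bool createdNew;
        // The hang described below occurs inside this constructor call
        // when a non-administrative user uses the 'Session\' prefix.
        appMutex = new Mutex(false, @"Session\AppName", out createdNew);
        // createdNew == false means another instance already owns the name.
        return createdNew;
    }
}
```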

The hang occurs when the Mutex is created. By hang I mean that the process just spins its wheels, sucking 50-60% of the CPU, and continues until it's killed. Based on WinDbg analysis, it's either stuck in the underlying Win32 CreateMutex() call or CreateMutex() is being called repeatedly. It's probably the latter.

When 'Local\' or 'Global\' are used, the Mutex is created fine! As noted before, 'Local\' doesn't work for other reasons so I'm stuck using the 'Global\' namespace. Go figure?

Kernel Object Namespace and Vista

Monday, August 20th, 2007

Just a quick development note:

According to Kernel Object Namespaces, objects can have one of three predefined prefixes -- 'Local\', 'Global\', or 'Session\'. For Win2K/XP I've always used the 'Local\' prefix, which works fine. My primary use is with a Mutex to determine that a single instance of an application is running (like here). I also use the Mutex from a system service to discover whether a GUI application is available for messaging. When trying to run the same code on Vista, I found that the 'Local\' namespace does not work when Mutex.OpenExisting() is called from a system service owned by a different user (from the same user, it works fine). So it appears that the 'Local\' prefix in Vista behaves differently for the client session namespace than it does in Win2K/XP.

I searched around for a solution, but was unable to find a definitive answer. I did find a post about the Private Object Namespace which alludes to Vista kernel changes, but that's all. Here's what I determined empirically:

  Namespace    Win2K/XP    Vista
  Local\       YES         NO
  Global\      YES         YES
  Session\     NO          YES

The NO entries in the table mean that the namespace did not work. So, it appears that in order to support all three Windows versions I'd have to use the 'Global\' namespace. This is not a good solution. Unless I find another way, I'll have to determine the OS version and select the appropriate namespace at runtime ('Session\' for Vista, 'Local\' for Win2K/XP).
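A minimal sketch of that runtime selection, assuming Vista is detected by its NT version number (6.0); this is my illustration, not code from the post:

```csharp
// Sketch of the runtime workaround described above: pick the namespace
// prefix based on OS version (Vista is NT 6.0; Win2K/XP are NT 5.x).
using System;

static class MutexNamespace
{
    public static string Prefix
    {
        get
        {
            Version os = Environment.OSVersion.Version;
            return os.Major >= 6 ? @"Session\" : @"Local\";
        }
    }
}

// Usage (name is a placeholder):
//   Mutex m = new Mutex(false, MutexNamespace.Prefix + "AppName", out createdNew);
```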

Developing a real-time data flow and control model with WCF

Saturday, August 11th, 2007

A Windows Communication Foundation (WCF) service is defined through its operations and data contracts. One of the major benefits of WCF is the ease with which a client can create and use these services through the automatically generated proxy classes. The service side is only half of the communications link though. Discovering the correct WCF configuration options that allow a solution to operate properly was not as easy as I thought it would be.

This post describes a specific WCF-based data control and streaming architecture. The primary goal of this service is to provide a continuous data stream (as buffers of short values) from a real-time data acquisition source. The client would then be able to display the data as it became available or store the data when directed by the user. In addition, the service allows the client to both get status information (Getters) and control certain attributes (Setters) of the underlying data source. This is illustrated here:

Real-time architecture

The DataBufferEvent is defined as a one-way callback and continuously delivers data to the client. The IsOneWay property is valid for any operation that does not have a return value and improves network performance by not requiring a return message. The Getters and Setters [for you Java folks, this has nothing to do with JavaBeans] can be called at any time. Changing a data source attribute with a Setter will probably affect the data stream, but it is the responsibility of the data source to ensure data integrity. The underlying transport binding must support duplex operation (e.g. wsDualHttpBinding or netTcpBinding) in order for this scenario to work.

Here is what an example (a sine wave generator) service interface looks like:
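The original listing didn't survive in this archive. As a stand-in, here is a hedged sketch of what such a duplex contract could look like; all interface, operation, and parameter names are assumptions, not the post's actual code:

```csharp
using System.ServiceModel;

// Sketch only: names are placeholders for the post's missing listing.
[ServiceContract(SessionMode = SessionMode.Required,
                 CallbackContract = typeof(ISineWaveCallback))]
public interface ISineWaveService
{
    [OperationContract]
    void StartData();

    [OperationContract]
    void StopData();

    // Getter: report a data-source attribute.
    [OperationContract]
    double GetFrequency();

    // Setter: change a data-source attribute.
    [OperationContract]
    void SetFrequency(double hz);
}

public interface ISineWaveCallback
{
    // One-way callback: no reply message, so the service never blocks
    // waiting on the client while streaming buffers.
    [OperationContract(IsOneWay = true)]
    void DataBufferEvent(short[] buffer);
}
```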

The service class is implemented as follows:
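The implementation listing is also missing here, so this is a minimal sketch of a PerSession service class for a contract of the shape just described; ISineWaveService, ISineWaveCallback, and all member names are assumed placeholders:

```csharp
using System.ServiceModel;

// Sketch only: ISineWaveService / ISineWaveCallback stand in for the
// post's (missing) contract listing.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.PerSession)]
public class SineWaveService : ISineWaveService
{
    ISineWaveCallback callback;
    double frequency = 1.0;

    public void StartData()
    {
        // Each session captures its own callback channel; the shared
        // data source would fan buffers out to every active session.
        callback = OperationContext.Current
                       .GetCallbackChannel<ISineWaveCallback>();
    }

    public void StopData()
    {
        callback = null;   // detach this session from the data source
    }

    public double GetFrequency() { return frequency; }

    public void SetFrequency(double hz)
    {
        // The data source is responsible for keeping the stream
        // consistent when an attribute changes mid-stream.
        frequency = hz;
    }

    // Called from the data-acquisition side as buffers become available.
    void OnBuffer(short[] buffer)
    {
        ISineWaveCallback cb = callback;
        if (cb != null)
            cb.DataBufferEvent(buffer);   // one-way, non-blocking
    }
}
```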

The InstanceContextMode.PerSession mode is appropriate for this type of interface. Even though there is probably only a single data source, you still want to allow multiple service session instances to provide data simultaneously to different clients. The data source would be responsible for managing the multiple data requesters.

With the service side complete, all the client needs to do is create the proxy classes (with either Visual Studio or Svcutil), set up the DataBufferEvent callback, and call the appropriate control functions. My first client was a WinForm application to display the data stream. The problem I ran into was that even though the data callbacks worked properly, the control functions would often hang the application when they were invoked.

It took quite a bit of searching around before I found the solution, which is here. You can read the details about the SynchronizationContext issues, but this caused me to spin my wheels for several days. The upside is that in trying to diagnose the problem I learned how to use the Service Trace Viewer Tool (SvcTraceViewer.exe) and the Configuration Editor Tool (SvcConfigEditor.exe, which is in the VS2005 Tools menu).

So after adding the appropriate CallbackBehavior attributes, here are the important parts of the client that allow this WCF model to operate reliably:
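That client-side listing is missing from this archive as well. Here is a sketch of the key piece being described -- turning off the callback's use of the WinForms SynchronizationContext; the callback class, proxy names, and display update are my placeholders:

```csharp
using System.ServiceModel;
using System.Windows.Forms;

// Sketch only: ISineWaveServiceCallback stands in for the proxy-generated
// callback interface.
[CallbackBehavior(UseSynchronizationContext = false)]
public class SineWaveCallback : ISineWaveServiceCallback
{
    readonly Form owner;

    public SineWaveCallback(Form owner) { this.owner = owner; }

    public void DataBufferEvent(short[] buffer)
    {
        // With UseSynchronizationContext = false, callbacks arrive on a
        // worker thread instead of being posted to the UI message loop,
        // so control calls made from the UI thread can no longer
        // deadlock against a pending callback. The price: we must
        // marshal to the UI thread ourselves before touching controls.
        owner.BeginInvoke((MethodInvoker)delegate
        {
            // update the ZedGraph display with 'buffer' here
        });
    }
}

// Wiring up the duplex proxy (type names assumed):
//   InstanceContext ctx = new InstanceContext(new SineWaveCallback(this));
//   SineWaveServiceClient proxy = new SineWaveServiceClient(ctx);
```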

The first take-away here is that WCF is a complex beast. The second is that even though it's easy to create proxy classes from a WCF service, you have to understand and take into account both sides of the communications link. It seems so obvious now!

That's it. This WCF component is just part of a larger project that I'm planning on submitting as an article (with source code) on CodeProject, someday. If you'd like to get a copy of the source before that, just let me know and I'll send you what I currently have.

Update: Proof of my first take-away: Callbacks, ConcurrencyMode and Windows Clients. Thanks Michele! 🙂

SolutionZipper Updated

Friday, August 3rd, 2007

I've updated my SolutionZipper source and installer to version 1.3 on CodeProject. Here are the changes:

  • Fixed a bug that was causing SZ to fail during the "Handle external projects" phase.
  • Ignore VC++ Intellisense Database files, i.e. *.ncb.
  • Ignore hidden files and folders.

I originally wrote this last year as simply a convenience function. Even though I use a source code control system (Subversion) at work, I still need a quick way to snapshot and backup my personal projects at home.

I recently started a solution that included a C++ project and noticed some problems. First was that there was no need to backup the VC++ Intellisense database file. The second problem might be related to one of these:

  • Microsoft Visual Studio 2005 Professional Edition - ENU Service Pack 1 (KB926601)
  • Visual Studio 2005 extensions for .NET Framework 3.0 (WCF & WPF), November 2006 CTP
  • Microsoft ASP.NET 2.0 AJAX Extensions 1.0

I don't know which one caused the problem, but after one of these was installed VS2005 had project list items that were not file system based -- a project called <MiscFiles>? Anyway, this caused the search for external projects to fail.

There was a request to ignore Subversion (.svn) directories. This was a good idea so I just ignore all hidden directories and files. This also means that VS Solution User Option files (.suo) are not included in the zip file.

First Look: Microsoft Health Common User Interface (CUI)

Tuesday, July 24th, 2007

My initial impression is that the current implementation of the Microsoft Health CUI (v1.0.114.000) is strong on depth and weak on breadth. Because of the limited number of components available, this software is too early in its implementation to be of much use in a real product. For example, the first thing I would need is a 'PatientGrid' component, which doesn't exist yet. This is just the first CTP, so missing features are to be expected.

Here are the component lists for WinForms and Web applications:

CUI WinForm Components CUI Web Components

The design guidance documents are the most impressive aspect of this project. Each control has its own complete document that includes sections like 'How to Use the Design Guidance' and 'How Not to Use the Design Guidance'. The higher-level terminology and accessibility guidance documents are equally comprehensive. As a software developer who has had to work from ambiguous requirements specifications, I can say that nothing is left to the imagination here. The requirements for some components (e.g. AddressLabel) are written to UK specifications, but that's to be expected since CUI is being developed there.

The Visual Studio integration is good, and the source code for the individual components appears to be well constructed and documented.

The CUI Roadmap isn't very specific, but I like the design guidance driven approach. All of the up-front design work makes me think of my previous post on Agile development. The CUI Delivery Lifecycle is described as iterative, but I doubt it's actually being developed using one of the Agile methodologies. In any case, I'll continue to watch the progress of this project and look forward to future releases. It could be my excuse to actually use (instead of just playing with) WPF someday!

Microsoft Health Common User Interface (CUI)

Monday, July 23rd, 2007

This looks like an interesting initiative. Links (originally found here):

Microsoft Health Common User Interface (CUI)

Controls and Library from CodePlex

From the CodePlex site:

The Toolkit controls developed for this release conform to the recommendations contained in the Design Guidance documents. The Toolkit is a set of .NET 2.0 controls that help Independent Software Vendors (ISVs) build safe, consistent user interfaces for healthcare applications. The Toolkit controls have been created for use in both ASP.NET AJAX applications and WinForms.NET. The web versions of the controls are based on the ASP.NET AJAX Toolkit.

I'll take some time and investigate the controls and library and how well it integrates into an existing .NET 2.0 WinForms application.

First impression: The CodePlex download is 62MB!!

I'll let you know what I find.

Selecting an ORM for a .NET project: A real-world tale.

Thursday, July 19th, 2007

This is a story that I'm sure is being written over and over again by software designers like myself who need to develop relatively complex applications in a short amount of time. The only realistic way to accomplish this is to bite off major chunks of functionality by leveraging the work of others. Reinventing the wheel, especially wheels you know hardly anything about, can't be justified. Because of that, many of the architectural decisions you have to make revolve around figuring out how to get all of those 'frameworks' and libraries and GUI components to work together. But I digress...

This post simply tells the tale of my journey through picking a framework for a project, how the landscape has changed over time, and what that means for the future. In case you don't want to read on, the ending is happy -- I think.

For any project that requires the storage and orderly retrieval of data there's pretty much no way to get around using a database. Also, if you have a requirement (like I did) that the user is able to select from a variety of SQL back-end databases then the use of some sort of Object-Relational Mapping (ORM) framework software is the only sane way to implement your application.

I'm not going to get into the details of ORM software or try to compare the features of the vast array of frameworks available. One current list of object-relational mapping software includes 34 entries for .NET (and another 27 for Java). One problem with making a selection is that many of these frameworks are constantly evolving and are thus moving targets.

On a side note, Microsoft has squarely thrown their hat into the ORM ring with the introduction of Language Integrated Query (LINQ) which is scheduled to be released with the 'Orcas' version of Visual Studio 2008 and .NET Framework 3.5 (see a summary here and a great series of articles on ScottGu's Blog). LINQ won't be released until next year so it couldn't even be considered for my project.

So, the question is: If you're developing a modest .NET application, how do you go about selecting an ORM framework? I'll walk you through the selection process I used.

General filtering criteria:

  1. Non-commercial: I had no money to spend. This cut the possibilities about in half.
  2. Open source: Even though I didn't want to touch the implementation, it was critical for long-term security that I had all the source code and build tools so it could be modified if needed.
  3. SQL back-end selection: The framework needed to support at least four of the most popular SQL databases. I wanted the user to be able to select local and server-based databases that were both free (MySQL and SQLite3) and commercially available (Microsoft Access and MS-SQL Server).
  4. Ease of use: The learning curve for some frameworks is steep. I wanted something that was easy to learn and to get up and running quickly.
  5. Documentation and example code: Having a well documented API is important, but it's the example code and tutorials that teach you how to use the framework.
  6. Support: This was the most subjective piece of the puzzle. The key is to get a sense of the level of developer and community involvement in the project. This comes from looking at the release frequency and history (in particular, look at the release notes and the number and type of issues resolved) and the level of Forum and/or Wiki activity. It's pretty easy to detect a dead project. Foreshadow: Just because it's an active project now doesn't mean it won't die in the near future!

To make a long story short, I ended up evaluating two open source ORMs: NHibernate for .NET and Gentle.Net. As noted above, I did this evaluation about a year ago (Summer 2006) and things have changed since then. In any case, at the time I felt that the two were comparable for the functionality I needed, which included just simple class-to-table mapping. The two major competing factors were ease of use (#4) and support (#6). NHibernate was clearly the more active project, but the Gentle.Net developers (in particular, Morten Mertner) were responsive to questions and quickly resolved issues that came up.

I found Gentle.Net much easier to get up and running. In particular, with NHibernate you need to create a matching mapping file for every database class. This was daunting at the start, and I felt the split-file definitions would create a long-term maintenance problem. With Gentle.Net, I was able to use MyGeneration to create the .NET class, which included the database field decorators, and that was it. Also, RE: #4, the NHibernate core is a 25MB download while Gentle.Net is less than 7MB. This is not only telling of the relative complexity of the frameworks, but it also means there would be significantly more baggage to carry around with NHibernate -- and install on customer systems.
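For flavor, here is roughly what that single-file, attribute-decorated mapping style looks like in Gentle.Net; the table, columns, and usage comment are invented for illustration and may not match any real schema:

```csharp
// Sketch of Gentle.Net's attribute-based mapping: the class and its
// fields are decorated in one file, with no separate mapping document.
// Table and column names here are made up.
using Gentle.Framework;

[TableName("Patient")]
public class Patient
{
    [TableColumn("PatientId"), PrimaryKey(AutoGenerated = true)]
    public int Id;

    [TableColumn("LastName")]
    public string LastName;

    [TableColumn("BirthDate")]
    public System.DateTime BirthDate;
}

// Usage (assumed): the Broker class then handles persistence, e.g.
//   IList patients = Broker.RetrieveList(typeof(Patient));
```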

Based on that analysis it shouldn't be a surprise that I picked Gentle.Net. One year later I'm still happy with that decision. Gentle.Net continues to meet my needs and will for the foreseeable future.

OK, I'm happy, so where's the problem? The problem is that it appears development has stopped on Gentle.Net. I'm using the .NET 1.1 assemblies (1.2.9+SVN) with my .NET 2.0 application. The feature set of the next-generation Gentle.Net (2.0) looks promising, but at this point it doesn't look like it will ever be released.

In retrospect, I suppose that even with my concerns, selecting NHibernate might have been the better long-term choice. I think it was primarily the ease of the initial Gentle.Net implementation that had the biggest sway on my decision. Does this mean I made a wrong decision? You may think so, but I don't. There may be some cognitive dissonance going on here, but I did what I consider a reasonable evaluation and analysis and selected the best solution for my needs under the circumstances.

So here we are today. At some point in the future I will probably need to update or add a new SQL database that Gentle.Net doesn't currently support. I'll have to do something. Here are my options:

  • Add the needed functionality to the existing Gentle.Net code base. This might not be unreasonable to do, but it's just putting off the inevitable.
  • Re-evaluate current ORM frameworks and select one like I did last year. Déjà vu all over again... Microsoft side note: I can also wait for .NET Framework 3.5 to be released and use LINQ to SQL. My concern with LINQ is support for non-Microsoft databases. There's already some effort to develop LINQ providers (MySQL, etc. see here) but how long will it take to get reliable support for something like SQLite3 (or 4 by then) if it happens at all?
  • Justify the expense and purchase a supported product.

I think my little tale is pretty typical. Any developer that's using open source components is, or will be someday, in the exact same situation as I am. Doing good software design to mitigate these types of issues is why they pay us the big bucks! 🙂

UPDATE (6/23/08): Here's a good take on selecting an ORM: Objectively evaluating O/R Mappers (or how to make it easy to dump NHibernate).