SQL Saturday 61 Coming to DC

November 23, 2010 1 comment

If you are a regular reader then you know I spent some time this year learning SQL Server Integration Services (SSIS).  I’ve used it in several projects and have been very happy with the results, even if I have made a few mistakes along the way.  In fact, I was so enamored of what I was able to do in such a short time that I submitted to present not one, but two sessions at the upcoming SQL Saturday DC (#61), and to my surprise they were both selected!

Below is some information about the sessions; I hope to see you there!

SSIS for real: a walk through a real world project

Abstract: In this presentation I will walk through a complete real-world SSIS project that pulls data from an AS/400 and converts it for a SQL Server destination. In addition to specific AS/400 conversion issues I will also cover topics like dynamic OLEDB connections and creating an effective work flow. Along the way we’ll take a look at how I use Derived Columns, Conditional Split, Lookup, and Script components to solve everyday conversion issues.

The specific problem domain that SSIS solved for me was converting data from the EBCDIC world of the AS/400 (iSeries, System i, name-du-jour) to SQL Server.  Previously this had all been done with straight ADO.NET applications and performance was horrendous, to put it nicely.  Now with SSIS we have a solution that implements the Incremental Load pattern that is extremely performant. I’ll be discussing this project from start to finish.

Intro to C# for the SSIS Script Component

Abstract: The Script Component is an extremely powerful element in SSIS because it brings in the full capability of the .NET Framework. With first-class development tools and languages like C# you can solve problems that previously required very complex SQL or Expressions. If you’d like to learn enough C# to more effectively use the Script Component for Transformations, then this session is for you.

As a C# developer I was very happy to learn about the Script Component in SSIS 2008.  When I had to port my Package back to 2005 I discovered that it only supported Visual Basic.  I was able to make do, but it led me to the conclusion that there are probably plenty of SSIS folks who would like to know more about C#.  The session will include some C# basics and focus on things that you would find useful for data transformation, like string manipulation, data conversion, regular expression matching, and more.
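To give a flavor of the kind of C# I mean, here is a small sketch of the sort of transformations a Script Component might perform on AS/400 data. The field values and formats below are invented for illustration, not taken from the session itself:

```csharp
using System;
using System.Text.RegularExpressions;

// Trim the space padding typical of fixed-width AS/400 text fields
string rawName = "SMITH, JOHN         ";
string name = rawName.Trim();

// Convert a yyyyMMdd date string into a real DateTime
DateTime shipDate = DateTime.ParseExact("20101123", "yyyyMMdd", null);

// Validate a ZIP code with a regular expression
bool validZip = Regex.IsMatch("23220-1234", @"^\d{5}(-\d{4})?$");
```

Inside an actual Script Component this logic would live in ProcessInputRow and read from and write to the buffer columns, but the string handling itself is plain C#.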

Categories: .NET

Simple, Blendable, DI driven ViewModels

November 19, 2010 3 comments

I hope you won’t think less of me, but I need to admit something: Dependency Injection still confuses the heck out of me.  The first time I met my good friend Kevin Hazzard he was talking about Castle Windsor with someone at an after party for a Microsoft event in Washington, D.C.  I was new to the community scene and had never met Kevin or most of the people in the room.  Wanting to fit in with the crowd I made the mistake of asking “what’s Castle Windsor?” which soon led to me asking “What’s Dependency Injection?” which even sooner led to my eyes glazing over and my brain retreating to its happy place.  I spent the next hour bobbing my head up and down, pretending to keep up with the conversation, but truthfully I was completely lost.

In the years since I’ve tried to learn DI: I’ve read articles, downloaded samples, gone to presentations, and had conversations until I was blue in the face.  I’ve made progress but I am still not totally comfortable with the whole idea.  Don’t get me wrong: I believe there is value there, and I have used it successfully in a couple of projects, but there is no guarantee I’ve done it properly.  It still feels like Black Magic but I figure if I keep plugging away it will eventually sink in.

On a side note, isn’t “Dependency Injection” a terrible name?  Almost as bad as “Inversion of Control”.  Neither one of these really describes what’s happening.  It sounds like what we are doing is injecting dependencies, meaning actually inserting dependencies, which would create more dependencies!  Really, we are resolving dependencies, or injecting dependent objects, but I guess Dependent Object Resolution is a little long winded.

The ViewModelLocator Pattern

I watched the video of John Papa’s PDC10 presentation Kung Fu Silverlight: Architectural Patterns and Practices with MVVM and RIA Services with great interest.  I’m working primarily in Silverlight now, using RIA Services and of course MVVM.  I downloaded the source code and have been working through it primarily focused on the ViewModelLocator pattern.

I’ve been aware of the pattern for a while but never understood how it worked before, so I wanted to give it a try.  The approach certainly works, but it felt heavy and confusing; I’m sure that would pass as I grew more familiar with the pattern. Then again, it could just be me, but it seems there are lots of moving parts and indirection.  Let me see if I can map it out for you as I understand it:

– ServiceProviderBase: this is an abstract class that holds a static instance of the active ServiceProvider.

– A ServiceProvider is a class that inherits from ServiceProviderBase and holds a reference to a Data Service instance (the class that manages all the data interactions via RIA Services).  For this exercise there are two Service Providers, one for design time and one for run time.  It’s a little confusing, but the run time class is simply named ServiceProvider; the design time class is called DesignServiceProvider.

Let me stop here for a second: ServiceProviderBase is inherited by the two Service Provider classes, and its primary function is to decide which class to instantiate and return from a static method, which upcasts the result to the base type:

public abstract class ServiceProviderBase
{
    public virtual IPageConductor PageConductor { get; protected set; }
    public virtual IBookDataService BookDataService { get; protected set; }

    private static ServiceProviderBase _instance;
    public static ServiceProviderBase Instance
    {
        get { return _instance ?? CreateInstance(); }
    }

    static ServiceProviderBase CreateInstance()
    {
        // TODO:  Uncomment
        return _instance = DesignerProperties.IsInDesignTool ?
            (ServiceProviderBase)new DesignServiceProvider() : new ServiceProvider();

        // TODO:  Comment
        // return _instance = new ServiceProvider();
    }
}

This strikes me as convoluted, but let’s move on.

– The ViewModelLocator, the namesake of the pattern, is a class that holds a set of ViewModel properties.  These ViewModels require (or have a dependency on) the Data Services that are stored in the ServiceProvider instance returned from ServiceProviderBase.  When a ViewModel is requested from the ViewModelLocator, it uses ServiceProviderBase to retrieve the current Data Service and uses it in the constructor to create the ViewModel.  This is a form of Dependency Injection called Constructor Injection and is the most popular by far since many people use it without realizing they are using DI!

Here is the code from the sample:

public class ViewModelLocator
{
    private readonly ServiceProviderBase _sp;

    public ViewModelLocator()
    {
        _sp = ServiceProviderBase.Instance;

        // 1 VM for all places that use it. Just an option
        Book = new BookViewModel(_sp.PageConductor, _sp.BookDataService);
    }

    public BookViewModel Book { get; set; }
    //get { return new BookViewModel(_sp.PageConductor, _sp.BookDataService); }

    // 1 new instance per View
    public CheckoutViewModel Checkout
    {
        get { return new CheckoutViewModel(_sp.PageConductor, _sp.BookDataService); }
    }
}

This is more understandable, like a central repository of ViewModels for the entire application.  This approach assumes that they all use the same Data Service, but I’m totally cool with that because it is extremely likely.  What’s nice here is that the ViewModel is always the same regardless of which Service Provider is currently active.  The bad news is that it breaks some aspects of Blendability because there is no empty constructor, but more on that later.

– Now we need to access the ViewModelLocator, so create a ResourceDictionary that contains an instance declaration of the locator and add a reference to the dictionary in the App.xaml MergedDictionaries section.

<!-- Declared in Assets/ObjectResourceDictionary.xaml -->
<local:ViewModelLocator x:Key="Locator" />

<!-- Referenced from the App.xaml MergedDictionaries section -->
<ResourceDictionary Source="Assets/ObjectResourceDictionary.xaml"/>

– Finally, bind the UserControl’s DataContext to the correct ViewModel property in the ViewModelLocator.

DataContext="{Binding Book, Source={StaticResource Locator}}"

If you are confused after reading the above, don’t worry because you are not alone.  I struggled for some time to sort this out and I’m not entirely convinced I ever totally got it right.


What I came up with

I know it’s easy to play desk chair quarterback, so before I begin let me say that it took people a heck of a lot smarter than me to come up with this and it works, so if you are already successfully doing this I’m not telling you to switch. 

Trying to grok what was going on, I just kept staring at it and thinking there should be a cleaner way, so I played with it until I came up with what I’ll share in the next section.  It’s quite possible that my approach has serious problems, so please feel free to leave comments below.

I basically set out to do two things.  First, I prefer to see the DataSource listed in the Data tab in Blend.  Using the above approach you have to add the reference to the ViewModelLocator StaticResource defined in the ObjectResourceDictionary manually. The idea is that the ViewModelLocator acts as a Singleton, because only one instance is created for the entire application.  Unless you NEED to enforce a Singleton, this is not necessary in my mind.

Because it is a Static Resource, the bound DataContext object will not show up as a Data Source in the Data tab. It will, however, show the appropriate properties in the DataContext panel on the bottom of the Data tab, which may be sufficient for you.

If you want to use the Data tab in Blend, however, you can remove the UserControl DataContext reference and create a local instance of the ViewModelLocator object using the standard Create Object Data Source tool in Blend.  Then you can drag the appropriate ViewModel property and drop it on the LayoutRoot to bind it to the DataContext.

The result of the second approach is that I can see ALL the ViewModel objects, which I may or may not want.  It also requires an extra level of nesting to get to the desired ViewModel object.  To get back to the more traditional Blend approach, we need to be able to bind directly to a local instance of the ViewModel itself.  Of course, doing so breaks the ViewModelLocator pattern, but I’m no purist. 🙂


The Non-locator Locator Pattern

My solution does away with the ServiceProviderBase and ViewModelLocator classes.  My reasoning is pretty straightforward: all I really need to be able to do is change what Data Service class my ViewModel uses based on certain scenarios.  I want a dummy service for design time but the real deal to execute at run time.  And I may want to create a special service for testing scenarios.  Getting back to how I started this post, this sounds like a job for Dependency Injection!

Since I’m writing a Silverlight application I need to make sure the IoC container I choose supports it.  I’ve been using StructureMap, but it doesn’t support Silverlight yet (rumor has it V3 will add Silverlight support). Unity 2.0 for Silverlight does, however, so I’m using this project as an excuse to try it out.  Otherwise, it shouldn’t matter what framework you use.

I added the following code to the Application_Startup method in App.xaml.cs:

var iocContainer = new UnityContainer();
if (DesignerProperties.IsInDesignTool)
{
    iocContainer.RegisterType<IClientService, ClientServiceMockData>();
}
else
{
    iocContainer.RegisterType<IClientService, ClientService>();
}
Resources.Add("IocContainer", iocContainer);

This creates a UnityContainer and registers the appropriate Data Service.  I then add the container as an Application level resource so I can retrieve it from anywhere in the application.  This uses the same DesignerProperties.IsInDesignTool approach that the previous ViewModelLocator used.  Now we head to the ViewModel itself.

The ViewModel

In the ViewModel I add an IClientService property with a backing field.  In the get block, if the backing field is null, I access the IocContainer application resource and use it to resolve the IClientService instance:

private IClientService _clientService;
protected IClientService ClientService
{
    get
    {
        if (_clientService == null)
        {
            var ioc = Application.Current.Resources["IocContainer"] as UnityContainer;
            _clientService = ioc == null
                ? new ClientServiceMockData()
                : ioc.Resolve<IClientService>();
        }
        return _clientService;
    }
    set { _clientService = value; }
}

NOTE: I do want to share a problem I had here: ideally, ioc.Resolve&lt;IClientService&gt;() should work for both Design and Run time.  This solution works as desired at run time, but at design time the IocContainer resource is null.  To solve this and still get design time data, I hardcoded the creation of the ClientServiceMockData class when the resource is null.  This of course adds a dependency on the ClientServiceMockData class, so if you have any suggestions on how to solve this problem I would appreciate hearing them.

This approach is probably wrong somehow, but I’m accessing the IoC container from inside the object that has the dependency.  I suppose this adds a dependency on the container class itself.  I did it this way to complete the Blendability.  In order for Blend to recognize the ViewModel as an Object Data Source at design time it has to have an empty constructor, so the ViewModel now has two constructors:

public ClientViewModel()
{
}

public ClientViewModel(IClientService clientService)
{
    ClientService = clientService;
}

You can see that the second constructor still allows Constructor Injection, so if you wanted to use special data for testing all you would have to do is pass in a specific Data Service to the constructor.
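Here is a concrete (hypothetical) example of that Constructor Injection in a test:

```csharp
// In a unit test the container can be bypassed entirely: hand the
// ViewModel whatever IClientService implementation the scenario calls for.
var viewModel = new ClientViewModel(new ClientServiceMockData());
```

The same line works with any other IClientService implementation, such as a special-purpose testing service.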

At this point we now have a completely Blendable ViewModel that supports design time data in far fewer objects and steps, and as an added bonus it can increase your geek factor because it uses Dependency Injection!



It seems to me that this is a much simpler way to achieve the same result, but I’m sure the Patterns and Practices folks will be able to spot the holes in it.  I’d really like to learn this stuff better so please feel free to leave your comments below.  What is wrong about this idea?  What is good?  Is there an even simpler way to achieve it?  Let’s hash it out: enquiring minds want to know. 

SSIS and DelayValidation

October 14, 2010 6 comments

If you’re a regular reader then you’ll know I’ve recently become enamored of SSIS.  I’m absolutely blown away by this product; I wish I had learned about it 5 years ago!

Of course, being new to such an extensive technology means dealing with landmines along the way.  Unfortunately, I’ve run into a couple of them already.  The first one was my own fault: I developed the project in SSIS 2008 but the client has SQL Server 2005.  To solve it I had to install SQL Server 2005 and create a 2005 version of the Package.

AcquireConnection Method Error

The second issue I’ve encountered was not nearly as intuitive.  I was at the client site yesterday deploying my first SSIS Package outside my own network.  With the help of Andy Leonard, my SSIS mentor, I had worked through all the necessary steps to make the Package dynamic, a process worthy of a blog post of its own.  At the client site I followed my cheat sheet to the letter, explaining every step to the client along the way.  After solving a missing prerequisite problem on the server, we fired up the Package only to be quickly stopped with a Validation failure message:

The AcquireConnection method call to the connection manager failed with error code 0xC0202009.

I assumed there was a problem with the dynamic connection strings. I spent about an hour diagnosing the problem only to find out there was no problem, at least not with the dynamic connections.  The problem, it turns out, was in a thing called DelayValidation.

Dynamic vs. Design Time Connection Strings

When you design an SSIS Package, you need to define connections to data sources and destinations that you can access at design time. Without these connections, SSIS would be pretty useless. So these design time data connections have to be hard coded somewhere: I put mine in package variables and bind the ConnectionString property of the ConnectionManager objects to the variables.  These variables are then overridden at runtime by external values, making the connections dynamic and portable.

What I was unaware of was that each ConnectionManager object and each DataFlow Task has a property called DelayValidation.  When the Package begins, it validates all of the DataFlow tasks prior to executing any of them.  If DelayValidation is set to False, then the validation process uses the design time connection objects.  In other words, it tries to connect using that connection string before it has a chance to be set by the dynamic process.  And here’s the kicker: False is the default value, so this is the default behavior.


Working on my own network, I never even noticed this was happening.  On the client network, however, it immediately blows up because the tasks cannot connect to my design time databases.  I found a good article online that explained the problem and it actually makes sense.  The solution is to set DelayValidation to True, which will delay the validation process until the task begins to execute, giving it time to be set dynamically.

I started by setting the property on every ConnectionManager object. At first I believed that this would be sufficient but it didn’t stop the validation error: I had to do the same thing to every DataFlow task as well, after which the Package ran properly.  This was a real pain because I had so many to change. I could select multiple tasks within the same container, but I could not select tasks across container boundaries, and I have a lot of containers.
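One idea I have not tried yet: the change could probably be scripted instead of clicked through.  Here is an untested sketch using the Microsoft.SqlServer.Dts.Runtime API; the package path is made up, and I am going from my reading of the API rather than working code:

```csharp
using Microsoft.SqlServer.Dts.Runtime;

class DelayValidationFixer
{
    static void Main()
    {
        var app = new Application();
        Package package = app.LoadPackage(@"C:\Packages\MyPackage.dtsx", null);

        // Flip DelayValidation on every Connection Manager...
        foreach (ConnectionManager cm in package.Connections)
            cm.DelayValidation = true;

        // ...and on every task, recursing into containers.
        SetDelayValidation(package);

        app.SaveToXml(@"C:\Packages\MyPackage.dtsx", package, null);
    }

    // Recursively set DelayValidation on a container and everything
    // inside it, so tasks nested in Sequence and loop containers are
    // covered too (selecting across containers is what the designer
    // wouldn't let me do).
    static void SetDelayValidation(DtsContainer container)
    {
        container.DelayValidation = true;
        var sequence = container as IDTSSequence;
        if (sequence != null)
        {
            foreach (Executable child in sequence.Executables)
                SetDelayValidation((DtsContainer)child);
        }
    }
}
```

If anyone has actually done this, I'd love to know whether it works as advertised.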

While I understand what DelayValidation is doing and why it does it, I don’t understand why False is the default value.  If nothing else, it seems to me this should be a Package level setting.  If you know of a better or easier way to handle this problem, please share it in the comments below.

Categories: Database, SQL Server, SSIS

Richmond Code Camp X

October 11, 2010 2 comments

This past weekend, about 200 geeks, developers, and other techies converged on the Parham Road campus of J. Sargeant Reynolds Community College in Richmond, VA for Richmond Code Camp X.  I think it’s great when an event has been around long enough to begin using Roman Numerals. 

Planning and Organization

Regular readers already know I am a huge fan of Code Camps and #RichCC is as good as they come.  Of course, you could claim that I am a little biased: I’ve been on the planning committee for the event for the last two installments.  But believe me, this was a great event long before I got here!

In fact, one of the things I wanted to address today was how the Richmond Planning Committee is a great model for others to adopt.  Andy Leonard wrote about some of this on his blog:

There’s a lot that goes into planning an event of this magnitude. I commend the Code Camp Leadership Team for their hard work, but the team possesses a quality that will ensure many more Richmond Code Camps to come: absence of ego.

There’s no penalty for stepping up or down on our team. Life happens, people move, get more and less busy, change jobs, and just have other plans. Why punish people for that? Especially in a volunteer organization? The on-point person doesn’t try to "edge out" previous leaders. That’s because we’re all pretty secure individuals. It’s a great group to be part of, and I love every member of our team.

I think he really hit the nail on the head when he used the phrase “absence of ego.”  In the spring, Kevin Griffin and I were invited to join the committee for RCC2010.1.  This was/is a serious group of community activists: Kevin Hazzard, Justin Etheredge, Darrell Norton, Robin Edwards, Susan Lennon, Frank La Vigne, Andy Leonard, Kevin Israel, and please forgive me if I left anyone out. 

Each of these folks is a super star in their own right, so it would shock no one to find politics, backstabbing, prima donnas, or any of those other things that happens with a group this size of seriously accomplished folks.  The truth, however, is more shocking: there isn’t a territorial attitude in the bunch.  Kevin and I were immediately welcomed and assigned tasks that obviously others had done before we came along.

The best thing of all: no one “owns” anything.  If you need to jump tasks to make sure something happens, do it.  If you need to ask for help you get it.  And best of all, if you can’t do something, someone else will.  When something goes wrong or contrary to plan, there is always someone there to fill in the gap and address the problem.  Since everyone is empowered, this committee is the epitome of the “high functioning team”: it is truly Agile. 

In short, this committee ROCKS, and the result is an event that consistently raises the bar.  I just wanted to share some of that and say “Thanks” for letting me be a part.

Community Megaphone Podcast

I’ve had the opportunity to be involved with the Community Megaphone Podcast on more than one occasion since its inception: I was the 2nd guest and the 1st guest co-host.  I also got to sit in on the Speaker Horror Stories panel at CodeStock earlier this year.  I have an absolute blast every time I get to participate on this show.

One unfortunate thing about this weekend is that Philly Code Camp (another favorite) was scheduled the same day.  One of the show hosts, Dane Morgridge, lives in Philly and so he went there instead of coming to Richmond.  With Andrew Duthie, aka The Devhammer, going to Richmond, they decided to do live recordings at both events, another first for the podcast.  Of course, this meant Andrew was on his own, so I was really pleased to sit in again as guest co-host.

Frank La Vigne

We did two sessions with Frank La Vigne, one of the original founders of Richmond Code Camp.  During one of those sessions I came dangerously close to ensuring I’ll never be invited back: it turns out some people are a little sensitive when you make jokes about VB (who knew!?)  The good natured ribbing aside, we had a great time talking with Frank about the early days of RCC, his community work since moving to Northern Virginia, Windows Phone 7, and even his infant son Jake who made an appearance at the event and was the hit of the Speaker’s Dinner. 

Jim Pendarvis

We had a great chat with Jim Pendarvis, founder and organizer extraordinaire of Southern Maryland Give Camp.  The 2nd installment of SOMDGC is scheduled for March 25-27th and they have a new goal this year of 150 developers and 25 non-profits.  Jim talked about the National Day of Give Camp coming up on Martin Luther King day, saying the reason for moving SOMDGC back is the unpredictability of the weather.  I just think he doesn’t want to shave his head in a snow storm!

And speaking of last year’s show, this year we have TWO new items of interest.  First, if they meet their goal of 150 developers, not only will Andrew Duthie get a mohawk, he will get it dyed in a color of our choosing!  There is also talk of a henna tattoo…

Secondly, we have a challenge going now between Jim, Kevin Griffin, and myself.  To paraphrase the immortal words of Tommy Callahan from the movie Tommy Boy, the three of us have “what the doctors refer to as a little bit of a weight problem.”  So here is the deal: whichever one of us loses the largest percentage of his body weight by Give Camp will be declared the victor.  The two losers will be required to dress up on the last day of Give Camp as a character from Harry Potter.  The characters will be chosen by the attendees, and I’m sure hilarity will ensue.

“Van” Van Lowe

“Van” Van Lowe is a community organizer, blogger, and speaker from Northern Virginia.  He spoke with us about making the transition from attendee, to speaker, to organizer within the community.  We also talked about attending Code Camps versus the larger national conferences like PDC and MIX.  And in a CMP Exclusive, he revealed his real name and told us the story of how he started going by “Van” as a result of his speaking activities.  Sorry – I’m not going to tell you the story here, you’ll just have to listen to the podcast yourself!

And don’t forget the others…

There were some other interviews as well that I wasn’t in on: Kevin Griffin took the co-host seat for a session and interviewed Rob Zelt, President of INETA.  And then later, new Azure MVP David Makogon filled in while I went to give my presentation, and they interviewed up-and-coming community speaker Stuart Leitch from Charlottesville.  Stuart is a friend of mine and I can’t wait to hear that session for myself!

A New Presentation

I did have the chance to unveil a new presentation during the last session of the day: Expression Blend and the Visual State Manager: A Deep Dive.  I’m never sure how a session is really going to go over until I give it for the first time. Recognizing that Blend is still unfortunately a niche topic, I’m also never sure how well attended a session will be, so I was pleasantly surprised to find a packed room: there were even a couple of people standing in the back for a little while!

This one was a lot of fun: I wanted to show some Blend coolness that I don’t typically get to cover in the introductory sessions.  We spent some time going over Templating basics and then used the VSM to solve some problems with Templates.  The second half of the presentation was all about creating Custom Visual States and how to use them in creative ways.  We covered Transitions, Effects, and Multiple Custom States, and it was very well received.  My thanks to everyone who sat in and for all the nice comments afterwards.

A Brief Respite

Without question we can add Richmond Code Camp X to the history books as an extremely successful event.  We’ll take a very brief hiatus, but before long we’ll start the process all over again for Richmond Code Camp XI.  You can go ahead and put it in your calendar now for May 21st, 2011; I hope to see you there!

Categories: Community

How the Atlassian Acquisition of Bitbucket Changes Things

September 29, 2010 1 comment

A while back I posted about selecting and implementing a source control system for Project Greenfield.  I outlined how I came to the decision to use Mercurial (Hg) and so far I have been very pleased with my decision.

One of the key factors in my decision was Bitbucket, an online repository service that I am using as a central source control server.  The original free service came with a single private repository, not exactly ideal for an ISV, so I purchased an account that gave me a handful of private repositories and more storage space.  At a few dollars a month the price was more than reasonable.

Yesterday, however, they announced that Bitbucket has been acquired by Atlassian, a development and tooling company.  Naturally, I was immediately concerned, but this turns out to be really good news.

What’s Changed

The most important change is this: small accounts like mine are now free.

Bitbucket’s pricing scheme was based on the number of private repositories and storage space, but under Atlassian the pricing is all about users.  An entry level account is for 5 users and includes unlimited public and private repositories with unlimited storage space.

According to the website, a user is defined as “Someone with read or write access to one of your private repositories.”  This means that most small development shops won’t have to pay anything.  I think that is just awesome.

What if I’m not so small?

Another great bit (pun intended) of news is that the price for paid accounts is very low. A 10-user account is only $10/month, 25 users is $20/month, 50 users is $40/month, and for $80/month you can have unlimited users.  Again, this includes unlimited private repositories and unlimited storage space.  In my mind those prices are great regardless of your team size.  They also have an introductory offer: if you sign up for a 10-user account before Oct. 3rd your first year will be free.

How does this affect Open Source projects?

Since users are only counted for private repositories, you can have unlimited users on public repositories.  This means you should be able to manage any open source project on the standard free 5-user account.

What about Github?

This change got me thinking about Github.  When I was selecting a DVCS I almost chose Git because of Github even though everything else had me leaning towards Mercurial.  In my research it seemed people were more fanatical about Github than Git itself and I really wanted hosted repositories.  In the end it was finding Bitbucket that finalized my choice.

The only reason I even bring this up is that I wonder if this will be a game changer for Github.  When I signed up, Bitbucket and Github prices were practically identical.  Will people researching DVCS begin choosing Hg over Git because Bitbucket is now free?  Github pricing is still based on the number of private repositories; will it change its pricing model?

Github already has unlimited free public repositories and “collaborators”, so I don’t see this affecting the open source crowd.  And I certainly don’t think people will be switching from Git to Hg because of this: it’s the new adopters I’m curious about.  It just seems to me that Github will have to do something to respond to this: I’ll be curious to see how it plays out.

Categories: News, Source Control

A Centered and Resizable Text Header

September 27, 2010 4 comments

Tweeted by @Pete_Brown recently:

Attention WPF (and SL4) Devs: Get back to blogging. You’re getting lost in the amazing amount of #wp7dev content 🙂

Well, when Pete says “Jump!”… I’ve actually been meaning to post this for a few months, so thanks to Pete for giving me a push 🙂

The Problem

A friend of mine is learning Silverlight and in prototyping a simple app he wanted to just use a TextBlock for his header.  When the application has a fixed size, it works fine, but when the size is flexible he ran into one of the issues with TextBlock: it doesn’t resize or scale.  When you set the properties for a TextBlock, you set the size of the font and it never changes.

Here is the default layout we are discussing:


And here it is expanded to full screen on a large resolution:


The text stays centered, but the size stays static.  It makes the header seem small and out of proportion.  And similarly when the window is much smaller the text seems too large and out of proportion.  If you get extreme you can even see some weird results:


Some Ideas

One way to solve this would be to listen to the UserControl’s SizeChanged event and do some math to calculate the new FontSize, but that just feels so WinForm. I’d much rather find a way to do this without code behind.
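For completeness, here is roughly what that code-behind approach might look like.  The HeaderText element name and the 0.05 scale factor are made up for illustration:

```csharp
// The SizeChanged approach I'd rather avoid: recalculate FontSize
// by hand every time the control resizes.
private void UserControl_SizeChanged(object sender, SizeChangedEventArgs e)
{
    // Scale the font with the available width, with a floor so the
    // text stays legible in very small windows.
    HeaderText.FontSize = Math.Max(12, e.NewSize.Width * 0.05);
}
```

It works, but you end up hand-tuning magic numbers, which is exactly the kind of pixel math XAML layout is supposed to spare us.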

You could try to bind the FontSize of the TextBlock to a property in the ViewModel, but you still have to find a way to trigger the action.  If you bound the UserControl width and height to the ViewModel you could have its Set method raise the PropertyChanged event for the FontSize property.  And of course, you’d still have to write all the code to calculate it, which I’m sure would include measuring the text, calculating buffer zones and margins, etc.

These are just ideas, I haven’t tried either approach.  You may find a situation where you need to do one of these things or come up with something different, but honestly, these ideas just somehow feel wrong in a XAML world.  In this particular case, where the Text is static, I have a better solution.
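For completeness, the first idea might look something like the following untested sketch (the element name HeaderText, the handler name, and the 0.06 scale factor are all my own invention, and the factor would need tuning per layout):

```csharp
// Untested code-behind sketch of the SizeChanged idea: recalculate
// FontSize by hand whenever the parent container resizes.
private void LayoutRoot_SizeChanged(object sender, SizeChangedEventArgs e)
{
    // Scale the font with the available width, with a sane minimum size.
    HeaderText.FontSize = Math.Max(12, e.NewSize.Width * 0.06);
}
```

You would wire this up with SizeChanged="LayoutRoot_SizeChanged" on the root element, which is exactly the kind of manual event plumbing that feels so WinForms.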

Convert Text To A Path

The solution starts with taking advantage of the vector graphic nature of XAML. While Text may not expand and contract as desired, a Path certainly will, so the first step is to convert the Text to a Path.

In Blend, select the TextBlock item and right click, select Path –> Convert to Path (or go to Object –> Path –> Convert to Path).  This will convert the text into a Path object (outlined in Red in the screen shot below).  You’ll also notice the Horizontal and Vertical alignments have both been changed to Stretch and the Margins have been set (outlined in Yellow).


If you reset the Margins to 0, you will see the Text take up the entire space.  If you change both the alignments to Center it will look OK in Blend, but when you execute the application you’ll see we actually get the same behavior as the Stretch.  This is because the Width and Height are set to Auto, which is what we want: if we set them to fixed sizes we are right back where we started.

The good news is that if you resize the window now, either bigger or smaller, you’ll see the header resize itself, so we must be on the right track!


Margins and Proportions

At least in this case, we don’t want the text bumping up against the edges of its Border: it’s distracting and not very clean.  Instead, we’d like a little space surrounding it on all sides. 

You might be thinking “No big deal, I’ll just add a Margin” and you wouldn’t be totally wrong.  The problem is that hard coding the Margin, like hard coding the Text’s FontSize, means it can never change.  So a Margin that looks good when the window is small doesn’t necessarily look good when the window is large. 

What we want is the effect of a Margin, but we want that Margin to be proportional to the available space.  We really can’t solve this with the Margin property, at least not without a lot of work and calculation, which I’m just too lazy to figure out.  So the real solution is not Margins, or even in the Text (now Path) itself: the real solution is in Layout.

Solving the Problem Using Layout

One of the things I see developers new to XAML struggling with is the power of layout.  I’ve started labeling my approach “Container Driven Design” which really relies on the containers to manage the size and spacing of its child elements.  It frequently involves nested containers, which is what we are going to use to solve this problem.

What we really want is for our Margins to float and resize in proportion to their parent container.  Fortunately we have a built-in container type that is perfect for this: the Grid.  With a Grid, we can specify percentage-based row and column sizes.  (NOTE: Yes, I know they are not *really* percentage based, but an explanation of the Star system is beyond the scope of this article.)

So to solve this problem using layout we are going to wrap our header in a 9-celled Grid: three rows and three columns, with the center cell holding our header.  Right click the Path and select Group Into –> Grid.  If you look at your Objects and Timelines panel you will see the Path is now the child of a Grid:


With Grid selected, you can use the blue bars along the top and left to position the rows and columns:


While I avoid editing XAML, there are a few times that it is simply faster and easier: editing Grid row and column sizes is one of those times.  In the screen shot below, you’ll see that I’ve effectively created floating margins by defining star sizes for the top and bottom rows and right and left columns.  The center row and center column have no size definition, so they will take up the remaining available space.
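The resulting XAML might look roughly like this.  The 0.2* star values are just my choice of proportion, the Path Data is trimmed for readability, and I’ve used Stretch="Uniform" so the letters keep their aspect ratio (Blend’s generated markup may differ):

```xml
<!-- A 3x3 Grid: star-sized outer rows and columns act as floating,
     proportional margins; the unsized center cell takes what remains. -->
<Grid>
    <Grid.RowDefinitions>
        <RowDefinition Height="0.2*" />
        <RowDefinition />
        <RowDefinition Height="0.2*" />
    </Grid.RowDefinitions>
    <Grid.ColumnDefinitions>
        <ColumnDefinition Width="0.2*" />
        <ColumnDefinition />
        <ColumnDefinition Width="0.2*" />
    </Grid.ColumnDefinitions>

    <!-- The header text, converted to a Path, living in the center cell. -->
    <Path Grid.Row="1" Grid.Column="1"
          Stretch="Uniform"
          Fill="Black"
          Data="..." />
</Grid>
```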


Execute this and you’ll find that as you resize the window the margins will resize themselves proportionally, and the text will remain nicely centered while also resizing itself proportionally.


Wrapping it Up

So there are a couple of lessons I would want you to take away from this exercise.  First, the problem we were having was with static text, so we solved that by turning that text into something else.  We found a graphical solution to our graphical problem! 

Second, we had a problem with Margins, so we used grid rows and columns instead of the Margin property.  We solved that issue by relying on a Layout Container instead of a single property.

In both cases, we found simple and elegant solutions by thinking outside the box.  I’ll grant that this example is not overly complex, but it does illustrate the power of XAML to solve design problems.  And of course, a chance to play around in Blend is always welcome!

Categories: .NET

Project Greenfield: Learning SSIS

September 17, 2010 Comments off

While Expression Blend remains my favorite tool and Visual Studio my most productive tool (with a little R# love), SQL Server Integration Services (SSIS) has moved solidly into the #3 slot. 

I frequently have tasks that require me to move data from one location to another, perform conversions and transformations, create new tables with the resulting columns, etc.  In the past, this meant a lot of ADO.NET and a lot of coding.  Those processes were slow to build, and their runtime performance was frequently less than desirable.  With what I’ve learned about SSIS, most of that work will be a thing of the past.  I can now do conversion work in a fraction of the time it took before, and the resulting product, a Package in SSIS lingo, can execute in far less time.

It seems that SSIS is thought of as a DBA’s tool, but I believe that as Blend is not just for Designers, SSIS is not just for DBAs.  This post is not going to be a how to or any kind of definitive work: what I want to accomplish is to introduce my fellow developers to the glory of SSIS and highlight some of the reasons I think you should be learning this technology.

Project Greenfield and SSIS

For Project Greenfield, one of the primary tasks is to convert data from the legacy IBM *insert nom du jour here* midrange server database to the new SQL Server database. 

This is far more than pushing data from one place to another: the structure is completely different.  Relationships are defined now that previously were unenforced and large tables are broken into dozens of smaller, more normalized tables, often in different schemas.  Fields that were previously fixed length and Numeric types are now varchars and ints.  In some cases single fields have been broken into multiple fields, and in some cases multiple fields have been combined.  In all cases, data coming out is Unicode but is being stored as ANSI.

Obviously, this conversion represents a significant body of work in its own right.  One of my recent tasks was to provide enough of a conversion that I could start prototyping (fake data just wasn’t what we wanted).  The amount of work I was able to do in a week would have easily taken over a month to write using ADO.NET.  And best of all, now that I have a solid framework in place making changes is very easy.

Getting Started with SSIS

In order to start with SSIS, you will have to have it installed.  More accurately, you will need the SQL Server Business Intelligence Development Studio installed, also known as BIDS.  This is found as an option when installing SQL Server, and I’m pretty sure it is not available below SQL Server Standard.

The current version of BIDS runs in Visual Studio 2008.  If you already have VS2008 installed, you will find a new Project Type category called Business Intelligence Projects added to your existing install.  If you do not have VS2008, BIDS will install a Visual Studio 2008 Shell, even if you have VS2010 installed.

To start a new project, select the Business Intelligence Projects category and Integration Services Project in the create new project dialog.  Once it is created, opening it and working with it is basically the same as any other solution.

Work Flow in SSIS

BIDS itself is the first application I’ve seen that serves as a compelling example of a workflow-driven application. The Package Designer workspace is organized in tabs, the only two of which I’ve needed so far are Control Flow and Data Flow.

All tasks are defined as compartmentalized units of work.  The visual blocks for those are all shown in the Control Flow tab.  These tasks may or may not be grouped into containers such as Sequence Container or Foreach Loop Container.  You may define as many containers as necessary to organize the Package.  So far I have preferred Sequence Containers as they allow me to organize tasks procedurally.  Except for the simplest Package, I would not define tasks outside of containers.

There are many different task types available, but I have only needed three so far: Data Flow Task, Execute SQL Task, and Script Task.  And now that I have better knowledge of what I am doing, I could get by without the Execute SQL Task.

Data Flow Task

At the heart of SSIS is the Data Flow Task.  The basic formula is this: read data from a data source, manipulate/transform that data, then write the transformed data to the target destination.  Data sources can be ADO.NET or OLE DB database connections but can also be Excel, Flat, or XML Files.  There are even more options for Target Destinations.

In between the source and the target are the Data Flow Transformations which really represent the power of SSIS.  Here is a brief list of the transformations I have so far found most useful.

Conditional Split – Evaluates the data in the current columns and creates logical subsets which can then be handled differently.  Each subset effectively becomes its own data source at that point.

Derived Column – In my mind, the most important transformation of the bunch: derived columns are the new (or replacement) columns built by converting or transforming the source data.  SSIS includes a highly evolved “Expression Language” that is used to convert the data at runtime.  String manipulation, type conversion, mathematical operations, and much more are all supported. 

Lookup – Second in importance only to Derived Column, this transformation allows you to perform lookup actions against other tables.  This is essential for preventing invalid foreign key insertion.  It also is great for performing Incremental Loads: basically, this means only inserting records into the database if they don’t already exist.  This becomes important for several reasons, not the least of which is that I want to be able to execute the Package as often as possible, especially during development.

Multicast – Multicast creates multiple copies of the current data set, so you can perform multiple writes to multiple destinations.
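To give a flavor of the Expression Language mentioned under Derived Column above, here are a few examples of the kinds of expressions I mean.  The column names are invented, and the annotations after // are mine, not part of the expression syntax:

```
(DT_WSTR,50)TRIM(CUSTNAME)             // trim a fixed-length field and cast it to a Unicode string
(DT_I4)ACCTNUM                         // convert a Numeric source field to a 32-bit int
FIRSTNAME + " " + LASTNAME             // combine two source fields into one derived column
CITY == "" ? NULL(DT_WSTR,30) : CITY   // replace empty strings with NULL
```

Each expression like these becomes a new (or replacement) column flowing downstream to the destination.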

The Script Task

The Script Task allows you to write .NET code inside the package.  Primarily I have used this to work with package variables, a whole topic in its own right, and for making OLE DB connections dynamic.  I see substantial potential in the Script Task though, as it really opens up the entire .NET Framework to the process.
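As a rough sketch of the variable and connection work I’m describing, the body of a Script Task (the Main method of the ScriptMain class that BIDS generates for you) might look something like this.  The variable name, connection name, and connection string details are invented for illustration:

```csharp
// Sketch of a Script Task body.  The variable must be listed in the
// task's ReadOnlyVariables property to be visible here.
public void Main()
{
    // Pull a value from a package variable...
    string library = Dts.Variables["User::SourceLibrary"].Value.ToString();

    // ...and use it to make the OLE DB connection dynamic at runtime,
    // so one Package can point at different AS/400 libraries.
    var conn = Dts.Connections["AS400_Source"];
    conn.ConnectionString = string.Format(
        "Provider=IBMDA400;Data Source=MYSYSTEM;Default Collection={0};",
        library);

    Dts.TaskResult = (int)ScriptResults.Success;
}
```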

Final Thoughts

Obviously, this barely scratches the surface of SSIS.  BIDS is primarily a graphical tool, but there are distinct functions that developers could really leverage.  The other place where developers could shine in SSIS is in process flow: once I understood things like Sequence Container and Conditional Split I really felt like I could make SSIS sing.  This kind of flow is exactly what we code day in and day out, so I think developers can pick up SSIS quickly.

I may write some more about SSIS going forward, but if you are interested and looking for info now I recommend you check out Andy Leonard’s blog.

Book Review: .NET Compact Framework 3.5 Data-Driven Applications

August 18, 2010 Comments off

In 2005 and 2006 I did a significant amount of Compact Framework development for my company.  I wrote a very large, intensely data-driven application for field workers to confirm and collect real estate assessment data.  We used SQL Server CE on the device (and later tried SQLite), and ultimately the data had to sync back to an existing database on the AS/400.  This was a real mother of a project and ate up almost a year of development time, largely because there was very little real help available.

I just got done perusing Edmund Tan’s book .NET Compact Framework 3.5 Data-driven Applications and let me say I *really* could have used this book back then!  This is an extremely thorough coverage of the current Compact Framework for Windows Mobile 6.0.  The book walks the reader through developing a series of real-world applications, which seems to be Packt’s preferred style.  Unlike many books about Microsoft technologies, this one is not limited to SQL Server: the author gives Oracle Lite equal coverage throughout.

As the title suggests, this book is very data-centric.  Data topics include building the data tier, parameterized queries, full-text searching, data synchronization, and more.  I would have killed for the guidance provided in the Performance and Packaging & Deployment chapters.  There are several other topics, like SMS and Security, that are just icing on the cake.  As I said before, I would have loved this book 5 years ago.

It’s obvious Mr. Tan knows his topic well: the material is very accessible and well written.  If you have any Windows Mobile 6.0 or Compact Framework projects then I’m pretty confident there is something here for you.  The publisher has made the chapter on “Building Integrated Services” available for free, so you can download it and check it out for yourself.

Categories: Book Reviews

Blend-O-Rama Update

August 18, 2010 Comments off

Hi Folks!  Hard to believe it has been almost a month since the first ever Blend-O-Rama event!  I’ve gotten lots of questions about the videos and the website, so I wanted to put this together to give everyone an update.

The BOR Videos

When I wrote about the LiveMeeting experience, I failed to discuss recording the sessions.  Kevin recorded the sessions, but the videos themselves are really small.  I’m sure this is partly because I had to set my resolution to 1024×768 for LiveMeeting to display well, and partly to limit storage size.  The actual WMV files came out as 704×528 – like I said, they are small.

They are so small that on my 1600 monitor they are almost unreadable. They view acceptably at 1024×768, so I hate to recommend it but you may want to resize your display in order to watch the videos.  They will be available for download on the BOR site when it goes live.

I hate to make people wait any longer, so the good news is you can download them now in an all-in-one Zip file.

The BOR Website

I’m in the process of moving service providers, so the BOR website has been on hold while I get that set up.  I’m also hampered a little bit by my lack of ASP.NET experience.  Fortunately, I have friends who know a little something about all this stuff! I can happily report that I am about 98% there, with the help of Kevin Griffin and the great team at OrcsWeb.

I have a handful of videos already out there but many more to produce.  Most of the content from the BOR sessions will be available in a series of shorter videos.  I’ll also take suggestions and requests, so for now if you have any put them in the comments here. 

At any rate, progress has been made, albeit not as swiftly as I would have liked.  Such is my never ending story.  Thanks as always to all the supporters out there, you guys make it all worthwhile!

Categories: .NET

Richmond Code Camp 2010.2

August 6, 2010 Comments off

Hey folks! I just wanted to take a quick moment to announce that Richmond Code Camp 2010.2 has been scheduled and is only 2 months away!  On Oct. 9th, .NET, SQL Server, SharePoint and other IT Professionals will once again converge on the J. Sargeant Reynolds Community College Massey campus in Richmond, VA, for one of the best Code Camps going.  RCC consistently gets rave reviews and I am proud to be associated with this event.

Registration is open now and there may be limited space available this time so don’t hesitate.  You can also click on the banner in the right sidebar –>

Ever thought about Speaking?

The call for speakers is also open but will close on September 10th.  Have you ever presented?  If not, would you like to?  Code Camps are the perfect venue for getting your feet wet.  Don’t get hung up on not being “an expert” at something.  I’ll tell you a secret: most of us aren’t! We are, however, passionate about the technology we use and we have a sincere desire to share our passions and help our fellow developers.  If that sounds like you, then you should give it a try, it is a truly rewarding experience.

For you Blend fans out there, I have submitted a couple of new talks for the event: Advanced Topics in Expression Blend and Expression Blend and the Visual State Manager: A Deep Dive. I’ll be sure to announce it if they get selected.

I hope to see you there!