So as I sit in the airport, hoping that one of the beyond-overbooked flights has an opening so I can get home early and not have to go through Vegas, I thought I'd share my thoughts about the Windows Live platform that we picked up during partner training here in Redmond.
Without a doubt, the coolest feature enabled by the platform is single sign-on. And, while we're at it, the way it's implemented is really cool.
When a user goes to a site that uses the single sign-on feature (that is, Windows Live ID integration), the user is presented with a login screen that is fairly consistent across sites. I've seen two user-experience styles for this screen: www.zune.net has a themed version of the older, Passport-style login screen, while others (particularly those that host Live Controls) display a more Windows Vista-style login screen.
When a user needs to log in, they are redirected to login.live.com to provide their credentials over HTTPS. Once authentication is complete, they are redirected to the original website with a login token that identifies the user. The cool thing is that the token is the same for that user on that site across sessions, but different across sites, so you're able to recognize a returning user during multiple visits. This makes integration with the profiling API fairly straightforward.
Unfortunately, I can see issues with it: the sign-up and sign-in flows can leave users unsure whether they're still dealing with the original site, and people who aren't Live ID users might hesitate to sign up for a service that isn't obviously related to the site they're using. The login pages aren't particularly customizable right now, either, so they may not reassure users that they're still working with your site.
Contacts and Contacts Control
The Contacts Control may or may not be appropriate for a given site, but fortunately, you can also simply query the data via a REST interface. Very slick, and definitely usable. A good example of where this was appropriate was Buxfer - a site that helps you track your money. You might notice along the top of the page:
Clicking on the Windows icon pops up a dialog asking you to enter your Windows Live ID credentials. Very slick! I believe that, using this information, you're able to invite other friends (maybe the site was an example of single sign-on - it's starting to blur together).
Also cool (but in this case, probably more for the "cool to nerd developers" factor) is the ability to include a fully functional Windows Live Messenger client in your website. Want to sign in? No problem! Just pop open a new window with your account information and new windows for your conversations, just like the Windows client.
I don't really see the value in this for most websites, but there are a couple great exceptions:
- A site that wants to offer live chat support can use this service to present even anonymous visitors with a chat window to a support person who is signed into Messenger. There are scalability concerns here, but all told, it's a pretty neat way to do it.
- Message board or other community software might let users expose their Live Messenger accounts so that other users can contact them through the web. The actual Live ID is shielded and never presented; an obfuscated, site-specific ID is used in its place.
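I don't know Microsoft's actual scheme for deriving that per-site identifier, but the idea is straightforward: a keyed hash of the real account ID together with the site's ID yields a pseudonym that is stable for one site and useless to any other. A minimal sketch (all names here are my own, not from any Windows Live API):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Illustrative only: derive a stable, site-specific pseudonym for a user so a
// site can recognize a returning user without ever seeing the real Live ID.
// "ServiceSecret" stands in for a key only the identity provider knows.
public static class ObfuscatedId
{
    private static readonly byte[] ServiceSecret = Encoding.UTF8.GetBytes("demo-secret");

    public static string Compute(string liveId, string siteId)
    {
        using (HMACSHA256 hmac = new HMACSHA256(ServiceSecret))
        {
            // Hash the (user, site) pair; the same pair always yields the same
            // token, while any other site sees an unrelated value.
            byte[] digest = hmac.ComputeHash(Encoding.UTF8.GetBytes(liveId + "|" + siteId));
            return Convert.ToBase64String(digest);
        }
    }
}
```

This mirrors the behavior described above: the token is consistent for one user on one site across sessions, but can't be correlated across sites.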
I was a bit unsure about using Live Spaces, but the way it's discussed for example sites makes it sound like a bit of an easy storage solution similar to SharePoint. It might be worthwhile, especially for our clients who want to focus on virality.
We got some cool info that we can't talk about (nor can the Microsoft guys in charge), but it's really exciting nonetheless.
I'm not sure that Terralever can use the platform effectively, but it depends on seeing what kinds of technologies our clients can leverage, and how much Microsoft is willing to work with us.
Another one of the attendees voiced this concern, and it was something I'd thought about before we even got here - a lot of this technology is too late to market (my thoughts were "too little, too late"). People are already sharing their thoughts on Digg, their photos and profiles on Facebook, photos on Flickr, and documents on Google, and a lot of other needs are already met. Single sign-on is GREAT - but I'm not sure it's enough to make people buy in to Live. Still, maybe it's a moot point; with 400MM monthly Live ID users and 100MM monthly Live Spaces users, perhaps greater propagation doesn't matter.
For more information about the Windows Live development platform, check out their website!
I'm publishing the PowerPoint slides and sample code from the C# 3.0 Best Practices talk I gave tonight at the Arizona .NET Users Group meeting. Once again I'd like to thank Scott Cate for extending the opportunity to me, Lorin Thwaits for pinch-hitting for Scott, and to Hudson IT Staffing for the event sponsorship.
Please note that the code sample isn't really annotated, and was designed to generate the code in the slides and the Reflector output shown. The theme of the presentation wasn't so much "How can we do these things the best?" but more "How is this implemented, so I can choose the best way to do something?" So the code isn't by any means a beacon of effectiveness - it's meant to go with the slides.
I'd welcome feedback and comments, as well as suggestions for additional content. Enjoy!
Tomorrow after I present at the Arizona .NET Users Group, I'm getting on a plane to head to Redmond, WA for some first-party training on the Windows Live platform. I'm not particularly certain what to expect; I think by and large, most of us are a bit ignorant about what the platform has to offer, and while some of the components are incredibly cool (the Virtual Earth API, for example), others seem... not so cool (such as the web-based gadget hosting platform).
One thing is certain - with the Windows Live platform, Microsoft is making a huge effort at hitting web-connected technologies. Sometimes I wonder if it isn't too little, too late (Blogger, Facebook, Google Maps, and Flickr come to mind as functional precursors for the corresponding Windows Live services), but at the same time, the ability to have everything integrated into one service could be advantageous. To that end, we'll need to see how things play out.
Stay tuned for new information about the Windows Live platform, direct from Redmond.
In a recent project that I've been working on, I had the need to access a singleton object from multiple callers on multiple threads, but then restrict access to the singleton during a change operation, during which the singleton was changed out and replaced with a new one. New requests must block until the singleton change is complete, but existing requests must complete before the singleton change occurs.
A WaitHandle - specifically, an EventWaitHandle with the reset mode set to Manual - is an effective way to let an arbitrary number of threads pass through the primitive until its reset condition is met. In this case, however, because we needed custom behavior around Set() and Reset() for domain-specific reasons, I contained the underlying EventWaitHandle within my derived class and simply derived the class from WaitHandle. This class is called SynchronizedReferenceCounter:
```csharp
using System;
using System.Threading;

namespace Terralever.Threading
{
    public class SynchronizedReferenceCounter : WaitHandle
    {
        private EventWaitHandle m_handle;
        private int m_refCount;

        public SynchronizedReferenceCounter()
        {
            // starts signaled: new references are allowed
            m_handle = new EventWaitHandle(true, EventResetMode.ManualReset);
        }

        public void AddReference()
        {
            // block while new references are stopped
            m_handle.WaitOne();
            Interlocked.Increment(ref m_refCount);
        }

        public void RemoveReference()
        {
            if (Interlocked.Decrement(ref m_refCount) == 0)
            {
                OnZeroReferencesInUse(EventArgs.Empty);
            }
        }

        public override bool WaitOne()
        {
            return m_handle.WaitOne();
        }

        public override bool WaitOne(int millisecondsTimeout, bool exitContext)
        {
            return m_handle.WaitOne(millisecondsTimeout, exitContext);
        }

        public override bool WaitOne(TimeSpan timeout, bool exitContext)
        {
            return m_handle.WaitOne(timeout, exitContext);
        }

        public void AllowNewReferences()
        {
            m_handle.Set();
        }

        public void StopNewReferences()
        {
            m_handle.Reset();
        }

        public event EventHandler ZeroReferencesInUse;

        protected virtual void OnZeroReferencesInUse(EventArgs e)
        {
            if (ZeroReferencesInUse != null)
                ZeroReferencesInUse.BeginInvoke(this, e, null, null);
        }
    }
}
```
In this class, I override the WaitOne() methods and marshal them to the contained EventWaitHandle's WaitOne() methods as appropriate. Reference additions are blocked whenever the contained EventWaitHandle is in the Reset state, which is entered by a call to the StopNewReferences() method. An object that contains this reference counter should call AddReference() whenever it is accessed, and RemoveReference() whenever it is released. For example:
```csharp
public class ReferenceCountedSingleton
{
    // ... (construction of s_current and s_counter, event wiring, etc.)
    private static ReferenceCountedSingleton s_current;
    private static SynchronizedReferenceCounter s_counter;

    public static ReferenceCountedSingleton GetCurrent()
    {
        // blocks if a change is in progress, then counts the new reference
        s_counter.AddReference();
        return s_current;
    }

    public static void ReleaseReference()
    {
        s_counter.RemoveReference();
    }

    public static void InitiateChange()
    {
        // stop new references; ZeroReferencesInUse fires once existing ones drain
        s_counter.StopNewReferences();
    }

    private static void s_counter_ZeroReferencesInUse(object sender, EventArgs e)
    {
        // finish the change: replace s_current, then call s_counter.AllowNewReferences()
    }
}
```
This shell class assumes a few things, but should be fairly straightforward.
There are some caveats you should know about, though. One thing that might strike you is that if ReleaseReference() isn't called, you can run into a deadlock. This can be solved by encapsulating the work in a try/finally block:
```csharp
ReferenceCountedSingleton instance = ReferenceCountedSingleton.GetCurrent();
try
{
    // perform operations that could be risky here.
}
finally
{
    ReferenceCountedSingleton.ReleaseReference();
}
```
As shown here, it is important to acquire the reference-counted object only once, so you should capture it in a local variable and work against that local for the duration of the operation.
The issue you can potentially run into is that singletons tend to do their magic via an Instance property, where every access goes through GetCurrent(). Access the property three times in a row and you'll increment the counter three times; if you then call ReleaseReference() only once, you'll run into a deadlock scenario where you never reach a zero-reference count, and the pending change never finishes.
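A minimal, self-contained sketch of that pitfall (the class and member names here are hypothetical stand-ins, not the real singleton from the project):

```csharp
using System;
using System.Threading;

// Hypothetical counted singleton: every read of Instance acquires a reference,
// just as a property backed by GetCurrent() would.
public class CountedSingleton
{
    private static readonly CountedSingleton s_current = new CountedSingleton();
    private static int s_refCount;

    public static CountedSingleton Instance
    {
        get
        {
            // each property access counts as a new reference
            Interlocked.Increment(ref s_refCount);
            return s_current;
        }
    }

    public static void ReleaseReference()
    {
        Interlocked.Decrement(ref s_refCount);
    }

    public static int OutstandingReferences
    {
        get { return s_refCount; }
    }

    public void Step() { }
}
```

Calling `CountedSingleton.Instance.Step()` three times and then `ReleaseReference()` once leaves two references outstanding, so a change waiting for the count to hit zero waits forever.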
Hopefully this helps someone. It took a lot of research (coming soon) to find the right synchronization primitive to use for this scenario.
Mass Effect, the latest installment of Bioware's epic RPGs, promised us the world, nay, the galaxy. It spoke of a universe of incredible depth, a future in which humanity has discovered the means to travel to distant stars, interact with aliens, and eke out our place in the galaxy. It's got an incredibly compelling story, in which you are selected as humanity's first "Spectre," an agent of the galactic government empowered to do anything you see fit to preserve galactic stability. The visual artwork is compelling; the game has a "graininess filter" to make it look like an old '80s sci-fi movie, and the music fits right into that genre (it makes the game feel all nostalgic). The use of HDR is gorgeous, and while at times it might feel a little overused, in general it is fantastic.
Although graphical quality is excellent, there is one exception: load times. Please, take five more seconds to load the textures and the bump maps.
As you can see in the photo to the left (I apologize, it was taken with my digital camera pointed at my TV), the bump map and texture map on the ground didn't load immediately. Take a few extra seconds and load the bump maps before I get into the game. In my opinion, it's incredibly immersion-breaking, and should have been unacceptable to Bioware. Unless load time was one of the highest-priority requirements, which I can't imagine (and I'll say why in a second), it amazes me that it made it to production.
One of the COOL things about loading in this game, though, is the elevator load style. Loading happens in a lot of areas by going into elevators - definitely one of the least immersion-breaking aspects of the game. That was a fantastic move, and in fact while we're on the Citadel, you can pick up information on new missions and get a little more information about your squad members. It was definitely one of the strongest parts of the game.
This is without a doubt one of the best games I've ever played in terms of control. It is incredibly easy to control the characters, the selection of dialogue is fantastic, and the menu interface is great (with the single exception of - surprise, surprise - inventory management).
One thing I'm curious about - the E3 2006 dialog depicted to the left - where did it go? "A billion lives are hanging in the balance here. I won't let some piss-ant bartender slow me down." That seemed like such a great scene. I'm disappointed that it's not in the game.
There is one downer - the vehicle is ridiculous in terms of control. I'm on my third playthrough, and I still haven't figured out how to control it steadily. One thing I totally loved, though, was running over the Geth. It was fun to run them into the lava on the planet shown, incidentally, in the image with the bad load times.
I also don't like the mission assignments. Besides the main quests, it's annoying to hear from the Systems Alliance admiral whom you never meet, who just says "You're a Spectre, and you answer to the Council, but you're still a human." Come on, give me a break. Seth Green, give me a chance to say no - don't just say "Transmission comin' in - patchin' it through" every freaking time I look at the Galaxy Map. PLEASE, SETH GREEN, LET ME DO MY OWN THING!
The music is absolutely fantastic. It's one of the best soundtracks I've ever heard; it's absolutely distinctive. And it was inexpensive to buy on iTunes! Other than that, I can't say much.
Finally - the Universe
I think this is possibly Mass Effect's greatest weakness, as well as the greatest strength. While places like the Citadel are built up very well, the Citadel feels like about 60% of the universe's civilization. That's pretty sad considering that they indicate a few million (maybe seven million) people live on the Citadel, and there have to be hundreds of billions, if not trillions, of people across the galaxy. The planets you can visit outside of the main quest feel forced and random, not to mention barren. It's supposed to be the galaxy - why do I hardly ever run into people who aren't trying to kill me?
But on the other hand, the way the other races are introduced and described is incredibly thorough. It's exactly how I'd like to see a game introduce the races it has. I can't sing the praises of the story or the universe (in this sense) enough.
Go buy it, if you haven't yet. It's worth the cash. Get it new - support BioWare.
One of the more obscure things about the .NET Framework is the Disposable pattern used throughout the framework, supported via the IDisposable interface. This pattern is so pervasive throughout .NET that C# intrinsically supports it via the using keyword. There is also a standard pattern for implementing the interface that the interface itself just can't express (perhaps because interfaces can't specify protected methods; maybe that's a C# 4.0 Wishlist part 6 item?).
We can use an IDisposable object with the using keyword like so:
```csharp
using (SqlConnection con = new SqlConnection(WebConfigurationManager.ConnectionStrings["DbConnection"].ConnectionString))
using (SqlCommand cmd = new SqlCommand("[dbo].[GetAllItems]", con))
{
    cmd.CommandType = CommandType.StoredProcedure;
    con.Open();

    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        return GetAllItemsFromReader(reader);
    }
}
```
In this example, SqlConnection, SqlCommand, and SqlDataReader all implement IDisposable because they interoperate with unmanaged resources. The C# compiler actually transforms these using blocks into nested try/finally blocks:
```csharp
List<Item> _compilerGeneratedResult;
SqlConnection con = null;
SqlCommand cmd = null;
try
{
    con = new SqlConnection(WebConfigurationManager.ConnectionStrings["DbConnection"].ConnectionString);
    cmd = new SqlCommand("[dbo].[GetAllItems]", con);

    cmd.CommandType = CommandType.StoredProcedure;
    con.Open();

    SqlDataReader reader = null;
    try
    {
        reader = cmd.ExecuteReader();
        _compilerGeneratedResult = GetAllItemsFromReader(reader);
    }
    finally
    {
        if (reader != null)
            ((IDisposable)reader).Dispose();
    }
}
finally
{
    if (cmd != null)
        ((IDisposable)cmd).Dispose();
    if (con != null)
        ((IDisposable)con).Dispose();
}
return _compilerGeneratedResult;
```
Yeah, if it were left up to C# users to write this pattern out correctly by hand, we'd never do it (not without the using keyword, anyway). But the question is, how do we implement IDisposable?
Traditionally, we implement it by creating a protected virtual method (or a private method if the class is sealed) that accepts a boolean value indicating whether it is being called via Dispose() or via the destructor. The Dispose() method calls this with true, and we also implement a destructor that calls it with false. The Dispose(bool) method then implements the cleanup logic and, if the parameter is true, also tells the garbage collector not to invoke the finalizer (the destructor) on this object. Here's a sample:
```csharp
using System;
using System.Runtime.InteropServices;

namespace DisposableSample
{
    public class HGlobalPtr : IDisposable
    {
        #region IDisposable Members

        ~HGlobalPtr()
        {
            Dispose(false);
        }

        public void Dispose()
        {
            Dispose(true);
        }

        protected virtual void Dispose(bool disposing)
        {
            if (disposing)
            {
                // free managed state, and suppress finalization since we've cleaned up
                GC.SuppressFinalize(this);
            }

            // free unmanaged state here
        }

        #endregion
    }
}
```
This is the most basic implementation of the IDisposable pattern. We're going to evolve it a bit, make it actually do something, and also look at Static Code Analysis (SCA) output for it. First, the SCA results (using FxCop).
This basic example generates two warnings, incidentally, from the same rule. Here they are:
warning : CA1816 : Microsoft.Usage : Change 'HGlobalPtr.Dispose()' to call 'GC.SuppressFinalize(object)'. This will prevent unnecessary finalization of the object once it has been disposed and it has fallen out of scope.
warning : CA1816 : Microsoft.Usage : 'HGlobalPtr.Dispose(bool)' calls 'GC.SuppressFinalize(object)', a method that is typically only called within an implementation of 'IDisposable.Dispose'. Refer to the IDisposable pattern for more information.
If you look at the help for this rule, you'll see that to properly implement this change, you should call GC.SuppressFinalize(this) within the Dispose() method (as opposed to the Dispose(bool) method). Doing so ensures that GC.SuppressFinalize is called via IDisposable.Dispose even if derived classes override the virtual Dispose(bool) method.
Here's a more complete implementation of HGlobalPtr, with IDisposable fully implemented:
```csharp
public class HGlobalPtr : IDisposable
{
    private IntPtr m_ptr;

    public HGlobalPtr(int size)
    {
        m_ptr = Marshal.AllocHGlobal(size);
    }

    #region IDisposable Members

    ~HGlobalPtr()
    {
        Dispose(false);
    }

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (disposing)
        {
            // free the state of any contained objects
            // we don't contain any other objects!
        }

        // free my own state
        if (m_ptr != IntPtr.Zero)
        {
            Marshal.FreeHGlobal(m_ptr);
            m_ptr = IntPtr.Zero;
        }
    }

    #endregion
}
```
Now, there are other code analysis problems; I need to handle security warnings for calls to Marshal.AllocHGlobal and Marshal.FreeHGlobal, which have a LinkDemand permission set on them. I should also consider replacing m_ptr with a SafeHandle. But aside from these, IDisposable is correctly implemented.
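For what that SafeHandle replacement might look like: SafeHandleZeroOrMinusOneIsInvalid is a real framework base class, though this particular wrapper is my own sketch, not framework code.

```csharp
using System;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

// A sketch of wrapping the HGlobal allocation in a SafeHandle instead of a
// raw IntPtr. SafeHandle has a critical finalizer, so the memory is released
// even if Dispose is never called.
public sealed class SafeHGlobalHandle : SafeHandleZeroOrMinusOneIsInvalid
{
    public SafeHGlobalHandle(int size)
        : base(true)  // true: this handle owns the memory and must release it
    {
        SetHandle(Marshal.AllocHGlobal(size));
    }

    protected override bool ReleaseHandle()
    {
        // called once by the runtime when the handle is disposed or finalized
        Marshal.FreeHGlobal(handle);
        return true;
    }
}
```

With this in place, HGlobalPtr would simply hold a SafeHGlobalHandle field and dispose it from Dispose(bool); the destructor and the IntPtr bookkeeping both go away.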
Just remember - if your object is IDisposable - please, use the using () statement!
For more information on implementing IDisposable, refer to the Microsoft documentation article on Technet.