
rwandering.net

The blogged wandering of Robert W. Anderson

Digipede + Velocity

Last week Microsoft released the first CTP of the Microsoft Distributed Cache (code-named Velocity).

I am definitely excited about this release.  While Microsoft is not breaking new ground here, a distributed cache is a great addition to the .NET platform.  Certainly there are competing technologies, but Velocity will be a very simple choice for developers and ISVs because we’ll be able to count on its availability.

This ISV is interested, so we tried it out.

We have many customers who use our Executive pattern to load and cache job-specific data for compute-intensive jobs on the Digipede Network.  These data are often fetched through web service calls or directly from SQL databases, typically in the Executive.Start method.  Before Velocity, the code might look like this:

protected override void Start() {
    // read the CBOData object from the database on every Executive
    _cboData = ReadCboData(JobTemplate.Parameters["CBODataStore"].Value);
}

Including Velocity in this example is really easy.  The following snippet adds use of the Velocity cache:

protected override void Start() {
    // get cache
    CacheFactory factory = new CacheFactory();
    Cache cache = factory.GetCache("CBOCache");
    // see if our CBOData object is already there
    string key = JobTemplate.Parameters["CBODataKey"].Value;
    _cboData = (CBOData)cache.Get(key);
    // if not, read it from the database
    if (_cboData == null) {
        _cboData = ReadCboData(JobTemplate.Parameters["CBODataStore"].Value);
        // store it in the cache for later use
        cache.Put(key, _cboData);
    }
}

With a few lines of code, we reduce the load on the database server and the network, and spend more time computing.  (This simple code does assume that the Executives don’t all start at once; that assumption goes away if a master application seeds the cache before the job starts, as sketched below.)
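
Here is a minimal sketch of that seeding approach, reusing the CacheFactory/Cache calls from the snippet above; the SeedCacheAndSubmit wrapper and the SubmitJob placeholder are hypothetical, standing in for however the master application actually starts the job:

void SeedCacheAndSubmit(string connectionString, string cboDataKey) {
    // get the same named cache the Executives will use
    CacheFactory factory = new CacheFactory();
    Cache cache = factory.GetCache("CBOCache");
    // load the job data once, up front, and publish it under the agreed key
    CBOData cboData = ReadCboData(connectionString);
    cache.Put(cboDataKey, cboData);
    // now submit the job: every Executive.Start finds the data already
    // cached and skips its database read entirely
    SubmitJob(cboDataKey);
}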

Of course, this is a simple example, but there are many other use cases.  For example:

  • Digipede-enabled applications can share results;
  • master applications can load the cache with job-specific data (as sketched above); and
  • others where baking Velocity deeply into the Digipede Network starts looking pretty interesting.

I have seen many posts on “must-haves” for a Velocity RTM.  I mostly agree with the lists I have seen, and I’ll post a list of my own, mostly from the ISV perspective.

Cool stuff.


Innovation Partner of the Year Winner!

Last night, Dan and John accepted the 2007 Microsoft ISV/Software Solutions Innovation Partner of the Year Award. 

Why Digipede?  According to the press release:

The ISV/Software Solutions Innovation Partner of the Year Award recognizes an ISV that has developed an innovative, new approach to solving a new or existing business or consumer need utilizing Microsoft’s latest server or client technologies. The winning partner has demonstrated leadership in the areas of innovation, market potential, media/analyst buzz, investor value creation and customer adoption with at least three active business customers or 1,000 consumers.

Thanks, Microsoft!


Innovation Partner of the Year Finalist?

Yes.

WPC 07 Finalist

Digipede has been named a finalist (one of three) for the Microsoft Innovation Partner of the Year. 

Microsoft has a whole bunch of very innovative partners.  To be named one of the top three is quite gratifying.

Thanks, Microsoft!


2nd Microsoft ISV CTO Summit

I’m coming back from the 2nd Microsoft ISV CTO Summit up in Redmond (I blogged about the first one here).  A good trip with worthwhile content.  I’m not sure any of it was really new, but I did see some cool stuff:

Expression Blend

The tool for designers to design and build WPF projects.  Definitely cool. 

I had two questions for one of the presenters (Eric Zocher) afterwards.  First: when will Visual Studio look as good as Blend (e.g., Blend uses WPF and allows smooth scaling of its own UI)?  Answer: um, maybe never.

Second: since Blend is basically a developer tool (for designers) that can create and edit Visual Studio projects, does it integrate with TFS?  Not yet.

Of course, these answers don’t take away from Blend at all (and TFS integration will certainly come eventually, even though outsourced designers may get little value from it).

I’m no designer, but I’m looking forward to playing with it.  Though they haven’t announced this part yet, I expect it will be made available through MSDN Maximal (or whatever they are calling it now) or through our Gold Certified ISV Competency.

WPF/E

This stuff is very cool.  Scott Guthrie demoed the WPF/E Vista emulator that Savas recently linked to.  The great thing is the unification of the presentation story.  I won’t go further into the roadmap because it is never clear to me at these NDA events what is open knowledge and what requires the secret-squirrel decoder ring.

ASP.NET AJAX

I had tracked this as a really good thing (it greatly simplifies AJAX for .NET devs), but I hadn’t taken the time to look at it or the demos.  It is really cool.  Aside from all it can do, the coolest thing is how easily you can enable it for existing ASP.NET applications.  I would have tried it out already (i.e., in our Digipede product), but I stayed out too late last night to get into it.

WinFX dead?

I had a chance to ask Scott Guthrie directly whether the WinFX name change was an indication of the death of the managed Windows API (as I argued here).  His response, basically: naaah, just a marketing change.  I still disagree: as long as the managed API rides atop Win32, it isn’t the actual Windows API.  In that case, the managed API is either dead or we’re waiting for Singularity.

Swag

These events always come with some swag.  This time we got a strange floppy neoprene folder (for small laptops) and what I think is a screen-cleaning cloth (though it looks like a compressible handkerchief).

Cheers, though, to Microsoft for not giving us a bunch of junk for the landfill (I include in this category: lamps, USB speakers, travel clocks).  Also, I think it is great that they didn’t give us a whole bunch of resource CDs, trials, betas, etc.  Last time they did, and those were mostly useless: not for the content, but because we all already have that content in MSDN or through other partner programs.

They did give us one useful thing, though: a Vista Ultimate DVD/license.  Frankly, that is my kind of swag.


PowerShell for ISVs

Since my conversation with Jeffrey Snover at the ISV CTO Summit (blogged here), I have wanted to post on some benefits of adopting PowerShell as an ISV.

Many products can be greatly improved by the addition of a rich commandline interface. Obviously this enables a user to type commands at the console, but it also enables more advanced scripting of your application. The typical way an ISV provides such an interface is to develop a custom set of console applications. Alternatively, a Microsoft ISV can build on top of PowerShell (using CmdLets) and get so much more.

Here are some pros and cons of these two approaches (with a nod to J. LeRoy for using Mind Maps in posts):

[Mind map: custom console applications vs. PowerShell CmdLets]
Custom (console applications)

  • Pro, Multi-platform: works on any version of Windows (or anything else, for that matter) that you target.
  • Con, Design “from scratch”: every aspect of the commandline (e.g., argument naming and syntax, formatting) needs to be designed. As there are few clear guidelines and virtually no consistency on the Windows platform, you will have to design with a “least-worst” mindset.
  • Con, Lots of coding: you have to write a fair amount of code, even for simple use cases, to interpret and enforce your arguments and to format a report properly (see the sketch after this list).
  • Con, Advanced use cases are hard (lots more coding): enabling complex user scenarios through your commandline may require a lot of development. For example, if you already have a rich API and want to expose it through your commandline, you may need to painstakingly map new commands to existing methods.
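
To make the “lots of coding” point concrete, here is a minimal sketch of the hand-rolled parsing a custom console tool needs for even one option; the tool, its /job switch, and the output format are all hypothetical:

static int Main(string[] args) {
    // every switch name, prefix, and error message is ours to invent and enforce
    string jobId = null;
    for (int i = 0; i < args.Length; i++) {
        if (args[i] == "/job" || args[i] == "-job") {
            if (++i >= args.Length) {
                Console.Error.WriteLine("Missing value for /job");
                return 1;
            }
            jobId = args[i];
        } else {
            Console.Error.WriteLine("Unknown argument: " + args[i]);
            return 1;
        }
    }
    // ...and formatting the "report" is manual, too
    Console.WriteLine("Job\tStatus");
    Console.WriteLine(jobId + "\tRunning");
    return 0;
}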

PowerShell CmdLets

  • Con, Recent Windows only: Windows XP, Windows Server 2003, Vista, and Longhorn Server (and specifically not Windows 2000).
  • Pro, Easy design: since PowerShell includes clear guidance on CmdLet design (e.g., argument naming), it is much easier to design your entire commandline, with few design decisions left to make.
  • Pro, Object pipeline: PowerShell supports an object pipeline, which greatly eases the scripting use cases. Instead of having to make sure your command outputs text that a subsequent command can interpret, you simply return a .NET object.
  • Pro, Formatting: output formatting is not only automatic but plugs directly into the other PowerShell utilities that format output as tables or lists, sort it, etc.
  • Pro, Advanced use cases are easy: combining the object pipeline with PowerShell’s ability to script .NET objects, you get a commandline that enables advanced use cases without your having to design for them directly. For example, the Digipede API already enables advanced use cases; with PowerShell we don’t have to design in each of them. By making it easy for a PowerShell user to get an instance of our DigipedeClient object, we have enabled all (or nearly all) of the use cases our API supports (see the CmdLet sketch after this list).
  • Pro, Simpler docs: because of the PowerShell design guidelines, less documentation needs to be written to explain the basic use cases of your interface.
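
For comparison with the console sketch above, here is a minimal CmdLet written against the PowerShell (v1) Cmdlet API; Get-JobStatus and the JobStatus type are hypothetical stand-ins, not part of the Digipede API:

using System.Management.Automation;

// registered as Get-JobStatus; the verb list, parameter parsing, and help
// plumbing all come from PowerShell rather than our own design
[Cmdlet(VerbsCommon.Get, "JobStatus")]
public class GetJobStatusCommand : Cmdlet {
    // PowerShell names, parses, and validates this parameter for us
    [Parameter(Position = 0, Mandatory = true)]
    public string JobId { get; set; }

    protected override void ProcessRecord() {
        // return an object, not text: Format-Table, Sort-Object, and the
        // rest of the pipeline take it from here
        WriteObject(new JobStatus(JobId, "Running"));
    }
}

// a plain .NET object rides the pipeline; no report-formatting code needed
public class JobStatus {
    public JobStatus(string id, string state) { Id = id; State = state; }
    public string Id { get; private set; }
    public string State { get; private set; }
}

A user can then compose it like any built-in command (e.g., Get-JobStatus 42 | Format-List) with no formatting code on our side.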

Conclusion

PowerShell is a really good choice for Microsoft ISVs. It delivers a lot of value to the ISV in exchange for relatively little effort.

From my perspective, PowerShell is yet another example of how Microsoft provides a terrific platform for ISVs: we get a great command line for the Digipede Network with little effort and can focus on the core value our product brings to the market.
