Friday 19 December 2008

The best IT blog of 2008

The end of each year is a great moment for all sorts of reviews; people tend to publish rankings like "The Best of ..." in every possible category. This year I will add my two cents in the category which I find the most interesting: "The best IT blog of 2008".

I follow a few blogs as well as sites aggregating articles in particular areas (Agile Development, .Net, EPiServer). During this year I found lots of interesting and useful posts there, so choosing the best blog was difficult. My intention was to avoid nominating a blog devoted to one particular area or product. The main reason behind that decision was that people move from one project to another, from one job to another; I wanted to find a blog which can be relevant for us all the time.

Okay, let's move to the point ... in my opinion the best IT blog of 2008 is Jeff Atwood's Coding Horror!

There are a few reasons:

  1. New posts every two or three days.
  2. A vast number of comments; Jeff's posts are just the start of a discussion.
  3. And last but not least ... lots of great, thought-provoking posts ... my personal top 5:

Which blog is your "The best of 2008"?

Wednesday 17 December 2008

Fluent NHibernate - Integration Tests

In the last two posts I covered the Fluent NHibernate introduction and Conventions together with AutoPersistenceModel. In this part I would like to show how Fluent NHibernate can speed up writing integration tests, which should guarantee that the mappings are correct. If there is a problem with the mappings then that is surely something you would like to know about as fast as possible, so some sort of automated tests are necessary.

Let's check what Fluent NHibernate has to offer:

1) The SessionSource class, which helps to deal with NHibernate configuration, session management and recreating the database schema. Why is it useful? To check whether the mappings are correct you don't need any test data; you only need a clean schema and a database to connect to. You can use SQLite to create an in-memory database. This is a neat and robust solution and, moreover, it eliminates an external dependency -- we don't need an external database any more. Check the example:



[SetUp]
public void SetUp()
{
    // create and configure the persistence model, including changes to the Conventions
    var persistenceModel = new PersistenceModel();
    persistenceModel.Conventions.GetTableName =
        type =>
        String.Format("[{0}.{1}]", type.Namespace.Substring(type.Namespace.LastIndexOf('.') + 1), type.Name);
    persistenceModel.Conventions.GetPrimaryKeyNameFromType = type => type.Name + "ID";
    persistenceModel.Conventions.GetForeignKeyNameOfParent = type => type.Name + "ID";
    persistenceModel.Conventions.GetForeignKeyName = prop => prop.Name + "ID";

    // add mappings
    persistenceModel.addMappingsFromAssembly(typeof (Product).Assembly);

    // configure NHibernate to use an in-memory SQLite database
    var config = new SQLiteConfiguration()
        .InMemory()
        .ShowSql();

    sessionSource = new SessionSource(config.ToProperties(), persistenceModel);

    // create the NHibernate session
    session = sessionSource.CreateSession();

    // recreate the schema
    sessionSource.BuildSchema(session);
}


Here is the implementation of the BuildSchema(...) method:



public void BuildSchema(ISession session)
{
    IDbConnection connection = session.Connection;

    string[] drops = _configuration.GenerateDropSchemaScript(_dialect);
    executeScripts(drops, connection);

    string[] scripts = _configuration.GenerateSchemaCreationScript(_dialect);
    executeScripts(scripts, connection);
}


As you can see, it uses NHibernate methods to generate the DDL for dropping and creating tables for the selected dialect (SQLite in this case). One thing worth noticing is that the DDL will only be as accurate as the mappings. So if you skip some information (like nullable fields), don't expect to see it there!
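
If you are curious what exactly will be generated, you can dump the scripts yourself -- a minimal sketch, assuming access to the same _configuration and _dialect objects used in BuildSchema(...) above:

string[] createScripts = _configuration.GenerateSchemaCreationScript(_dialect);
foreach (string statement in createScripts)
{
    // inspect how faithfully the mappings were translated into DDL
    Console.WriteLine(statement);
}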

2) Thanks to the SessionSource class we can query the in-memory database and we have access to the NHibernate session ... it's time to check our mappings, and we can use the PersistenceSpecification class to do that:



[Test]
public void ProductReviewTest()
{
    new PersistenceSpecification<ProductReview>(session)
        .CheckProperty(x => x.Comments, "some nice comment")
        .CheckProperty(x => x.EmailAddress, "test@test.com")
        .CheckProperty(x => x.ModifiedDate, DateTime.Today)
        .CheckProperty(x => x.Rating, 4)
        .CheckProperty(x => x.ReviewDate, DateTime.Today)
        .CheckProperty(x => x.ReviewerName, "test name")
        .CheckReference(x => x.Product, CreateNewProduct())
        .VerifyTheMappings();
}

Under the hood the PersistenceSpecification class saves the object (ProductReview) to the database and then, using another connection, fetches the object back to make sure that all properties have the correct values set. It's not a revolution, but it can certainly save lots of time, and thanks to the neat and readable code it will increase maintainability.
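
For comparison, here is roughly what such a round-trip test looks like when written by hand -- a sketch only (it reuses one session and forces a reload with Clear() instead of opening a second connection like PersistenceSpecification does):

[Test]
public void ProductReviewRoundTripByHand()
{
    // save the object...
    var review = new ProductReview { Comments = "some nice comment", Rating = 4 };
    object id = session.Save(review);
    session.Flush();
    session.Clear(); // make sure the next load hits the database

    // ...fetch it back and compare property by property
    var fetched = session.Get<ProductReview>(id);
    Assert.AreEqual("some nice comment", fetched.Comments);
    Assert.AreEqual(4, fetched.Rating);
}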

Source Code

As always you can download the source code and the whole sample web application. You will find a separate project for the tests there. I have created an AbstractTestBase class which is responsible for configuration and also exposes the NHibernate session. There are also tests for the mappings, which use the exposed NHibernate session, and at least on my machine all tests pass ;)
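
The base class is not reproduced here, but it boils down to something like this sketch (the names are assumptions -- check the download for the real thing):

public abstract class AbstractTestBase
{
    protected SessionSource sessionSource;
    protected ISession session;

    [SetUp]
    public void SetUpNHibernate()
    {
        // same steps as the SetUp shown earlier:
        // conventions, mappings, SQLite in-memory database
        var persistenceModel = new PersistenceModel();
        persistenceModel.addMappingsFromAssembly(typeof(Product).Assembly);

        var config = new SQLiteConfiguration().InMemory().ShowSql();
        sessionSource = new SessionSource(config.ToProperties(), persistenceModel);

        session = sessionSource.CreateSession();
        sessionSource.BuildSchema(session);
    }
}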


(EDIT: Examples in this post have been updated on 8.02.2009 to reflect changes in Fluent NHibernate API)


Monday 15 December 2008

Fluent NHibernate - Conventions and AutoPersistenceModel

Last time I introduced Fluent NHibernate; this time I would like to move further and show you how to use Conventions and AutoPersistenceModel.

But before we move to the point, let's take a quick look at the things that have changed in our sample web application:

1) Global.asax

The NHibernate SessionFactory is created only once, on application start; then it is reused to create Sessions for each request. This approach allows us to use the NHibernate Session in the code-behind of ASP.NET pages.
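
A minimal sketch of what that can look like (the helper names here are my own; the sample application has the real implementation):

public class Global : HttpApplication
{
    // expensive to build -- created exactly once
    public static ISessionFactory SessionFactory;

    protected void Application_Start(object sender, EventArgs e)
    {
        Configuration config = MsSqlConfiguration.MsSql2005
            .ConnectionString.Is(connectionString) // see the previous post
            .ConfigureProperties(new Configuration());
        config.AddMappingsFromAssembly(Assembly.GetExecutingAssembly());

        SessionFactory = config.BuildSessionFactory();
    }

    // cheap -- one session per request
    public static ISession CreateSession()
    {
        return SessionFactory.OpenSession();
    }
}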

2) To demonstrate more interesting things I had to add one more table, so now we are playing with two tables:


I am, of course, still using the AdventureWorks database. With the new table and the relation between those two tables, our mappings have changed:

  • Product:


public class ProductMap : ClassMap<Product>
{
    public ProductMap()
    {
        WithTable("Production.Product");
        Id(x => x.Id, "ProductID");

        Map(x => x.SellEndDate);
        // ... remaining property mappings skipped ...
        Map(x => x.ModifiedDate).Not.Nullable();

        HasMany<ProductReview>(x => x.ProductReview).WithKeyColumn("ProductID").AsBag().Inverse();
    }
}

  • ProductReview:



public class ProductReviewMap : ClassMap<ProductReview>
{
    public ProductReviewMap()
    {
        WithTable("Production.ProductReview");

        Id(x => x.Id, "ProductReviewID");

        Map(x => x.Comments).WithLengthOf(3850);
        Map(x => x.EmailAddress).Not.Nullable().WithLengthOf(50);
        Map(x => x.Rating).Not.Nullable();
        Map(x => x.ReviewDate).Not.Nullable();
        Map(x => x.ReviewerName).Not.Nullable().WithLengthOf(50);
        Map(x => x.ModifiedDate).Not.Nullable();

        References(x => x.Product, "ProductID");
    }
}


Please note that I have skipped some irrelevant parts of the mappings, but you can download the sample project to get the source code. A bit of clarification:
  • A Product can have zero or more reviews. From the code point of view, the Product class has an additional IList property (see the sketch right below).
  • A review is about a product, therefore there is a not-null foreign key in the ProductReview table.
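
On the code side the relation is just one extra collection property on Product -- a sketch, with the property named exactly as in the mapping above:

public class Product
{
    public virtual int Id { get; set; }
    // ... other properties as before ...

    // 0..n reviews, mapped with HasMany(...).AsBag().Inverse()
    public virtual IList<ProductReview> ProductReview { get; set; }
}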


Conventions

Now we can move to the point ... as you can see in the mappings above, in some places it's required to specify a column or table name. This is because the AdventureWorks database doesn't follow the default convention (check the Convention over Configuration design pattern). The differences:
  • the table name is different from the class name (Product vs Production.Product)
  • the id property has a different name than the primary key column (Id vs ProductID)
  • properties representing links between tables have different names than the foreign key columns (Product vs ProductID)
Luckily for us we don't have to repeat the same changes in all our mappings ... we can change the default convention ... here is how it can be done:



var models = new PersistenceModel();

// table name = "Production." + class name
models.Conventions.GetTableName = type => String.Format("{0}.{1}", "Production", type.Name);

// primary key = class name + "ID"
models.Conventions.GetPrimaryKeyNameFromType = type => type.Name + "ID";

// foreign key column name = class name + "ID"
//
// it will be used to set the key column in an example like this:
//
// <bag name="ProductReview" inverse="true">
//   <key column="ProductID" />
//   <one-to-many class="AdventureWorksPlayground...ProductReview, AdventureWorksPlayground, ..." />
// </bag>
models.Conventions.GetForeignKeyNameOfParent = type => type.Name + "ID";

// foreign key column name = property name + "ID"
//
// it will be used in a case like this:
// <many-to-one name="Product" column="ProductID" />
models.Conventions.GetForeignKeyName = prop => prop.Name + "ID";

models.addMappingsFromAssembly(typeof(Product).Assembly);
models.Configure(config);


and our mapping can be simplified to this:



public class ProductReviewMap : ClassMap<ProductReview>
{
    public ProductReviewMap()
    {
        Id(x => x.Id);

        Map(x => x.Comments).WithLengthOf(3850);
        Map(x => x.EmailAddress).Not.Nullable().WithLengthOf(50);
        Map(x => x.Rating).Not.Nullable();
        Map(x => x.ReviewDate).Not.Nullable();
        Map(x => x.ReviewerName).Not.Nullable().WithLengthOf(50);
        Map(x => x.ModifiedDate).Not.Nullable();

        References(x => x.Product);
    }
}

EDIT: The API for conventions has since been changed completely, therefore the code above is no longer valid; you can find an update in this post - Conventions After Rewrite


AutoPersistenceModel

In fact ... there is not much left in our mappings ... you certainly won't find anything particularly creative there, so why not get rid of them completely? Yes ... it's possible ... if everything is 100% in accordance with the convention then you can simply use the following code to get NHibernate configured:



var models = AutoPersistenceModel
    .MapEntitiesFromAssemblyOf<ProductReview>()
    .Where(t => t.Namespace == "AdventureWorksPlayground.Domain.Production");

models.Conventions.GetTableName = type => String.Format("{0}.{1}", "Production", type.Name);
models.Conventions.GetPrimaryKeyNameFromType = type => type.Name + "ID";
models.Conventions.GetForeignKeyNameOfParent = type => type.Name + "ID";
models.Conventions.GetForeignKeyName = prop => prop.Name + "ID";

models.Configure(config);


And that is all you need ... a few POCO objects representing database tables, AutoPersistenceModel, and you are ready to go. It certainly lets you start development very fast, but what worries me is that there is no way to say that some properties are mandatory or have a length limit. Specifying that additional data may help you discover data-related problems faster and, moreover, it should increase the performance of NHibernate ... but is it worth it? What do YOU think?

(EDIT: Examples in this post have been updated on 6.02.2009 to reflect changes in Fluent NHibernate API)


Thursday 11 December 2008

Fluent NHibernate introduction and quick start guide

I'm sure that lots of people are familiar with NHibernate and have had a chance to work with it. I had such a chance and I have to admit that it was a good time. But there are certain things which I hate about NHibernate, like ... the XML files, configuration, mappings ... tons of XML has to be produced to get it running.

Right, it's true that once this is done you don't have to go back there very often ... unless you want to do some refactoring. If you would like to rename some classes, move them to a different namespace or maybe rename some of the properties, then you are in big trouble ... Visual Studio and ReSharper will help you with refactoring your code, but you have to maintain the XML files on your own.

Introduction to Fluent NHibernate

Apparently someone finally got pissed off and decided to change this ... the goal was to get rid of all those maddening XML files. In other words, the idea was to replace this:


<?xml version="1.0" encoding="utf-8" ?>
<hibernate-mapping xmlns="urn:nhibernate-mapping-2.2"
    namespace="Eg" assembly="Eg">

    <class name="Customer" table="Customers">
        <id name="ID">
            <generator class="identity" />
        </id>

        <property name="Name" />
        <property name="Credit" />

        <bag name="Products" table="Products">
            <key column="CustomerID"/>
            <one-to-many class="Eg.Product, Eg"/>
        </bag>

        <component name="Address" class="Eg.Address, Eg">
            <property name="AddressLine1" />
            <property name="AddressLine2" />
            <property name="CityName" />
            <property name="CountryName" />
        </component>
    </class>
</hibernate-mapping>


with that:


public class CustomerMap : ClassMap<Customer>
{
    public CustomerMap()
    {
        Id(x => x.ID);
        Map(x => x.Name);
        Map(x => x.Credit);
        HasMany<Product>(x => x.Products)
            .AsBag();
        Component<Address>(x => x.Address, m =>
            {
                m.Map(x => x.AddressLine1);
                m.Map(x => x.AddressLine2);
                m.Map(x => x.CityName);
                m.Map(x => x.CountryName);
            });
    }
}

(example from James Gregory's blog)

In this form you can refactor your code as much as you like; it won't break your mappings. Your application is much more readable and maintainable. Thanks to Fluent NHibernate (FN) you can forget about XML files.

Quick Start Guide

We are still waiting for the first official release of Fluent NHibernate (FN), so at the moment the only option to get it is to use an SVN client, check out the project and build the dll on your own.

Once you have FluentNHibernate.dll you can simply add it to your project and start using it.

To make it easier I have prepared a very basic web application which has FN set up; you can download it here. It uses the popular AdventureWorks database.

Steps to get FN working:
  1. First of all you need to configure NHibernate, which includes both the general configuration (connection string, dialect, driver etc.) and the mappings. In terms of the general configuration, you have two options:
    • configure NHibernate without any XML files:


// of course, in real life the connection string should be externalized
const string connectionString =
    @"Data Source=MAREK-PC\SQLEXPRESS;Database=AdventureWorks;User Id=marek;Password=marek;Network Library=DBMSSOCN;Max Pool Size=400;";

// configure NHibernate
Configuration config = MsSqlConfiguration.MsSql2005
    .ConnectionString.Is(connectionString)
    .UseReflectionOptimizer()
    .ShowSql()
    .ConfigureProperties(new Configuration());

// load mappings from this assembly
config.AddMappingsFromAssembly(Assembly.GetExecutingAssembly());

// build the factory
ISessionFactory sessionfactory = config.BuildSessionFactory();


    • or you can decide to keep hibernate.cfg.xml:


// read hibernate.cfg.xml
Configuration config = new Configuration().Configure();

// load mappings from this assembly
config.AddMappingsFromAssembly(Assembly.GetExecutingAssembly());

// build the factory
ISessionFactory sessionfactory = config.BuildSessionFactory();


  2. Mappings: in this case my goal was to keep the example as simple as possible and elaborate on it in the future, therefore for now only one table is mapped:
    • this is the POCO object representing a product:


public class Product
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
    public virtual string ProductNumber { get; set; }
    public virtual bool MakeFlag { get; set; }
    public virtual bool FinishedGoodsFlag { get; set; }
    public virtual string Color { get; set; }
    public virtual int SafetyStockLevel { get; set; }
    public virtual int ReorderPoint { get; set; }
    public virtual int StandardCost { get; set; }
    public virtual int ListPrice { get; set; }
    public virtual string Size { get; set; }
    public virtual int DaysToManufacture { get; set; }
    public virtual string ProductLine { get; set; }
    public virtual string Class { get; set; }
    public virtual string Style { get; set; }
    public virtual DateTime SellStartDate { get; set; }
    public virtual DateTime SellEndDate { get; set; }
    public virtual DateTime ModifiedDate { get; set; }
}
    • and mapping:


public ProductMap()
{
    // the table name is different from the class name
    WithTable("Production.Product");

    Id(x => x.Id, "ProductID");

    Map(x => x.SellEndDate);
    Map(x => x.ReorderPoint);

    Map(x => x.Name).WithLengthOf(50).Not.Nullable();
    Map(x => x.ProductNumber).WithLengthOf(25).Not.Nullable();
    Map(x => x.MakeFlag).Not.Nullable();
    Map(x => x.FinishedGoodsFlag).Not.Nullable();
    Map(x => x.Color).WithLengthOf(15);
    Map(x => x.SafetyStockLevel).Not.Nullable();
    Map(x => x.StandardCost).CanNotBeNull().CustomSqlTypeIs("money");
    Map(x => x.ListPrice).CanNotBeNull().CustomSqlTypeIs("money");
    Map(x => x.Size).WithLengthOf(5).Not.Nullable();
    Map(x => x.DaysToManufacture).Not.Nullable();
    Map(x => x.ProductLine).WithLengthOf(2);
    Map(x => x.Class).WithLengthOf(2);
    Map(x => x.Style).WithLengthOf(2);
    Map(x => x.SellStartDate).Not.Nullable();
    Map(x => x.ModifiedDate).Not.Nullable();
}

      In this case the table name is different from the class name, which is why I had to specify it, but in general FN assumes that the class name equals the table name. The same approach is applied to properties and columns. Again, you can specify that a column has a different name than the corresponding property if necessary. It's also not required to specify the generator type for the primary key column; FN checks the property type and uses a default one.
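
      For instance, if a single column deviates from the convention, you can override just that column name -- a sketch, assuming Map(...) accepts a column name the same way Id(...) does above (ColorCode is a made-up column name):

      // property Color maps to column "Color" by default;
      // a deviant column name can be given explicitly
      Map(x => x.Color, "ColorCode");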

  3. Testability - a significant advantage is that you don't have to remember to keep XML files up-to-date with your code and database schema, but FN also offers a number of ways to facilitate the creation of unit tests and integration tests. I will cover that in the next post. (edit: post regarding integration tests)

I encourage you to keep an eye on this project. In the meantime you can start playing with Fluent NHibernate by checking out the sample web application.

You can download the AdventureWorks database from CodePlex.

(EDIT: Examples in this post have been updated on 8.02.2009 to reflect changes in Fluent NHibernate API)


Thursday 4 December 2008

The Story of the Ribbon

Yesterday I had a chance to attend the 14th Poznan .Net User Group meeting. One of the guests was Mariusz Jarzębowski from Microsoft with his presentation "The Story of the Ribbon".


Microsoft Word 2007 and the Ribbon user interface

This excellent presentation was delivered for the first time during MIX 2008 by Jensen Harris. (Edit: the presentation we saw was inspired by both Jensen Harris' blog post and his presentation during MIX08.) Thanks to Mariusz we got a chance to see it live and "experience" the evolution of the Microsoft Office user interface (UI) from the very beginning - version 1.0 - to the latest one - Office 2007. I think it's safe to say nowadays that the Ribbon started a new era in the UI design world. Mariusz gave us lots of insights into the process and decisions which have changed the way we think about UI. Microsoft invested years of work to check a number of concepts, build prototypes and analyse data ... very detailed data.

For instance, there are at least three ways to save a document in MS Word:
  1. Ctrl+S,
  2. the Save button on the toolbar, and
  3. the menu option File->Save.

Based on the gathered data Microsoft realized that roughly 80% of people use the toolbar to save documents, 20% use Ctrl+S and almost no one uses the File->Save option. Even more interestingly, the same rule applies to other commands, hence they could get rid of the old-fashioned menu as people didn't use it anyway!

Even though the numbers (the exact number of clicks for each option) were convincing, changing the approach and switching from an ordinary menu to the Ribbon was still a risky decision. Microsoft took that risk and released the Ribbon, which in my opinion (and not only mine) makes them one of the most innovative companies in the IT industry.

Mariusz also presented some other cool and relatively new concepts/features like:
  • Deep Zoom, which allows users to explore collections of (usually) high-resolution pictures without downloading them all at once to the client. Microsoft released Deep Zoom Composer, which users can use to create presentations of images and then preview them with Silverlight 2 and the Deep Zoom feature. Check the Seadragon project for more details and demos. Also check the Hard Rock website for a great example of Deep Zoom in action.
  • Photosynth is yet another cool concept, which allows users to generate a 3D model based on pictures taken with any ordinary digital camera. Great stuff and very spectacular; it also uses Deep Zoom to enrich the user experience. Check this presentation by Blaise Aguera y Arcas showing Deep Zoom and Photosynth.



  • pptPlex - Deep Zoom can also be used to change your PowerPoint presentations; this plug-in enables that feature.

For me this meeting was very interesting and I hope that Mariusz will visit us again at some point in the future. In the meantime let's enjoy all these new tools and keep our eyes open for the next innovative ideas, as I'm sure that they will come.

Monday 1 December 2008

Problems with EPiServer 5 R2 and MS Vista

Here are a few problems which I encountered while installing the latest EPiServer 5 R2 on MS Vista. It's nothing major, but hopefully it will help someone.

So what can possibly go wrong?

1) In my case the whole installation went smoothly; the first problem came out when I was expecting to see the normal start page of the public templates. The error message was:

Compiler Error Message: CS0016: Could not write to output file 'c:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\xxxxxx\xxxx


Luckily the problem wasn't difficult to solve; it was a matter of permissions ... a detailed solution can be found here.


2) The second problem was a bit strange -- I couldn't access the edit and admin modes even though I was one hundred percent sure that the credentials were okay and I was able to log in.

If you have played with EPiServer before then you probably know that only certain roles can access the edit/admin modes, and that this is configured in web.config. By default WebAdmins and Administrators can access both admin and edit modes; for WebEditors only edit mode is accessible. So the obvious thing to check was whether my user belonged to any of those groups.

Surprisingly, I didn't have a group called Administrators on my Vista ... why? Because I use a localized version of Vista and the Administrators group name is localized; in my case it is called "Administratorzy" :)

The solution in such a case is to modify the web.config by adding/updating the Administrators group name.
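
For example, something along these lines (an illustrative snippet -- the location path and the exact role list depend on your installation):

<location path="MySite/UI/admin">
  <system.web>
    <authorization>
      <!-- add the localized group next to the default ones -->
      <allow roles="WebAdmins, Administrators, Administratorzy" />
      <deny users="*" />
    </authorization>
  </system.web>
</location>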

And that was it ... EPiServer 5 R2 was running fine :)

3) In the end I wanted to open the Public Templates project to take a look whether everything is as it used to be ... and yet another surprise:



Well, unfortunately Windows Vista Home Premium can't have Windows Authentication installed. It's a big shame and it was stopping Visual Studio from opening the project. I just wanted to take a look; I didn't need to debug this application on IIS, but still Visual Studio was determined to keep the project closed.

The fastest workaround is to open the project file in Notepad and modify the UseIIS property:



<WebProjectProperties>
  <UseIIS>True</UseIIS>
  <AutoAssignPort>True</AutoAssignPort>
  <DevelopmentServerPort>50713</DevelopmentServerPort>
  <DevelopmentServerVPath>/</DevelopmentServerVPath>
  <IISUrl>http://localhost/</IISUrl>
  <NTLMAuthentication>False</NTLMAuthentication>
  <UseCustomServer>False</UseCustomServer>
  <CustomServerUrl></CustomServerUrl>
  <SaveServerSettingsInUserFile>False</SaveServerSettingsInUserFile>
</WebProjectProperties>



If you set the UseIIS property to False, Visual Studio won't complain about the missing Windows Authentication module anymore. Easy?

Wednesday 3 September 2008

Edit Page - Shortcut/External link tab


The Shortcut/External link tab in edit mode is a useful thing. You can create different types of links there; you can also specify a target frame, which allows you, for example, to open external links in a new window. It all sounds great, but do you really use it and support the target frame property in your code?

I have to admit that my code hasn't supported that. Luckily for me this has changed recently, but to be honest EPiServer doesn't have the best support for these properties. If you think that you will find some dedicated methods/properties to access the selected values programmatically then I will save you some time ... you won't!

On the other hand, all you have to do is get the properties called 'PageShortcutType' and 'PageTargetFrame', which is not rocket science assuming you have a vague idea that those two properties exist.

So here is a short code snippet which I hope will save you some time in the future:



// get the shortcut type
PropertyLinkType linkType = (PropertyLinkType) pageData.Property["PageShortcutType"];

// get the target frame
PropertyFrame targetFrame = (PropertyFrame) pageData.Property["PageTargetFrame"];

// use the target frame only if we are dealing with an external link
if (LinkType.External.Equals(linkType))
{
    linkTitle.Target = targetFrame.FrameName;
}

Sunday 31 August 2008

ASP.NET and jQuery = a powerful combination

Over the last few years a number of interesting JavaScript frameworks have emerged. My reaction to JavaScript used to be quite allergic; for me there was no such thing as maintainable JavaScript code. But apparently things have changed drastically. At the moment we can choose from many different frameworks:

All of them are pretty mature, well documented and definitely worthwhile.

Recently I was forced to play with jQuery a bit ... believe me, we tried everything to avoid it, but in the end we gave up. Why? Partly because we couldn't replace jQuery with ASP.NET AJAX in an efficient way, and partly because after a profound investigation I changed my mind about JavaScript. In fact I had one of those 'Ah-ha!' moments ... I was really amazed how many cool things you can do with just a few lines of code.

I don't pretend to be a jQuery expert, but let me show you a simple example, a simple use case which I used as an opportunity to learn more about jQuery.

Okay, let's start ... we want to create a personal contact list. The application will list all people with a telephone number and some additional details. An important part of this use case is the user experience; our aim is to use AJAX calls as much as we can in order to avoid reloading the whole page. Let's say that this is how the application will look:



I have used one of WUFOO's CSS themes, and that is the only reason why this application doesn't look so bad ... UI and CSS design is not my strong side ;)

Anyway, the interesting part now ... after clicking on details, the table should expand and we should get some additional information about the selected person. The expandable table should be implemented using AJAX calls. The final view should look like this:



Assumptions

  • Normally, to create an application like this, you need some way to persist data; it can be anything ... a database, an XML file, a text file or anything else. In this case our data will be hardcoded (check the source code) as this post is focused on the user interface.
  • A number of controls can be used to generate a table like this ... personally I would use repeaters to do the job, but other options are also good and again ... that's not the point.

jQuery magic

JavaScript in this example is responsible for:
  • making sure that when a 'Details' link is clicked our function will be invoked,
  • getting all the required details for the selected person with an AJAX call and showing them to the user.

All of that can be done with this piece of JavaScript:



$(document).ready(function () {
    // find our 'Details' links and bind the click event to our function
    $(".aDetails").click(function (e) {

        // we will use this link to get the contactID
        var link = $(this);

        // find the parent row
        var row = link.parent('td').parent('tr');

        // remove focus from the link
        link.blur();
        e.preventDefault();

        // remove selection from other rows
        $(".on").removeClass("on").next('tr.detailsPanel').remove();

        // select this row
        row.addClass("on");

        // get the details - ajax call
        $.ajax({
            url: applicationPath + "/utils/ContactDetails.aspx?contactID=" + link.attr("contactID"),
            success: function(html){
                row.after(html);
            }
        });
    });
});

The in-code comments should give you an idea of what is going on ... if something is still not clear then:
  • check how the HTML is structured (there is a sketch right after this list); it should help you understand what exactly functions like next('tr.detailsPanel') are doing,
  • and of course check the jQuery documentation.
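
To give you an idea, the markup produced by the repeater looks more or less like this (a simplified sketch -- only the class names and the contactID attribute match the script above; the rest is made up):

<table>
  <tr>
    <td>John Doe</td>
    <td>555-0123</td>
    <td><a href="#" class="aDetails" contactID="1">Details</a></td>
  </tr>
  <!-- the row inserted by row.after(html) on a successful AJAX call: -->
  <tr class="detailsPanel">
    <td colspan="3">... additional details ...</td>
  </tr>
</table>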

Why isn't ASP.NET AJAX a good alternative?

It's not a big challenge to get the same results in terms of user experience with ASP.NET AJAX; the simplest way is to put the control responsible for rendering the table into an UpdatePanel like this:



<asp:UpdatePanel ID="UpdatePanel1" runat="server">
  <ContentTemplate>
    <asp:Repeater ID="Repeater1" runat="server">
      ....
    </asp:Repeater>
  </ContentTemplate>
</asp:UpdatePanel>


It works, but in fact each AJAX call refreshes the whole table ... and in this case it means that almost the whole page is refreshed instead of one row. As long as the number of records is small it's fine ... but if you have a huge number of records it might turn out to be a problem. I haven't found any sensible way to update just one row ... maybe you can suggest something?

I hope that this little application shows that JavaScript is back in the first league. Developers should be familiar with at least one JavaScript framework ... pick the one you like. Frameworks like jQuery provide many flexible and powerful features which can help enhance your sites.

My plan is to continue enhancing this application, and to keep learning and experimenting ... stay tuned, as the next parts of my adventures with jQuery will show up. The source code for this part can be downloaded from here.

Saturday 23 August 2008

Scrum - why extending the sprint (iteration) length is usually not a good idea

I have to admit -- I'm a big fan of short iterations, and I have plenty of reasons for it! But what does short mean? In most cases a 2-week iteration is a good start. You should consider shorter iterations only if the rule of having at least four or five iterations within a project would otherwise be broken; hence any project shorter than 8-10 weeks is a potential candidate. On the other hand, big projects (6 months and more) may consider 3-week iterations ... but only ... I repeat ONLY ... when you are sure that Scrum, and the agile software development process in general, works for you and you are happy with the results.

How can you know that Scrum works for your team?

One thing has to be clear -- you don't have to use pure Scrum to make it work for you. Sometimes applying only some aspects of agile software development is the best approach, and that doesn't make it any less agile. The fundamental thing about being agile is improving the process, so get rid of things that don't work, experiment with new ideas and learn!
It's often the case that people feel that there is something wrong but they can't really say what exactly, therefore I highly recommend watching this presentation: "10 Ways to Screw Up with Scrum and XP". You can also find more materials on Henrik Kniberg's blog. Check whether you find your problems on this list; check whether you follow the fundamental Scrum rules.

Why short iterations are better?

I stated before that I have plenty of arguments ... so here is the list:

  1. The most important one - short iterations allow you to find problems early. Always look for problems and remember the rule: a visible problem is a killable problem. If your team is aware of a problem then there is a good chance that they can cope with it. Treat problems as opportunities for improvement.
  2. With short iterations, new ideas and improvements can be applied more quickly. After each iteration you should have a retrospective meeting to identify problems and figure out how to improve the process for the next iteration. Short iterations mean that you can improve more frequently.
  3. About certain problems you can learn only after a full iteration ... for instance, a common problem is that people don't work as a team ... each person has their own task and at the beginning of an iteration all stories are in progress. It sounds good, as it means that the whole team is working and moving forward, but actually it's not, because you can't really say how many user stories will still be in progress at the end of the iteration. In the worst case all stories are 90% done by the end of the iteration, which means that from the client's perspective the iteration didn't add any new functionality, any releasable code, any new value.
  4. Estimations and velocity - at the beginning of a project, how can you provide reliable estimations? How do you know how long it will take to deliver the whole system? At the beginning you don't know much about the team's performance, and you don't know what the cooperation between the team and the client will look like. So you probably can't provide any reliable estimations ... you can only guess. But the first few iterations should give you a good idea of the team's velocity, which should result in much more reliable estimations ... now you can figure out what can be done in the next two, five or ten months. Sounds like a useful thing? (See the worked example right below.)
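
A quick worked example: if after three iterations the team's velocity settles at around 20 story points per 2-week iteration, and the remaining backlog is estimated at 200 points, you can reasonably project about 10 more iterations, i.e. roughly 20 weeks -- a number you could not have defended on day one.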

Are you convinced? I hope you are ... if not ... then challenge me and post your arguments! Maybe you can persuade me that I'm wrong :)


Sunday 17 August 2008

EPiServer, MultipageProperty -- don't use SelectedPages property!

I don't know how it works for you, but I can't imagine life without the MultipageProperty. I use it in most of our projects and that is great, because I love the flexibility it offers. Recently, while checking the MultipageProperty source code, I found something worrying -- if you check how the PropertyMultiPage.SelectedPages property is implemented, you will find that the following method is invoked for each internal page:



private PageData GetInternalPage(PageData pd, MultipageLinkItem multipageLinkItem)
{
    // make sure the pagedata object is writeable
    pd = pd.CreateWritableClone();

    // We must have detected a change, and have a link text
    if (multipageLinkItem.EditorHasChangedLinkText == true &&
        string.IsNullOrEmpty(multipageLinkItem.LinkText) == false &&
        multipageLinkItem.LinkText.Trim().Length > 0)
    {
        AddPropertyHelper(pd, "PageName", new PropertyString(multipageLinkItem.LinkText));
        // We should also be able to test if the link text has been changed
        // from the outside, so we add a flag that indicates that the PageName
        // has changed
        AddPropertyHelper(pd, "LinkTextHasChanged", new PropertyBoolean(true));
    }

    PropertyFrame frameProp = new PropertyFrame();
    frameProp.FrameName = multipageLinkItem.Target;
    AddPropertyHelper(pd, "PageTargetFrame", frameProp);

    AddPropertyHelper(pd, "PageLinkToolTip", new PropertyString(multipageLinkItem.Tooltip));

    // It needs to have published status, or it may be filtered away
    AddPropertyHelper(pd, "PageWorkStatus", new PropertyNumber((int)VersionStatus.Published));

    // It should look like something we just fetched
    pd.IsModified = false;

    return pd;
}



What is the problem with that method? The problem is here:


pd = pd.CreateWritableClone();


Do we really need to create a writable version of the PageData object for each page? Well, as you can see in the code above, each page is "extended" with additional properties, which you may know from the MultipageProperty's dialog. Presumably, without creating a writable version of the page, this "extending" would not be possible. It's of course quite a cool feature, but you have to be aware that there are performance consequences of calling the CreateWritableClone() method.

So what should I do?

I'm far from saying that we should rewrite this piece of functionality or stop using the MultipageProperty altogether ... that would be crazy!
But I would like to suggest using the SelectedLinkItems property instead of SelectedPages. The PropertyMultiPage.SelectedLinkItems property returns a MultipageLinkItemCollection object, and a single MultipageLinkItem object contains all the necessary page-related data, including the URL, link target, link text etc. (see the sketch below).
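
A minimal sketch of that approach (propertyMultiPage and placeholder are assumed to come from the surrounding page or control), rendering links without loading or cloning any PageData objects:

// propertyMultiPage is the PropertyMultiPage instance from the current page
MultipageLinkItemCollection linkItems = propertyMultiPage.SelectedLinkItems;

foreach (MultipageLinkItem item in linkItems)
{
    // Url, Target and LinkText are carried by the link item itself,
    // so no CreateWritableClone() call is needed
    HyperLink link = new HyperLink();
    link.NavigateUrl = item.Url;
    link.Target = item.Target;
    link.Text = item.LinkText;
    placeholder.Controls.Add(link);
}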

Take a look at the PropertyMultiPage.SelectedPages property implementation to see how you can use the MultipageLinkItem object to get everything you need:



PageDataCollection SelectedPages
{
    get
    {
        ....

        // All link items
        MultipageLinkItemCollection linkItems = this.SelectedLinkItems;

        foreach (MultipageLinkItem multipageLinkItem in linkItems)
        {
            PageReference pageref = PageReference.ParseUrl(multipageLinkItem.Url);
            PageData page = null;

            // ParseUrl also work for fully qualified urls (http://...) which
            // we will never have for our own pages. To qualify as an internal
            // EPiServer page, the parse must be successful, and the url must
            // start with "/". If we cannot load these pages, they will be
            // removed from the collection altogether.
            if ( ! PageReference.IsNullOrEmpty(pageref) && multipageLinkItem.Url.StartsWith("/"))
            {
                // get the page with error handling for
                // access denied or deleted page
                try
                {
                    // Could be language sensitive
                    if (EPiServer.Configuration.Settings.Instance.UIShowGlobalizationUserInterface)
                    {
                        // First we check if we have a specific language to load
                        if (string.IsNullOrEmpty(multipageLinkItem.LanguageId) == false)
                        {
                            // Load page, with specific language
                            page = EPiServer.DataFactory.Instance.GetPage(
                                pageref, new LanguageSelector(multipageLinkItem.LanguageId));
                        }
                        else
                        {
                            // Load page, with master language fallback
                            page = EPiServer.DataFactory.Instance.GetPage(
                                pageref, LanguageSelector.AutoDetect(true /* enableMasterLanguageFallback */));
                        }
                    }
                    else
                    {
                        page = DataFactory.Instance.GetPage(pageref);
                    }
                }
                catch (PageNotFoundException notFoundEx)
                {
                    ...
                }
            }

            ....
        }
        return _selectedPages;
    }
    set
    {
        _selectedPages = value;
    }
}


I hope this gives you a good idea of how to use the MultipageLinkItem class and how to improve the performance of your applications. Maybe it will also inspire someone to spend some spare time tweaking the PropertyMultiPage.SelectedPages implementation so that it does not use the CreateWritableClone() method.

Sunday 10 August 2008

Mary Poppendieck -- The role of leadership in software development

Recently I keep finding lots of interesting material about team management. This time a talk by Mary Poppendieck, "The role of leadership in software development", came to my attention. I found it on Google's Tech Talks channel. But what is it all about?

When you look around, there are a lot of leaders recommended for software development. We have the functional manager and the project manager, the scrum master and the black belt, the product owner and the customer-on-site, the technical leader and the architect, the product manager and the chief engineer.

Clearly that's too many leaders. So how many leaders should there be, what should they do, what shouldn't they do, and what skills do they need?

This will be a presentation and discussion of leadership roles in software development -- what works, what doesn't and why.


For me that introduction was interesting enough to spend an hour and a half watching the talk, and in the end I have to admit that it was absolutely worth it, therefore I recommend you do the same!




My main take-aways are:

  • a general overview of how the concept of leadership evolved from 1850 to the present day. I was surprised how closely it was connected with the army and how many important breakthroughs were triggered by wars.
  • what really makes organisations work is not one standardized process and people who do exactly what is written down. In that model you can forget about people being interested in process improvement; it's impossible to make full use of people's potential. Basically, it doesn't work!
  • a leader is a person with a vision; the leader's job is to communicate the vision and help the team members understand it. A leader should act like a teacher; it's not the leader's job to tell people what to do, but to tell people how things should work.
And finally, Mary talked about three different kinds of leadership; here is a high-level overview:
  • Product leader -- a person who merges marketing knowledge (understanding customers' needs) with fairly high-level technical expertise. The product leader should work closely with developers and is responsible for release planning and making the necessary tradeoffs.
  • Functional leader -- this person should preserve knowledge and hold the technical expertise leadership. A person like that should be responsible for solving the most difficult problems across all projects. Another important part of this role is to train people, help them get better and grow to their full potential.
  • Project leader -- funding, scheduling and tracking -- these are the main objectives of this role.
Those are the things which were particularly interesting for me. Watch the talk, and if you care to share, let me know which aspects were interesting for you.

Wednesday 6 August 2008

ASP.NET Web Application debugging and timeouts

While developing web applications it's absolutely normal that at some point it's necessary to debug the code: to check a variable's value, the execution flow for some weird input data, and so on. Before running the application in debug mode Visual Studio will prompt you with the following dialog:



For developers the answer is simple ... of course we want to enable debugging! So you click the OK button and everything works fine. But do you know where this is saved and what the implications are? If you are not sure then you should probably read this post carefully ;)


<compilation debug="false"/>


This is the part of web.config that determines whether debugging is enabled, and it is also what will be changed when you click the OK button. What are the implications, besides the fact that you can debug the code? Well, the consequences are literally HUGE; here comes a short list:

  1. The compilation of ASP.NET pages takes longer (since some batch optimizations are disabled)
  2. Code can execute slower (since some additional debug paths are enabled)
  3. Much more memory is used within the application at runtime
  4. Scripts and images downloaded from the WebResources.axd handler are not cached


Moreover, timeouts are turned off, which is necessary for debugging the code but can be potentially very dangerous on a production server.

I don't want to write yet another post about the debug="true" attribute, production servers and the performance impact, as there are already lots of really good posts about it; check at least these two:


But one additional thing is worth pointing out ... even if you forget about the debug="true" issue, you can do something on production servers to make sure that the web.config setting will be ignored:

in the machine.config file set:



<configuration>
  <system.web>
    <deployment retail="true"/>
  </system.web>
</configuration>


You will disable the debug="true" switch, disable the ability to output trace output in a page, and turn off the ability to show detailed error messages remotely. Note that these last two items are security best practices you really want to follow (otherwise hackers can learn a lot more about the internals of your application than you should show them).


Debug mode disabled and timeouts

In the optimistic case, when debug is turned off, you can control your application's timeouts with this attribute:



<system.web>
  <httpRuntime executionTimeout="40" />
</system.web>


executionTimeout specifies the maximum number of seconds that a request is allowed to execute before being automatically shut down by ASP.NET.


But remember that:

This time-out applies only if the debug attribute in the compilation element is False.


I hope this post has made these things clear; check your configuration and make sure that web.config settings are not slowing your application down.

Tuesday 5 August 2008

Power of the Retrospective

After over a year of developing a number of EPiServer projects we finally managed to get everyone in one room and do a retrospective. It was great to realize that the EPiServer team is actually not so small anymore and that people have done lots of cool stuff over the last few months.

But first things first ... what is a retrospective? It's a special meeting which is held to discuss the previous iterations, and during which people try to find ways to improve the process in the future. Retrospective meetings are actually quite a fundamental part of the agile software development process, which should evolve and improve with each iteration.

We did quite a high-level review of all our EPiServer projects, and it was very positive that people didn't complain much about EPiServer itself. Of course, we had to deal with a number of issues (like performance) but in the end we managed to overcome all the problems. Not surprisingly, the most difficult part was communication between clients and developers. Usually when people say something like this, they mean communicating requirements to the developers ... and they are right, but ... in many cases the problems were also on our side. We haven't done the best job in terms of showing the client how to use our page types, custom properties and so on. That's something which we definitely need to improve. It's a challenge to document page types in a way which is friendly to the end user. (How do you deal with this problem in your projects?)

Thanks to this meeting we were able to identify the top-priority problems which need to be solved, but it was also a great opportunity to share knowledge. It's literally impossible to be up-to-date with all the ongoing projects within Cognifide, therefore it's quite often the case that some really cool stuff implemented in one project could be used in another one, but people simply don't know about it. Retrospective meetings are the key to raising people's awareness of 'already solved problems' and warming up internal communication.

Monday 28 July 2008

PlugIns and DataFactory Event Handlers

Inspired by Allan Thraen's great post about When and Where to attach DataFactory Event Handlers, I decided to keep digging into this subject. What is really cool about it is that you can attach to DataFactory events and, moreover, you can do it in a pluggable way.

In this post I want to show how Allan's PlugInAttribute class can be refined to read its configuration from EPiServer.

But first of all ... a real-life example ... to justify the existence of this post ;)

Task: I want to be able to hide some pages in edit mode -- to tweak the left-hand tree so that it doesn't show some of my pages. Additionally, I want it to be configurable, so that I can change the ID of the hidden page without recompiling the source code, changing web.config etc.

Step 1 - Create GuiPlugIn

In our case we will use it only to store data. The class can look like this:


[GuiPlugIn(Area = PlugInArea.None, DisplayName = "Settings")]
public class PluginSettings
{
    protected string pagesToHide;

    [PlugInProperty(Description = "Page ID", AdminControl = typeof(TextBox))]
    public string PagesToHide
    {
        get { return pagesToHide; }
        set { pagesToHide = value; }
    }
}


and thanks to that we will be able to define the page ID in admin mode.



Step 2 - Attach to the FinishedLoadingChildren event

In this step we will follow Allan's approach:



public class RegisterEventsPlugIn : PlugInAttribute
{
    public static void Start()
    {
        // Attach to the right events
        DataFactory.Instance.FinishedLoadingChildren += Instance_FinishedLoadingChildren;
    }

    static void Instance_FinishedLoadingChildren(object sender, ChildrenEventArgs e)
    {
        // /UI/to/edit/EditTree.aspx
        if (HttpContext.Current.Request.Url.ToString().StartsWith(Settings.Instance.UIUrl.ToString()))
        {
            PluginSettings settings = new PluginSettings();
            PlugInSettings.AutoPopulate(settings);

            foreach (PageData child in e.Children)
            {
                if (child.PageLink.ID.ToString() == settings.PagesToHide)
                {
                    e.Children.Remove(child);
                    break;
                }
            }
        }
    }
}



One thing which might be new here is these two lines:


PluginSettings settings = new PluginSettings();
PlugInSettings.AutoPopulate(settings);


These two lines do the trick of reading the configuration. And that's basically it; everything should be working now.

I was also thinking about a different scenario where this approach might be perfect -- we use SortIndex to order pages quite often, but it's a hassle for editors to remember about it. A solution might be to create a plug-in which would automatically, based on some custom rules, assign the correct SortIndex to newly created pages (a rough sketch follows below). It's definitely doable and I'm sure that editors would love it!
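
A rough sketch of that idea, following the same pattern as the plug-in above -- note that the CreatingPage event and the "PagePeerOrder" property name are assumptions to verify against your EPiServer version:

public class SortIndexPlugIn : PlugInAttribute
{
    public static void Start()
    {
        // assumed event; fires before a new page is persisted
        DataFactory.Instance.CreatingPage += Instance_CreatingPage;
    }

    static void Instance_CreatingPage(object sender, PageEventArgs e)
    {
        // "PagePeerOrder" is the assumed built-in sort index property
        e.Page.Property["PagePeerOrder"].Value = CalculateSortIndex(e.Page);
    }

    static int CalculateSortIndex(PageData page)
    {
        // placeholder rule -- replace with whatever the editors agree on
        return 100;
    }
}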

Monday 14 July 2008

Missing features of EPiServer

In our everyday work we encounter, from time to time, missing "things" in EPiServer which would make our life easier. I think it's worth talking about this stuff to let the EPiServer team know that there is something on our wish list :)
Sometimes it can be something really small and relatively easy to add. I have something like this -- I would like to be able to add my own configuration options to custom properties.

Here is how the edit property page looks at the moment:



All I'm asking for is the possibility to define new fields on this page, maybe in the same way as we can define custom plugin properties ...



[PlugInProperty(Description = "Some nice description", AdminControl = typeof(TextBox))]
public string CustomProperty
{
    get { return customProperty; }
    set { customProperty = value; }
}



Why do I need it?

Well, whenever you are building more sophisticated properties, you usually want to make them reusable, and that's when you need a way to configure them. Having that possibility, you could have a different configuration for each page type.

You need a real-life example? Here it comes ... we all use and love the multipage property. But the truth is that adding multiple pages to the selection can be very annoying. To add another page to the list you have to start from the root and browse down every time. It would be so much nicer if we could configure a start page for this property, wouldn't it? Adam was writing about this issue some time ago.

Sometimes small changes make a big difference. Do you have more examples/requests like this?

Sunday 13 July 2008

The One Minute Manager ... don’t miss it!

Have you ever tried to figure out how people work best with other people? When do they produce good results and feel happy about their job, company and colleagues? If you are one of those who feel that this is important and interesting then I can highly recommend this book: The One Minute Manager. It's a very well-written book which you can finish in one evening. The main message is that:

People who feel good about themselves produce good results.


But at the same time it's important to keep the organisation productive. The authors try to show that those two objectives can be achieved at the same time; moreover, the best way to achieve good productivity is through people. It can all be accomplished by applying one minute management. To understand how it works you need to be familiar with the three secrets of one minute management.

The First Secret: One Minute Goals

One Minute Goal Setting is absolutely essential; it introduces the philosophy of 'no surprises'. The point here is to be clear from the very beginning about what has to be done. Therefore:

The One Minute Manager feels that a goal, and its performance standard, should take no more than 250 words to express. He insists that anyone be able to read it within a minute. Manager always makes sure that people know what good performance is. Performance standards are clear.

Rationale for that:

You see, in most organizations when you ask people what they do and then ask their boss, all too often you get two different lists. In fact, in some organizations I've worked in, any relationship between what I thought my job responsibilities were and what my boss thought they were, was purely coincidental. And then I would get in trouble for not doing something I didn't even think was my job.


And this is how the process should look, step by step:

  1. Agree on your goals.
  2. See what good behavior looks like.
  3. Write out each of your goals on a single sheet of paper using less than 250 words.
  4. Read and re-read each goal, which requires only a minute or so each time you do it.
  5. Take a minute every once in a while out of your day to look at your performance, and
  6. See whether or not your behavior matches your goal.

Why does it work?

When your goals are clear you can work until the job is done. All the time you can compare your results with your goal, which gives you instant feedback. And a few words about feedback:

Clearly the number one motivator of people is feedback on results. In fact, we have a saying here that's worth noting: 'Feedback is the Breakfast of Champions.' Feedback keeps us going.


The Second Secret: One Minute Praisings

A manager should strive to help people succeed and become a big help to the organization; therefore the manager's main concern, especially at the beginning of a new task or responsibility, should be:

Help people reach their full potential, catch them doing something right.


Managers usually try to catch people doing something wrong ... so why "catch them doing something right"? To help people by letting them know in no uncertain terms when they are doing well, to praise them and make them feel good.

Remember you don't have to praise someone for very long for them to know you noticed and you care. It usually takes less than a minute. And that's why it's called a One Minute Praising

This is how the process should look, step by step:

  1. Tell people up front that you are going to let them know how they are doing.
  2. Praise people immediately.
  3. Tell people what they did right—be specific.
  4. Tell people how good you feel about what they did right, and how it helps the organization and the other people who work there.
  5. Stop for a moment of silence to let them "feel" how good you feel.
  6. Encourage them to do more of the same.
  7. Shake hands or touch people in a way that makes it clear that you support their success in the organization.

Why is it so important?

Key to training someone to do a new task is, in the beginning, to catch them doing something approximately right until they can eventually learn to do it exactly right.

but in some organisations it doesn't always work this way ...

That is what we often do with new, inexperienced people. We welcome them aboard, take them around to meet everybody, and then we leave them alone. Not only do we not catch them doing anything approximately right, but periodically we zap them just to keep them moving. This is the most popular leadership style of all. We call it the 'leave alone-zap' style. You leave a person alone, expecting good performance from them, and when you don't get it, you zap them.


The Third Secret: One Minute Reprimands


If you have been doing a job for some time and you know how to do it well, and you make a mistake, the One Minute Manager is quick to respond.

The one minute reprimand has two parts, and here is how it should look, step by step:

  1. Tell people beforehand that you are going to let them know how they are doing and in no uncertain terms.

    The first half of the reprimand:
  2. Reprimand people immediately.
  3. Tell people what they did wrong—be specific.
  4. Tell people how you feel about what they did wrong—and in no uncertain terms.
  5. Stop for a few seconds of uncomfortable silence to let them feel how you feel.

    The second half of the reprimand:
  6. Shake hands, or touch them in a way that lets them know you are honestly on their side.
  7. Remind them how much you value them.
  8. Reaffirm that you think well of them but not of their performance in this situation.
  9. Realize that when the reprimand is over, it's over.

The most important thing is to remember this:

You will be successful with the One Minute Reprimand when you really care about the welfare of the person you are reprimanding.

What might happen if you don't apply this rule in everyday life?

If managers would only intervene early, they could deal with one behavior at a time and the person receiving the discipline would not be overwhelmed. They could hear the feedback. That's why I think performance review is an ongoing process, not something you do only once a year.


That's it; those are all the secrets of one minute management in a nutshell. Maybe I can add one last thing:

"I can see now," the young man said, "where the power of your management style comes from—you care about people."
"Sometimes," the One Minute Manager said, "you have to care enough to be tough. And I am. I am very tough on the poor performance—but only on the performance. I am never tough on the person."


Sounds easy and cool, doesn't it? In the end we spend a lot of time working; if there is something we can do to enjoy that time then I believe we should at least give it a try. So if you are asking yourself "Should I apply one minute management?", the answer for me is obvious: YES!

If you found one minute management interesting, I highly recommend reading the book. It will give you many more details, examples and guidelines on how to become a One Minute Manager.