Death March

 

I read Death March over the weekend. It makes some interesting points about Death March projects:

·         There are a number of good, in fact necessary, reasons to work on a Death March project.

·         The software industry employs many young people who have both the time and energy to work on a Death March project.

·         The absolute necessity of getting away from the overall bureaucracy when working on a Death March project – especially the methodology police, the physical plant police, etc.

·         The importance of negotiation in the project – especially with the stakeholders.

·         The law of diminishing returns – after a certain point, the overtime that is put in works to the detriment of the code base and the overall project.

·         The team is critical – you need people who both can and want to work together.

·         I liked his suggestions for separating the team: “skunk works”, telecommuting, and the graveyard shift are all good ways to give the team the physical separation it needs from the normal bureaucracy.

·         The most important point of the book: triage.  All external constraints need to be filtered before they get to the project (including requirements!).  I liked his 80/20 rule – 20% of the “required” functionality will not be delivered, so triage to filter it out early.  The project will still be considered a success.

·         One more interesting point: the author seems to embrace XP (and many Agile practices like daily/continuous builds, etc.) to solve Death March problems.  After reading this book, I would agree with him.


PLINQ – Does it save any time?

Well, yeah!

 

I created a quick Console application and set up sequential function calls like so:

 

            Stopwatch stopwatch = new Stopwatch();
            stopwatch.Start();
            GetNorthwindCustomers();
            GetAdventureWorkWorkOrders();
            stopwatch.Stop();
            Console.WriteLine(string.Format("Elapsed Time: {0} milliseconds", stopwatch.ElapsedMilliseconds));

 

And I set up 2 data objects – I used Entity Framework for Northwind and LINQ to SQL for Adventure Works:

        static void GetNorthwindCustomers()
        {
            ConsoleColor currentColor = Console.ForegroundColor;
            Console.ForegroundColor = ConsoleColor.Red;
            NorthwindEntities dataContext = new NorthwindEntities();

            var customers = (from c in dataContext.Customers
                             select c).OrderBy(c => c.CompanyName);

            for (int i = 0; i < 50; i++)
            {
                foreach (Customer customer in customers)
                {
                    Console.WriteLine(string.Format("CustomerID: {0} – CompanyName: {1}", customer.CustomerID, customer.CompanyName));
                }
            }
            Console.ForegroundColor = currentColor;
        }

 

And

        static void GetAdventureWorkWorkOrders()
        {
            ConsoleColor currentColor = Console.ForegroundColor;
            Console.ForegroundColor = ConsoleColor.Blue;
            AdventureWorksDataContext dataContext = new AdventureWorksDataContext();

            var workOrders = (from wo in dataContext.WorkOrders
                              select wo).OrderBy(wo => wo.StartDate);

            foreach (WorkOrder workOrder in workOrders)
            {
                Console.WriteLine(string.Format("WorkOrderId: {0} – StartDate: {1}", workOrder.WorkOrderID, workOrder.StartDate.ToString()));
            }
            Console.ForegroundColor = currentColor;
        }

 

After running it, I get the following result:

 

I then added AsParallel to the data context queries like this:

var customers = (from c in dataContext.Customers.AsParallel()

var workOrders = (from wo in dataContext.WorkOrders.AsParallel()

and ran it

The results were:

Two questions came to mind:

·         Why was there a performance gain?

·         The database calls still ran in sequence.  How can I get them to run simultaneously?

I decided to tackle the second question first – leaving the first as an academic exercise to be completed later.

I realized immediately that the database calls were running in sequence because, well, they were being called in sequence:

            GetNorthwindCustomers();
            GetAdventureWorkWorkOrders();

 

To get the database calls to run in parallel, I needed to use Parallelism at that level of the call stack.

I first tried to add both function calls to a Task:

            stopwatch.Start();
            Task taskNorthwind = Task.Factory.StartNew(() => GetNorthwindCustomers());
            Task taskAdventureWorks = Task.Factory.StartNew(() => GetAdventureWorkWorkOrders());
            stopwatch.Stop();

 

The funny thing is – the stopwatch stopped immediately and the 2 tasks ran after that (StartNew returns without waiting for the task to finish).  So I needed a way to run only the database calls in parallel, not the stopwatch.
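In hindsight, another way out of this (a sketch I did not try at the time, reusing the same two methods) would be to keep the tasks but block on both of them before stopping the stopwatch:

```csharp
// Sketch: start both database calls as tasks, then wait for BOTH
// to complete before stopping the stopwatch, so the elapsed time
// actually covers the parallel work.
stopwatch.Start();
Task taskNorthwind = Task.Factory.StartNew(() => GetNorthwindCustomers());
Task taskAdventureWorks = Task.Factory.StartNew(() => GetAdventureWorkWorkOrders());
Task.WaitAll(taskNorthwind, taskAdventureWorks);
stopwatch.Stop();
```

Task.WaitAll blocks the calling thread until every task in the array has finished, which is exactly the synchronization point that was missing above.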

I then tried the Parallel.Invoke method like so:

            Parallel.Invoke(GetNorthwindCustomers);
            Parallel.Invoke(GetAdventureWorkWorkOrders);

 

And I got the same result as a linear call:

So then I realized I should put both tasks in the same Parallel call:

Parallel.Invoke(GetNorthwindCustomers, GetAdventureWorkWorkOrders);

 

Now the individual statements in each function are running in parallel – you can see it in the interleaved console colors.  However, the database calls are still operating as a single block.  I decided to test this hypothesis by putting a Thread.Sleep in the first database fetch.

What do you know, the database calls do interleave – if given enough time:

I did notice that no matter what I did with the tasks, the actual sequence from the LINQ was the same (because of the OrderBy extension method).  That is good news for programmers who want to add parallelism to their existing application and rely on the order from the database (perhaps for the presentation layer).
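As an aside, that ordering guarantee matters: for purely in-memory PLINQ queries, result order is not preserved unless you ask for it with AsOrdered().  A minimal sketch with made-up data:

```csharp
using System;
using System.Linq;

class OrderingDemo
{
    static void Main()
    {
        int[] numbers = { 5, 1, 4, 2, 3 };

        // Without AsOrdered(), PLINQ may yield results in any order;
        // AsOrdered() preserves the source order (at some extra cost).
        var doubled = numbers
            .AsParallel()
            .AsOrdered()
            .Select(n => n * 2)
            .ToArray();

        Console.WriteLine(string.Join(", ", doubled)); // 10, 2, 8, 4, 6
    }
}
```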

All in all, it was a very interesting exercise on a Saturday AM.  I ordered

 

I wonder if there are new patterns that I will uncover after reading this.

 

 

 

 

Export To Excel in MVC

I ran into the need to export a table to Excel in an MVC application.  A quick search on Bing showed a couple of different options.  For example, Steven Walther had the most complete solution found here, but it had waaay too much code.

My original code in a web form application is much more compact:

        private void ExportTimesToExcel()
        {
            Response.ContentType = "application/vnd.ms-excel";
            Response.Charset = "";
            this.EnableViewState = false;

            StringWriter stringWriter = new StringWriter();
            HtmlTextWriter textWriter = new HtmlTextWriter(stringWriter);

            HtmlForm form = new HtmlForm();
            this.Controls.Add(form);
            form.Controls.Add(this.SwimmerTimeGridView);
            form.RenderControl(textWriter);

            Response.Write(stringWriter.ToString());
            Response.End();
        }

 

So I wondered if there was a way to duplicate that in MVC.  I ran across a good starting point here:

public class ExcelResult : ActionResult
{
      public string FileName { get; set; }
      public string Path { get; set; }

      public override void ExecuteResult(ControllerContext context)
      {
            context.HttpContext.Response.Buffer = true;
            context.HttpContext.Response.Clear();
            context.HttpContext.Response.AddHeader("content-disposition", "attachment; filename=" + FileName);
            context.HttpContext.Response.ContentType = "application/vnd.ms-excel";
            context.HttpContext.Response.WriteFile(context.HttpContext.Server.MapPath(Path));
      }
}


 

And here is the function to get the file:

public ExcelResult GetExcelFile()
{
      return new ExcelResult
                  {
                        FileName = "sample.xls", Path = "~/Content/sample.xls"
                  };
}

 

And here is an MVC view control that calls it:

<%= Html.ActionLink("Download Excel", "GetExcelFile", "Home")%>

 

So I needed to meld together some separated parts.

The key is this line:

context.HttpContext.Response.WriteFile(context.HttpContext.Server.MapPath(Path));   

The problem is that I don’t have an Excel file to start with, so I can’t use WriteFile.  I first thought of writing a stream of XML that Excel recognizes.  So what is the fastest way to get XML from a DataTable?  I can’t use the web form trick of this:

            form.Controls.Add(this.SwimmerTimeGridView);
            form.RenderControl(textWriter);

 

because the controller doesn’t know anything about how the views are rendering the model.  They could use a table, they could use a series of labels, etc.

I then thought (briefly) about looping through some LINQ that hits the entire model – and realized that I didn’t want to do that (that is what Steven did).

I then thought – wait a second.  <insert second here>  ADO.NET can produce XML from a DataTable.  If I had a model with the data as a DataTable, I could render the stream out as XML and everything should work…

Step #1: Add an ADO.NET DataSet

 
 

 

Then

Step #2: Create that Excel Class

    public class ExcelResult : ActionResult
    {
        public string XMLStream { get; set; }
        public string FileName { get; set; }

        public override void ExecuteResult(ControllerContext context)
        {
            context.HttpContext.Response.Buffer = true;
            context.HttpContext.Response.Clear();
            context.HttpContext.Response.AddHeader("content-disposition", "attachment; filename=" + FileName);
            context.HttpContext.Response.ContentType = "application/vnd.ms-excel";
            context.HttpContext.Response.Write(XMLStream);
        }
    }

 

Step #3: Call it from the Controller

 

        public ExcelResult GetExcelData()
        {
            Northwind northwind = new Northwind();

            string xmlStream;
            using (StringWriter sw = new StringWriter())
            {
                northwind.Tables[0].WriteXml(sw);
                xmlStream = sw.ToString();
            }

            return new ExcelResult
            {
                FileName = "sample.xls",
                XMLStream = xmlStream
            };
        }

 

Step #4: Hook up a link to try it out

    <%= Html.ActionLink("Download Region Info To Excel", "GetExcelData", "Region")%>

 

The first try I got this:

And the result was this:

Oops – it looks like the DataTable is not like LINQ to SQL or Entity Framework – I actually have to code up the loading of it.  I added this code to load the data table:

            RegionTableAdapter regionTableAdapter = new RegionTableAdapter();
            Northwind.RegionDataTable regionDataTable = regionTableAdapter.GetData();

 

Getting Closer

I have the data, Excel is launching, but Excel is not recognizing the XML as XML:

So the last thing I have to do is change the extension to .xml

            return new ExcelResult
            {
                FileName = "sample.xml",
                XMLStream = xmlStream
            };

 

and boom goes the dynamite (assuming the user has .xml hooked up to open in Excel).

 

MVC Ajax/Json and Serialization

One of the biggest learning curves for me over the last week has been how to pass data from the client to the server using Ajax and Json in an MVC solution.  As far as I can tell, there are 3 main patterns:

·         Individual parameters

·         Complex object that is serialized

·         Abstract Collection or array

MVC uses:

·         Request.Form[“x”] where x is the name of the property to assign to the object

·         UpdateModel(x), where x is the object you want to update.  Behind the scenes, MVC hooks up the properties from the form collection to the properties in that object.  Therefore, the name of the property must match exactly between the form and the object.

·         Or just pass the object in as a parameter.  Behind the scenes, MVC assigns the form values to the object’s properties and hands the object to your parameter.

The first option, Request.Form[“x”] where x is the name of the property to assign to the object:

For example:

        [HttpPost]
        public ActionResult Create()
        {
            try
            {
                Region region = new Region();
                region.RegionID = Int32.Parse(Request.Form["regionId"]);
                region.RegionDescription = Request.Form["regionDescription"];

                //Insert Logic Here
                return RedirectToAction("Index");
            }
            catch
            {
                return View();
            }
        }

 

But you can’t have 2 methods with the same signature, so you can’t also have this:

        public ActionResult Create()
        {
            return View();
        }

 

So you could drop the one for the GET – but then every GET would require an input.  So you are stuck, unless you muck about in the routing.

Alternatively, you can use the default and pass in a variable without using it – which is what MVC does out of the box with the formCollection.  I passed in an id for the heck of it:

        [HttpPost]
        public ActionResult Create(int? id)
        {
            try
            {
                Region region = new Region();
                region.RegionID = Int32.Parse(Request.Form["regionId"]);
                region.RegionDescription = Request.Form["regionDescription"];

                //Insert Logic Here
                return RedirectToAction("Index");
            }
            catch
            {
                return View();
            }
        }

 

And looking at the watch, you can see that the Form collection is being populated:
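For completeness, the second option – UpdateModel – would look something like this (a sketch, assuming the form field names match the Region property names exactly):

```csharp
[HttpPost]
public ActionResult Create(int? id)
{
    Region region = new Region();
    // UpdateModel matches form fields to properties by name,
    // so the form must post "RegionID" and "RegionDescription".
    UpdateModel(region);

    //Insert Logic Here
    return RedirectToAction("Index");
}
```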

 

This is all well and good, but it does not help me with a JSON request (at least not directly) because there is no Form collection to be had.

Interestingly, when making a method with the class as a parameter, MVC does a Json call behind the scenes.  For example, consider this method that returns a View:

        [HttpPost]
        public ActionResult Create(Region region)
        {
            //Insert Logic Here
            return View();
        }

 

Check out the base value: JsonSerialization:

So I wondered: if MVC can do it, why can’t I?  Can I send the exact same Json values and have them parse?

I wrote some jQuery with Ajax like so:

    <script type="text/javascript">
        function CallService() {
            var regionId = $("#RegionID").val();
            var regionDescription = $("#RegionDescription").val();
            var postData = { RegionID: regionId, RegionDescription: regionDescription };
            $.ajax({
                type: "POST",
                url: "/Region/Create",
                data: postData,
                dataType: "json"
            });
        }
    </script>

 

I then fired up Fiddler to find out.  Here is the request/response:

 

And Holy Cow – it worked!  Check out a Quick Watch from my controller:

 
When using Json and Ajax, all you need is an object on the client with the parameter values explicitly assigned, and a matching object in the controller’s parameters.  I have not investigated default values, missing values, and the like, but I am just excited to see this working.

CSS gotcha when using VS2010 and IE

I ran into a frustrating gotcha when using VS2010 and updating my .css.

I created the following .css entry:

/* Hidden Column
———————————————————*/
.noDisplay
{
    display:none;
}

 

I then implemented it in a basic MVC View:

    <table>
        <tr>
            <th class="noDisplay">
                RegionID
            </th>
            <th>
                RegionDescription
            </th>
        </tr>

    <% foreach (var item in Model) { %>
        <tr>
            <td class="noDisplay">
                <%: item.RegionID %>
            </td>
            <td>
                <%: item.RegionDescription %>
            </td>
        </tr>
    <% } %>

    </table>

 

I then spun up my site and started changing some things.  Interestingly, every time I changed a value in the .css, it was not reflected on the next spin-up of the site.  Even if I closed the instance of Cassini running on the desktop, I got the same mysterious behavior.  After some frustrating experimenting (thank you, Fiddler), I deduced that IE was caching the .css – so as long as I was using the same URL (localhost/xxx), the cached .css was being used and IE was not recognizing changes to it.  Ugh.

To get around IE’s limitation (perhaps there is also a setting – I haven’t checked yet), I changed the site address every time.  How, you might ask?  Under Project Properties -> Web in VS:

 

Once I specified a different port for each run, IE would detect a change in the address and reload all of the files – including the new .css.
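Another common workaround (one I haven’t tried here) is cache-busting: append a changing query string to the stylesheet URL so the browser treats it as a new resource.  The helper below is a hypothetical sketch of the idea:

```javascript
// Cache-busting sketch: append a version token to a stylesheet URL so the
// browser fetches a fresh copy instead of reusing its cached one.
function bustCache(url, version) {
    // Use '&' if the URL already has a query string, '?' otherwise.
    var separator = url.indexOf("?") === -1 ? "?" : "&";
    return url + separator + "v=" + version;
}

// e.g. bustCache("/Content/Site.css", 42) gives "/Content/Site.css?v=42"
```

Bumping the version token whenever the .css changes forces a re-download without having to change ports.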

For the record, this code

    <script type="text/javascript">
        function DisplayColumnIds() {
            $(".noDisplay:gt(0)").each(function () {
                var regionId = $(this).html();
                alert(regionId);
            });
        }
    </script>

 

It detected the hidden fields like a champ.

 

MVC and Unit Testing

I set up an Entity Framework model for Northwind:

I then created a Region controller to return all of the regions in the Index() method:

        public ViewResult Index()
        {
            var region = (from r in dataContext.Regions
                          select r);
            return View(region.ToList());
        }

 

I then set up a unit test to check that I am getting 4 regions back (yes, I know I should be using a stub).

        [TestMethod()]
        public void IndexTest()
        {
            RegionController target = new RegionController();
            List<Region> regions = (List<Region>)target.Index().ViewData.Model;
            int expected = 4;
            int actual = regions.Count;
            Assert.AreEqual(expected, actual);
        }

 

 

The thing that surprised me was the chain I needed to go through to get to the model, and the cast.  However, seeing this:

makes it all worthwhile.
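As an aside, the stub I mentioned might look something like this – a sketch that assumes the controller is refactored to take a hand-rolled IRegionRepository (my own invented name) instead of newing up the EF context directly:

```csharp
public interface IRegionRepository
{
    List<Region> GetRegions();
}

// Stub that returns canned data so the test never touches the database.
public class StubRegionRepository : IRegionRepository
{
    public List<Region> GetRegions()
    {
        return new List<Region>
        {
            new Region { RegionID = 1, RegionDescription = "Eastern" },
            new Region { RegionID = 2, RegionDescription = "Western" },
            new Region { RegionID = 3, RegionDescription = "Northern" },
            new Region { RegionID = 4, RegionDescription = "Southern" }
        };
    }
}

// The test would then hand the stub to the controller:
// RegionController target = new RegionController(new StubRegionRepository());
```

The test itself stays the same; it just stops depending on the 4 rows actually sitting in Northwind.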

 

Json and Entity Framework

I started digging deeper into Ajax and Json in an MVC solution when I came across an interesting gotcha.  I wanted to have a table with some data be associated with a drop down – a very common pattern.  I started using Entity Framework and created a couple of Json methods to expose some of the classes from the EF data context.

The most important thing to recognize is on lines 23 and 34.  If you do what I have on line 41, you get an error.  Gotcha #1 is that there is no way to detect the error using this code:

            $.ajax({
                type: "POST",
                traditional: true,
                url: "/Territory/GetRegions",
                dataType: "json",
                success: function (data) {
                    $.each(data, function () {
                        //Stuff
                    });
                }
            });

 

 

Because there is no error handler implemented.  Fortunately, Fiddler came and saved the day:

In addition, you can add an error handler with the error option:

                error: function (XMLHttpRequest) {
                    alert(XMLHttpRequest.responseText);
                }

 

And now that I think about it, the gotcha makes sense.  The return from Entity Framework has all of these relationship properties, etc. that Json simply does not care about.  My lesson learned is to use DTOs when communicating with Json – don’t use the classes created by EF.
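A minimal sketch of that lesson (the RegionDto name and the GetRegions signature here are my own invention):

```csharp
// Flat DTO with only the fields the client needs – no EF relationship
// properties for the Json serializer to choke on.
public class RegionDto
{
    public int RegionId { get; set; }
    public string RegionDescription { get; set; }
}

// In the controller, project the EF entities into DTOs before serializing:
[HttpPost]
public JsonResult GetRegions()
{
    var regions = (from r in dataContext.Regions
                   select new RegionDto
                   {
                       RegionId = r.RegionID,
                       RegionDescription = r.RegionDescription
                   }).ToList();

    return Json(regions);
}
```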

In any event, once the Json methods were working, I needed to hook them up to the controls on the form.  I started first with ASP.NET controls (drop down list and grid view), but the client-side ID was getting screwed up (I think there is a fix in 4.0; I need to investigate), so I used plain old HTML controls and implemented the following 2 loads.  The first goes in the page load for the select list:

        function LoadRegionList() {
            $.ajax({
                type: "POST",
                traditional: true,
                url: "/Territory/GetRegions",
                dataType: "json",
                success: function (data) {
                    $.each(data, function () {
                        $("#SelectRegion").append($("<option />").val(this.regionId).text(this.regionDescription));
                    });
                }
            });
        }

 

 

And the other goes in the select list’s changed event – updating the table:

        function LoadTerritoryList() {
            $("#TerritoryTable tr:gt(0)").html("");

            var regionId = $("#SelectRegion").val();
            var postData = { RegionID: regionId };

            $.ajax({
                type: "POST",
                traditional: true,
                url: "/Territory/GetTerritories",
                data: postData,
                dataType: "json",
                success: function (data) {
                    $.each(data, function () {
                        $("#TerritoryTable").append("<tr><td>" + this.territoryID + "</td><td>" + this.territoryDescription + "</td></tr>");
                    });
                }
            });
        }

 

I don’t like stringing together HTML (the <option>, <td>, and <tr> tags), but I can’t find a simpler way.  It seems there is no separation of concerns – I am placing data directly into markup.

Once those methods were implemented – voila!  I had a table that changed with every change of the select list.  Very cool.

I am still wondering if the step backwards in developer experience might be worth the trade-off – the user experience is much, much better than ASP.NET post-backs and even ASP.NET AJAX update panels…