Knockout Components – Separating Templates from View Model

Part 1: Dipping your feet into Knockout JS Components

This is the second part in my Knockout JS Components series. So far we have made a simple 'greeter' component and used it multiple times. But our HTML template was rather simple and hard-coded in the ViewModel as a string. That was fine for a small snippet, but for involved templates you don't want to embed the HTML as a string. Rather, you'd like it to be an independent HTML file that's loaded as required.

Today we’ll see how we can separate the HTML and load it dynamically using another library called RequireJS and a plugin for it (RequireJS-Text).

If you want to follow along, you can get the source code branch for the previous article.

Quick intro to RequireJS

RequireJS is a library for asynchronous module definition (AMD) and the loading of JavaScript modules and their references. It was developed by James Burke from Mozilla. It is an open source project hosted on Github with a vibrant community that helps maintain it. It also has good documentation that you can refer to for getting started.

This article is not a RequireJS tutorial; instead we'll jump right in and start using it, and I'll explain the syntax as we go along. If you have not used RequireJS before, well, don't panic, neither have I.

Installing RequireJS and RequireJS-Text plugins

RequireJS has an official Nuget channel, so the base library is easy to install via the Nuget Package Manager Console. Simply type in:

install-package requirejs

Next we have to get the Text plugin from Github directly. You can either clone the repo from here or just download the JS file.

Once you have the file, add it to your scripts folder. I have started creating sub-folders for each library because it will come in handy later.


  • App/boot : This folder will contain scripts that initialize our libraries
  • App/components : This folder will contain all the components we create. Each component in turn has its own folder that may contain the ViewModel, HTML template and more.
  • Scripts/* : As mentioned earlier, I've moved each library into its respective sub-folder under the Scripts folder. So Scripts is essentially for all libraries and frameworks that we will use (and not modify), and everything that we build will go under App.

NOTE: This folder structure is completely arbitrary. I 'feel' this works; you can use it, or you can totally use your own structure. Just remember where your 'root' (folder) is.

Now that we are done with the structure of the libs and sources let’s move on and configure RequireJS.

Configuring RequireJS

Under the App/boot folder add a new JS file called require.config.js. The name is, again, not part of any convention; use what works for you, just reference it correctly in Index.html.

I have configured RequireJS initially as follows

var require = {
    baseUrl: "/",
    paths: {
        "bootstrap": "Scripts/bootstrap/bootstrap",
        "jquery": "Scripts/jquery/jquery-1.9.0",
        "knockout": "Scripts/knockout/knockout-3.2.0beta.debug",
        "text": "Scripts/require/text"
    },
    shim: {
        "bootstrap": {
            deps: ["jquery"]
        }
    }
};

  • I've created a require global variable that has three properties: baseUrl, paths and shim.
  • The baseUrl property sets the root location with respect to which all other paths will be resolved.
  • The paths property is assigned an object with name-value pairs corresponding to the library names and their relative locations. Note that the .js extension is stripped from the paths.
  • The shim property provides additional dependency information. Here it declares that bootstrap is dependent on jquery. The value 'jquery' matches the name used in paths above.
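To make the resolution concrete, here is a tiny, hypothetical sketch (my own illustration, not RequireJS's real implementation) of how a module name maps to a script URL under this configuration; the resolve function below is an assumption for illustration only:

```javascript
// Illustrative sketch only (not RequireJS internals): resolving a module
// id to a script URL using the baseUrl and paths settings configured above.
var config = {
    baseUrl: "/",
    paths: {
        "knockout": "Scripts/knockout/knockout-3.2.0beta.debug",
        "text": "Scripts/require/text"
    }
};

function resolve(moduleId) {
    // A mapped id uses its paths entry; any other id is treated as a path
    // relative to baseUrl. ".js" is appended automatically, which is why
    // the configured paths omit the extension.
    var path = config.paths[moduleId] || moduleId;
    return config.baseUrl + path + ".js";
}

console.log(resolve("knockout")); // /Scripts/knockout/knockout-3.2.0beta.debug.js
console.log(resolve("App/boot/startup")); // /App/boot/startup.js
```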

Updating references

Now that we have configured RequireJS, technically all we need to do is load RequireJS using the configuration and all should be fine!

Well, let’s update the Index.html file to load RequireJS and remove all other Script references. We update the script references as follows:

<!--<script src="Scripts/jquery-1.9.0.js"></script>
<script src="Scripts/bootstrap.js"></script>
<script src="Scripts/knockout-3.2.0beta.debug.js"></script>-->

<script src="App/boot/require.config.js"></script>
<script src="Scripts/require/require.js"></script>

Now you are wondering where greeting.js has gone and how it is going to be loaded? Well, there is no magic; we have a couple more steps to go.

Add App/boot/startup.js

In the App/boot folder add a new JavaScript file called startup.js. The name is just to help us understand the process; it's not a convention.

Add the following ‘module-definition’. You can read up about RequireJS Modules here.

define(['jquery', 'knockout', 'bootstrap'], function ($, ko) {
    ko.components.register('greeter', {
        require: 'App/components/greeter/greeting'
    });
    ko.applyBindings();
});

The Startup module declares that it is dependent on jQuery, KnockoutJS and Bootstrap. Note that it uses the same names that were used in the RequireJS configuration above. The function parameters are instances of the dependencies requested in the array, so if you added another input parameter, say boots, it would receive the instance of the Bootstrap library. We'll just keep this in mind for now.

Next it declares a function that has jQuery and KO lib references as input parameters.
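That name-to-parameter mapping can be sketched in a few lines of plain JavaScript. This is a toy illustration of the AMD idea only, not RequireJS's real loader (which loads the scripts asynchronously); the registry and fakeDefine names are my own:

```javascript
// Toy illustration of AMD-style dependency injection: each name in the
// dependency array is looked up and passed, in order, as an argument to
// the factory function.
var registry = {
    "jquery": { lib: "jQuery stub" },
    "knockout": { lib: "Knockout stub" },
    "bootstrap": { lib: "Bootstrap stub" }
};

function fakeDefine(deps, factory) {
    var instances = deps.map(function (name) { return registry[name]; });
    return factory.apply(null, instances);
}

// The second parameter receives whatever is registered as "knockout",
// just like ko does in startup.js.
var got = fakeDefine(["jquery", "knockout"], function ($, ko) {
    return ko.lib;
});
console.log(got); // Knockout stub
```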

In the function we 'register' our 'greeter' component. Note that we have moved the registration from greeting.js to startup.js. Also note that instead of specifying the hard-coded template and view model, we simply configure a 'require' property that points to the module path of greeting.js (without the .js extension).

Well, that’s the startup configuration. Needless to say, as we add more components they will need to be registered here.

Updating our ‘greeter’ component

The first thing we do is add a greeting.html (keeping the name the same as the JavaScript file is, again, not a convention, just easier to map in our heads).

It contains the same markup that we had hardcoded in the template:
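Reconstructed from the string we previously hardcoded in the view model, greeting.html would look like this:

```html
<div class="container-fluid">
    <div> Hello <span data-bind="text: greeting"></span></div>
    <div> It is <span data-bind="text: date"></span></div>
</div>
```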


Update greeting.js

Finally we update the greeting.js component. We comment out all the old code and replace it with the following:

define(["knockout", "text!./greeting.html"], function (ko, greeterTemplate) {
    function greeterViewModel(params) {
        var self = this;
        self.greeting = ko.observable(params.name);
        self.date = ko.observable(new Date());
        return self;
    }
    return { viewModel: greeterViewModel, template: greeterTemplate };
});

So essentially we have morphed our component into a RequireJS module. The key thing to note here is the use of the text plugin to load greeting.html. Require does all the work to load the template and pass it in via the greeterTemplate parameter.

Finally we return an object that KO accepts as the definition for a component.

One more thing!

We are almost there. Those paying close attention would have noticed that we didn't put in a reference to startup.js anywhere. How does RequireJS know how to initialize our app and its dependencies?

Back to the Index.html, we update the script tag that refers to RequireJS as follows:

<script data-main="App/boot/startup" src="Scripts/require/require.js"></script>

The data-main attribute tells RequireJS that the main module to initialize the app is in that JS file. Thus RequireJS knows what to invoke once it is fully initialized.



If you run the application now, you'll see the same old screen we saw in the first part. So what have we done that's new? Well, plenty:

1. Let RequireJS load scripts dynamically.
2. Separated our KO component's view from its view model.
3. Defined a central place to register all KO components.

What we have not done yet is the more 'webby' stuff: putting in links to other possible pages of the app, creating different types of modules for each page, and then showing how our app can navigate to those pages as well as load their dependencies on-demand. That's what we'll cover in the next part – Routing!

Source Code

The source is on Github as usual (note, after each article I am branching the code out and keeping the master for the latest updates)!


VisualStudio Online–Not for the travelling consultant

Microsoft, in my experience, cannot integrate with payment systems. Last year, it took me ages to renew my Windows Developer Account. Why? Because the card on the system had expired and they wouldn't let me put in another one until I gave them the CVV number of the expired card!!! (Yeah, I carry around expired cards by the dozens!)

Six months ago, I tried to renew my O365 license and I went around the loop multiple times before I was able to do it, but I couldn’t change my payment method.

I have been planning to get a VisualStudio Online subscription, primarily because the professional sub gives you a Visual Studio Pro license, and a monthly payment of 45 GBP seems reasonable to move up from the Express editions I was using.

VisualStudio Online subs are tied to Azure payment methods. My current one is tied to India. So I thought, no worries, I'll create a new payment method, right? After all, that's all we do at Amazon, right? WRONG! I couldn't change the Country; it's fixed to India. So any new payment method has to be an India based card! I would have been okay if there were a payment mode in INR, but no, I have to pay USD/GBP conversion charges!!! Okay, so I cannot add a new payment method; let me add a new Subscription, surely that will solve the problem, right? NO. After filling up all the details the country is still grayed out and set to India.

Foxed, I went looking for 'where' this India is actually selected. I looked everywhere and couldn't find it. My Outlook account that's tied to Azure has been changed to UK, and that's what I see on the portal. So I opened a ticket, hoping things could be resolved, but no, I am WRONG there too.

I got a response saying nothing could be done about the country, and that I would have to create a new Live account with country UK and set up another Azure account using that! Wow, brilliant! What happens when I go back to India again? Or if I move to the US or Australia? Well, as per Azure, you should scrap your accounts as you go along and request them to keep transferring the data!!!

I really have no idea what it takes to build a payment system, but something tells me this is messed up! If Amazon (a book seller, right?) can do it, surely the mighty Microsoft can too, no?

Well, the story is developing… I'll let you know how it ends!!!


Dipping your feet into KnockoutJS Components

Last week I saw Steve Sanderson's NDC 2014 talk on how to build large Single Page Applications using KnockoutJS and other tooling. It struck a chord because SPA is something I am dealing with right now, and I really wanted to get neck deep into how to use KO and other tools to build one 'correctly'.

I have tried AngularJS in the past and you may have read my multipart series on Devcurry. While I have nothing against AngularJS, I still find myself more inclined towards using KO than anything else. Don't read too much into it, it's just me, I like KO!

Anyway, Steve's talk is very deep and, as he rightly says, you feel like you are hanging on to a race car when sitting through it. So I've decided to really slow things down and take it one small bit at a time to see how we can use the latest and greatest version of KO (3.2.0 Beta) to build a front-end framework.

Today we’ll look at something new that’s not available in the release version of KO 3.1 as of date – Components.

Components allow you to build HTML+JS snippets that can be reused, just like the server side controls of yesteryear or like Directives in AngularJS. They actually mimic an upcoming web standard called Web Components.

Steve also used a set of tooling that I am unfamiliar with, so I'll try to map what he did to how I would do it using Visual Studio (as far as possible). Let's get started.

Getting Knockout 3.2Beta

As soon as 3.2 goes live it will be available for use via Nuget and the Nuget Package Manager from inside Visual Studio. Today you can download the build from the Releases page on KO’s Github repo –

If you are a JS ninja you can get the entire library source and build it using NPM and Grunt.

Starting with a clean slate

I’ll start with an empty Web Project in Visual Studio 2013.


We get a really, really empty template. If you run this in VS, you will get an error saying you don't have directory browsing permissions.

The Home Page

Since our project is all clean, let’s first install Bootstrap that we’ll use for styling. In the PM console type:

install-package bootstrap

This gives us the following folder structure, where Content has the StyleSheets and Scripts has the JavaScripts required.


Now add a new HTML page to the root of the project; call it Index.html (or home.html).


Thanks to Visual Studio, I have forgotten how to set up a basic startup page using Bootstrap. Serves me right that I have to scratch my head and wonder 'now what':


After 'considerable' struggle, I update the HTML to include Bootstrap styling and jQuery references.

<!DOCTYPE html>
<html>
<head>
    <title>Dipping your feet into KnockoutJS Components</title>
    <link href="Content/bootstrap.css" rel="stylesheet" />
    <link href="Content/bootstrap-theme.css" rel="stylesheet" />
</head>
<body>
    <div class="navbar navbar-inverse navbar-fixed-top">
        <div class="container">
            <div class="navbar-header">
                <button type="button" class="navbar-toggle" data-toggle="collapse" data-target=".navbar-collapse">
                    <span class="icon-bar"></span>
                    <span class="icon-bar"></span>
                    <span class="icon-bar"></span>
                </button>
                <a class="navbar-brand" href="/">KO Components</a>
            </div>
        </div>
    </div>
    <div class="container body-content" style="padding-top:50px">
        <h2>Dipping your feet into KnockoutJS Components</h2>
        <hr />
    </div>
    <footer class="navbar navbar-fixed-bottom">
        <p>&copy; 2014 - Still Learning</p>
    </footer>
    <script src="Scripts/jquery-1.9.0.js"></script>
    <script src="Scripts/bootstrap.js"></script>
</body>
</html>

Now if I run this in Visual Studio we’ll get the following page:


Let’s see some KO!

I am assuming you have downloaded the KO library as instructed above; add it to the Scripts folder and add a reference to it in our HTML file.


Now, our app is going to be all HTML and JavaScript, and I don't see the need for CSHTML files at the moment. So let's create an App folder in the application to consolidate 'our stuff'.

We add a folder called App and add a js file called greeting.js to it.

In this file we add a simple view model with two properties greeting and date. For now we’ll hardcode the greeting to a standard “Hello World”.

var viewModel = {
    greeting: ko.observable("Hello world!"),
    date: ko.observable(new Date())
};

$(function () {
    // apply the bindings once the DOM is ready
    ko.applyBindings(viewModel);
});

Finally we add a reference to this script in our Index.html and add a couple of spans to show our greeting.

<div class="container body-content" style="padding-top:50px">
    <h2>Dipping your feet into KnockoutJS Components</h2>
    <hr />
    <div class="container-fluid">
        <div> Hello <span data-bind="text: greeting"></span></div>
        <div> It is <span data-bind="text: date"></span></div>
    </div>
</div>

<script src="Scripts/jquery-1.9.0.js"></script>
<script src="Scripts/bootstrap.js"></script>
<script src="Scripts/knockout-3.2.0beta.debug.js"></script>
<script src="App/greeting.js"></script>

Refresh the Index.html on your browser and you should see something similar:


Now, if you are wondering what's so great about this and thinking it's exactly how KO always works, you are right!

Where are my komponentz?!?

Well, let's say we want to convert our 'Greeting' HTML + view model into a reusable component that can be applied anywhere we want. Hello, KO components!!!

Change the greeting.js to the following:

ko.components.register('greeter', {
    // The register method needs a config object with
    // 2 properties

    template: // template is a string with the HTML template to apply
        // Here we have directly hardcoded the string we originally
        // had in index.html
        "<div class='container-fluid'>" +
            "<div> Hello <span data-bind='text: greeting'></span></div>" +
            "<div> It is <span data-bind='text: date'></span></div>" +
        "</div>",

    viewModel: function () { // viewModel that can be an object or function
        this.greeting = ko.observable("Hello world!");
        this.date = ko.observable(new Date());
    }
});

$(function () {
    // We have removed the explicit reference to the viewModel
    ko.applyBindings();
});

As we can see above, we have used the new ko.components API to register a new component called 'greeter' (the first parameter).

The component needs two parts to be initialized properly: the HTML template (in the template property) and the view model (in the viewModel property of the initialization object).

As of now, we have hardcoded the HTML that we had added to our Index.html as the template. Later on, we’ll see how to get it from a separate file etc.

We have used the constructor function technique to define the view model with the same two properties we had earlier. Why? We'll see in a minute.
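Here is a short sketch of the reason (my reading of it, ahead of the reveal): KO can construct the function with new for every component instance, so each tag on the page gets its own state rather than all of them sharing one object. The GreeterViewModel name below is just for illustration:

```javascript
// Sketch: a constructor function yields independent state per component
// instance, which a single shared view model object would not.
function GreeterViewModel() {
    this.greeting = "Hello world!";
    this.date = new Date();
}

var first = new GreeterViewModel();
var second = new GreeterViewModel();
first.greeting = "Changed";
console.log(second.greeting); // Hello world! (unaffected by first)
```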

Now that our ‘component’ is ready how do we ‘use’ it? Simple, the name of the component is also the name of the tag, so switch back to Index.html and update the body as follows:

<div class="container body-content" style="padding-top:50px">
    <h2>Dipping your feet into KnockoutJS Components</h2>
    <hr />
    <greeter></greeter>
</div>

Refresh your Index.html in browser and you’ll see things still work as they were! Congratulations, you’ve just built your first component.

Passing Parameters to components

Well, the hard coded Greeting is not quite flexible, what if we wanted to pass in a message to the component?

It is very simple to pass parameters into a component. Add a 'params' attribute to the tag and pass in name: value pairs. You will get them as an object in the constructor of your viewModel and can use them accordingly.
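In plain-JavaScript terms (a sketch of the idea, not KO's actual binding parser), the params attribute text becomes an object literal that is handed to the viewModel constructor; the Greeter name here is hypothetical:

```javascript
// Sketch of the params flow: the attribute params='name: " Sumit!"' is
// parsed by KO into an object, which arrives as the constructor argument.
function Greeter(params) {
    this.greeting = params.name;
}

// Roughly what KO does with the markup:
var vm = new Greeter({ name: " Sumit!" });
// vm.greeting is now " Sumit!"
```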

So change the component setup in greeting.js as follows

ko.components.register('greeter', {
    // The register method needs a config object with
    // 2 properties
    template: // template is a string with the HTML template to apply
        // Here we have directly hardcoded the string we originally
        // had in index.html
        "<div class='container-fluid'>" +
            "<div> Hello <span data-bind='text: greeting'></span></div>" +
            "<div> It is <span data-bind='text: date'></span></div>" +
        "</div>",
    viewModel: function (params) { // viewModel that can be an object or function
        this.greeting = ko.observable(params.name);
        this.date = ko.observable(new Date());
    }
});

Next update the Index.html to pass parameters to our component.

<greeter params='name: " Sumit!"'></greeter>

Refresh the page and you see the following:


Copy-paste multiple <greeter> instances and pass different names to them:

<greeter params='name: " Sumit!"'></greeter>
<greeter params='name: " Optimus!"'></greeter>
<greeter params='name: " Bumblebee!"'></greeter>

Refresh Index again and things work as expected!


Congratulations, you have built your first KO component!

This concludes the first 15 (approx.) minutes of Steve's talk. Lots more is in store. As I explore these things, I'll continue to share what I learn, the first of which will be using RequireJS and asynchronous module definition. So watch out for the next part in the series.

Source code on my Github repo here –


Getting Started with Train Modeling: My first Hornby Train and Layout

Train modeling is one of those geeky things that I have long aspired to, but never gotten around to, primarily due to time constraints. Also, I've been waiting for Junior to grow up enough to be able to enjoy it too, and so I can justify (to the better half) buying these things in his name ;-).

Anyway, we bought our first model train-set a couple of weeks back, with a scheduled delivery for Saturday, perfect for an upcoming long weekend. Surprisingly, it arrived on schedule on Saturday morning at the Collect Plus collection center, and I sneaked it home while Junior was doing his math practice! He got a real kick when he actually saw it.


As pictured above, the Hornby Western Master kit consisted of:

  1. Steam Loco with GWR (Great Western Railway, England) Livery. The loco has a bare metal DCC controller built in. Bare metal as in, it goes forward and reverse, that’s it. Don’t expect sound, smoke etc. etc.


  2. There are three carriages (one low flat bed, one open carriage and one brake van). The green cargo box on the flatbed is loose and can be replaced by other small toys. Son's favorite cargo is one of his die-cast cars.
  3. All the parts are reasonably high quality plastic. By high quality I mean not cheap looking, with well finished paintwork, decent enough details and no jaggy/plasticky edges. Here is a closeup of the brake van.
  4. There is a 'Track mat' that accompanies the kit. It actually is a giant paper poster with a proposed layout. However, the quality of the paper is, again, rather good and it will rip only if you try very hard or poke it with sharp objects. Gently stepping on it (tested up to 70kgs) or crawling on it doesn't rip it. It's a massive poster, 1.6 meters by 1.2 meters (massive as compared to the space available in our living room). I had originally hoped to put the track on an unused table, but the 'Track mat' was exactly double the size of the table. The tracks that come with the kit have rather gentle curves, so the layout occupies the entire mat and you can't make it smaller. This is what it looks like fully laid out.
  5. Note the siding track provided at the top of the image, complete with a stop buffer. The track position can be changed between siding and main line manually. The inner tracks are actually printed on the mat, but Hornby sells each set separately. So eventually you could buy all the sets and complete the entire layout.

Setting up the layout

The first thing we did was set up the track. Each track piece comes with fishplates at opposite ends. Initially I was surprised by the amount of pressure it took to join two together, but if you make sure you have aligned the track and the fishplates correctly, they fit in rather smoothly. The reason they are so tight is because the track powers the loco, and without the fishplates pressing down hard the circuit won't be complete.

One of the straight track pieces has the power connector. It has two green push clamps. Green is supposedly digital and orange is 'analogue'.

Next I installed the Railmaster software. Instead of using the CD, I downloaded the software directly from Hornby's site. Make sure you install it as Administrator. If you use the KEY that's in the CD case, it will register your computer and you won't be able to use the key again. They do provide an 'Unregister' on uninstall, but I haven't tried it out yet. I am still on the 30 day trial.

Controlling the train from your computer

After software installation, power up the Elite controller using the transformer that came with the kit. Connect it to the computer using the provided USB cable.

There is another 2-wire cable; this connects the Elite controller to the track (and provides power to it). Plug it into the corresponding ports on the Elite.

Windows 7 will detect it as an RS232 emulation device. If you get a chance to cancel the automatic driver installation, do it. Otherwise, after it has installed, go to the Control Panel and update the driver for the newly installed USB port device. When updating the driver, select the option of providing your own driver, select the C:\Program Files (x86)\Railmaster folder and pick the Vista driver (yes, the Vista driver)!!!

Once the driver has been updated, you are not off the hook yet. The Railmaster software can talk to the Elite device only if its port number is less than 12. I got a port assignment of 18 from Windows, so I manually set it to an unused port, 8. Windows gave a warning but continued to work. I think Windows 7 can reshuffle ports in these scenarios, BUT it doesn't remember the updated port assignment once you unplug the Elite. So when you re-plug it later you have to reset the port number for the Elite again.

The application recognized the Loco by default and was able to connect to it and control it. Though it looks like it was built using Visual Basic 4, the basic functionality works fine. I was able to drive the train in reverse and forward without a glitch. The shunt and cruise speeds work fine as well.

Programming the Train

Here programming doesn't mean writing code, but more like creating macros using a visual list designer. You can specify train speed, direction and duration for each step. Once you are done, you can run the 'program' for the train. I have still to figure out how to run a program in a continuous loop. More as things happen.

Wrapping up for the day

In conclusion, train modeling is loads of fun but it’s not a cheap hobby. You need lots of space, time and patience to build your layout over time. Most people have dedicated garage/rooms for their layouts and put together parts painstakingly over a period of time.

I have to do some 'jugaad' (out of the box thinking) to work around the space constraint. I have some ideas in my head; watch this space. The next step will involve giving the track a decent base and the ability to flip between siding and main tracks automatically. That requires another digital decoder and a 'point motor' (told you it isn't cheap or a one-time thingy). More on what that means after I am done with it. Cheers!


Setting up my new Dev Rig and solving the DPC_WATCHDOG_VIOLATION error

Last month I finally set up my new dev rig. Yes, I still prefer desktops for development (call me old school)! The configuration is as follows:

  1. Processor: Intel Core i5 4570
  2. Motherboard: Gigabyte H87-HD3
  3. RAM: Crucial Ballistix Tactical (8GBx2) 16GB @1600MHz  
  4. Power Supply: Corsair CX500
  5. Case: Aerocool GT Advance Mid Tower Interior USB3 12cm Red LED Fan Screwless – Black
  6. SSD: Kingston 240GB V300 (Recycled from my MBP)

Here's the Amazon Wishlist with all the items – My Haswell Build


Putting it together

I have assembled quite a few machines in the past, but this was my first LGA processor. So after a few YouTube videos on how to put one in, I had the confidence to do it. The processor installation went without a hiccup. The heat sink and fan gave a few anxious moments with the amount of pressure required to bolt all four corners down, but I did it without damaging anything.

The rest was reasonably easy. When installing RAM (if installing 2 sticks), follow the color coding on the slots and check your M/B instruction manual to confirm which set you should use first. If you are putting in 4 sticks, hopefully they are all evenly matched, so the color coding doesn't matter.

The benefit of a nicely matched case was evident when I was able to use the USB3 header on the M/B to wire up the USB3 port in the front.

The front panel audio headers work, but sound from the built-in sound card is pathetic. I can hear the hissing of the fan and the whining of the PSU, and even moving the wireless mouse results in funny squeaking sounds. I use my Blue Yeti (microphone) as an external sound card; it has a very neutral sound and I can use headphones without the annoyances of the internal sound card.

This case has two fans built in, both come with 3 pin cables that you can plug on to the motherboard, allowing finer control of Fan speed from OS/Additional Software.

The power supply goes at the bottom of this case. One suggestion: before putting in the power supply, put the SSD in. Once the P/S is in, there are way too many cables to get out of the way (unless you are using a fully modular P/S, in which case the sequence doesn't matter). The SSD sits just above the 3.5” bays (there is place for only one SSD; for more than one, you'll need a 3.5” to 2.5” converter). The case comes with a pamphlet that suggests you bolt it to the grill (above the 3.5” bays), but if you do so, it's difficult (if not impossible) to plug in the SATA and power cables.

This case doesn't have a power or 'HDD' light header; it does have a molex connector for the red LEDs that serve as the 'power' lights. Plug it into any of the molex power connectors from the P/S.

Given that we don’t have HDDs any more, the ‘HDD’ light doesn’t really make much sense.

The 5.25” drive bay covers can be pinched out, but don't press too hard or you'll snap something. I snapped a hook and had to superglue it back.

BIOS Defaults

The default BIOS settings are pretty good. Just make sure you set the disk controller to AHCI (I don't remember what the default was). This is important for getting the best out of your SSD, and you can't change it after the OS is installed.

I also turned on the XMP memory profile; this makes full use of the 1600MHz memory bandwidth, else your RAM will run at 1333MHz.

The CX500 and Haswell

After I purchased the CX500, I found various articles saying the CX500 was still not Haswell compliant. Haswell processors have a special power save mode that draws even less power than the ATX standards allow; I think it's called S7 or something. This results in the system not coming out of power save mode. However, I have not encountered any issues with the P/S. I put my machine to sleep overnight regularly and it comes back without fail. So newer batches of the CX500 should be fine. There is an 'official' CX500 V2 also, but mine doesn't say V2. Either way, it works fine for me. I'll update on longevity later.

Windows Phone 8 SDK Installation and DPC_WATCHDOG_VIOLATION

I installed Windows 8 on it first and then upgraded to 8.1 (complete overwrite). After running updates repeatedly till all updates were in place, I installed the Windows Phone 8 SDK. As soon as it re-booted, I got a BSOD saying DPC_WATCHDOG_VIOLATION. I ignored it initially and installed the latest Visual Studio 2012 service pack (without which the WP emulator doesn't work on Windows 8.1). After that the WP emulator worked, but every reboot would result in a watchdog violation. This upset me terribly. No amount of driver installs from the driver disk/manufacturer's site etc. helped. Ultimately, by uninstalling Hyper-V (and effectively killing any chance of WP development), I was able to 'fix' the reboot issue.

After a lot of Ducking around and posting on Eight Forums, I did a Windows 8.1 Reset. However, after the reset, as soon as I installed the WP 8 SDK, the BSOD returned. This time I installed Gigabyte's APP Center app from the CD that accompanies the motherboard. Using the APP Center, I updated all the drivers and then updated the BIOS. The motherboard came with V5; I updated it to V6 (the latest available). The APP Center is a really cool way to apply BIOS updates. Just point to one of the servers offered, wait for the download to complete and hit OK. Thanks to the dual BIOS scheme, the app can flash the BIOS on the fly. Once done, it reboots.

After the reboot the DPC_WATCHDOG_VIOLATION was resolved. Phew!

Side Note

If you have an Office 365 for Business account, don't set up Mail, or pay very close attention to the warning that Mail gives about changing security settings. Once the security settings were applied, I lost admin rights. I couldn't install SQL Express or create a new user account, and in general had permission issues all around. This is why I actually did the reset. I haven't reapplied the security settings. Not sure if this is the root cause, but for now I don't have the permission problems I had earlier, so I intend to keep things that way.

SSD Performance

I had high hopes for my Kingston V300, but was surprised to see its results compared to my V200 back in India.

(AS SSD benchmark results for the Kingston SV300S3)

As you can see, the V300 outpaces the V200 overall, but the 4K read/write speed is actually lower than the V200's. I wonder why! Maybe the last six months playing second fiddle in my MBP took its toll.


I built this rig on a tight budget, but it's a decent performing setup thanks to the ample RAM and the SSD. Honestly, once you go SSD you just can't go back to an HDD. Maybe I'll add a Crossfire setup someday (unlikely), but it is likely to get more RAM and bigger SSDs in the future.

