Going Freelance

In December of last year, I decided to leave my job of 11 years to concentrate full time on freelance work, and to try and build up a software company of my own – Nighthawk Software.

8 months in, and it’s been a lot more challenging than I thought, but at the same time a lot more rewarding than anticipated.

One of the biggest challenges I’ve faced has been trying to define exactly what it is I do.

I’ve always concentrated on backend web development work – creating the software behind the scenes that powers web applications such as CRMs, email systems, project management suites etc. Personally, I’d call this “Web Development”.

But when people hear this, they automatically want you to design them a new website for their small business. While I often work as a full-stack developer, and I’m proficient with CSS, HTML and JavaScript, I suck at design.

I’d call that type of work “Web Design”, in order to distinguish the two, but the terms seem to be used interchangeably within the industry, particularly amongst clients.

That’s been my biggest challenge so far – explaining what I do to clients, and specifically, what I don’t do.



Beirut, Lebanon – May 2012

Everyone’s heard the saying “it’s like downtown Beirut” – and it’s never meant in a good way. When Lebanon descended into civil war in 1975, central Beirut became one of the most fought over parts of the country. The constant fighting made downtown Beirut a no-go zone. But that all ended in 1990, and I’d read good things about it since – so when a friend won a pair of free return tickets anywhere on bmi’s network, it seemed like the logical choice.

A Sprawling City
An overview of the city, from the nearby hills

While relatively safe nowadays, there are still a few things to be aware of when visiting Beirut. The main one is the country’s neighbour – Syria. The concierge at our hotel had recommended visiting some ancient ruins to the east of Beirut, but when checking a map we found these ruins were just 20 miles from the border – an area the Foreign Office had warned us to avoid at all costs, as kidnappings of western tourists near the border happened occasionally. We were also told to avoid the north of the country, for similar reasons. That’s fine – I didn’t intend to stray too far from Beirut anyway!

News reports the day before revealed there had been a car bomb on the outskirts of Beirut – just off the main road from the airport. It all adds to the fun, right? People pay good money for excitement like this…

We decided to spend our first day exploring central Beirut. Our hotel was located close to the coast on the northern tip of the city, so we headed down to the shore and started walking along the relatively newly constructed promenade.

A fisherman casts off into rather rough seas, right in the centre of Beirut

Redeveloped Waterfront

As you start to explore, you immediately see the amount of redevelopment taking place in Beirut. Once the most important financial centre in the Middle East, the city saw a drastic withdrawal of investment during the civil war, but that investment is now starting to return. Reminders of the city’s past are still there to be seen, though: while Versace may be building a new luxury apartment block, right next door stand the shelled remains of the former Holiday Inn.


The Past Remains
New developments alongside the ruins of the former Holiday Inn


Once the tallest building in the city, the Holiday Inn provided an ideal location for snipers during the civil war. As a result, it became one of the most fought-over buildings, as each side wrestled for an advantage over the other. The remains can still be seen, along with many other buildings still riddled with bullet holes.


Old Meets New
Ancient Roman Ruins

The Holiday Inn isn’t the only throwback to the city’s past. Beirut was also an important Roman town, and a former Roman bathhouse discovered in the 1960s has been well preserved and is open to the public to explore. Lebanon changed hands many times over the centuries, most recently coming under French rule. The French influence is very apparent in central Beirut, with wide avenues adorned with street-side cafes. You could easily forget you are in Beirut and imagine yourself in Paris.


French Influence
Wide, cafe-lined streets – the French influence is apparent.

Hamidiya Clock Tower

The following day, we decided to head out of town and explore some of the surrounding area. The concierge at the hotel suggested we hire a driver for the day, and after pricing it up, we decided it was indeed the way to go. Slightly more expensive than an organised bus tour, but it gave us the flexibility to go where we wanted, when we wanted. No rushing about, or waiting on others. We took a drive up into the hills to get an overview of the city, before heading up the coast to a beach town the driver had suggested for lunch. It appeared to be a popular spot, and despite the stony beach, the beautiful blue sea certainly made up for it.

White Stone Beach

After lunch we continued up the coast to our destination – Byblos, thought to be one of the oldest continuously inhabited cities on earth, tracing its roots back to 8800 BC. The city is built around a rather picturesque old harbour, while narrow market streets wind their way inland from there. Unfortunately, as is the case throughout the world, the market streets are now selling tourist tat, rather than the fresh produce and wares of days gone by.

Byblos Harbour
Narrow Market Streets

All in all, Beirut was a fantastic place, and somewhere I would highly recommend visiting. Very cosmopolitan, with a strong French influence, but with its own unique Lebanese twist.

For more photos from my trip, please see my flickr account: https://flic.kr/s/aHskQ5FYZ4


Using WebAPI? Disable WebDAV To Avoid PUT/DELETE Issues

This one turned out to be a right head scratcher!

A WebAPI that worked fine in development, but as soon as I deployed it to the live server, any DELETE command started returning “500 – Internal Server Errors”. I stripped the code back as far as I could, to the point where the DELETE function was only returning a string. Nada, still getting errors.

I then ran the request directly on the server itself, bypassing the standard error page to get the real error. It was all down to WebDAV.

It turns out that by default on IIS, WebDAV is configured as the binding for DELETE and PUT requests.

Even if the WebDAV feature is disabled (as it was in my case), the binding remains in place: IIS still passes the request to WebDAV, which immediately rejects it because the feature is disabled – hence the 500 error.

If you are using WebAPI, you need to remove the WebDAV module and handler in your web.config file, inside the <system.webServer> section, as follows:

  <system.webServer>
    <modules>
      <remove name="WebDAVModule" />
    </modules>
    <handlers>
      <remove name="WebDAV" />
    </handlers>
  </system.webServer>


Never Trust Your Hosts With Your Backups

On Saturday, 123-Reg informed their customers that one of their scripts had gone wrong, resulting in around 67 servers being accidentally deleted. As these were all unmanaged VPS servers, there were no backups.

This came just days after someone posted on Server Fault asking for help after one of his scripts also went wrong, deleting the accounts and data of all 1,500 of his customers. All his backups were deleted by the same script.

While the latter case is now claimed to have been a publicity stunt, it does raise an interesting question about the safety and security of web hosts own backup solutions.

The Dangers Of Trusting Your Backups To Your Web Host

Often a web host will offer a backup solution for an additional price on top of the hosting package. This is often configured as a location on a remote backup server, which is mounted as a drive on the web server for easy access.

This seems to be the kind of backup solution offered in the second example above, and as we can see it proved to be next to useless. Sure, it would help in the event of a physical hardware failure on the server, but in every other case it is of no use. It doesn’t protect against mistakes made by the host, it won’t protect against hackers or viruses penetrating the server, and it certainly doesn’t help in the event of the host going bankrupt.

If the lights go out at your hosting provider – how do you access the backup to restore your data on your new server?

The Preferred Way – A Third Party

The only way you should ever consider doing backups is with a third party company. In almost all cases, they will provide you with a piece of software that does the physical backups. This creates a nice air gap between your server and the backup, keeping them safe from viruses and hackers. The fact that they are owned by a different company protects you against bankruptcy, and they will almost certainly be in a different physical location, again offering protection from extended failures or loss of a data centre.

Anyone not using a third party for backups is just putting all their eggs in one basket. And we all know not to do that, don’t we?


Bloggers: Always Date Your Posts!

One of my (many) pet peeves is blog posts or articles that don’t have a date on them. The world of technology moves quickly, and technical articles can very quickly become obsolete.

Recently I was implementing Stripe payment support on a website I run, and was looking for articles explaining how to use it with C#. A Google search produced plenty of articles; however, it quickly became apparent that the Stripe API had changed in the last few months, and therefore anything written more than 3 months prior was obsolete.

Google makes it easy to identify old articles: it shows the date in the search results, allowing you to skip over the article.

google dates

Yet despite this, I still frequently come across blog posts and technical articles that don’t contain a date.

PLEASE: Always date your posts, so we don’t have to waste time reading them to find out whether what you are saying is still relevant, or has become obsolete.


Remote Desktop & High DPI Displays

When I received a new work laptop last month with a 4K screen, I began to come across all sorts of issues with various programs. While some applications scale nicely, others do not. One of my biggest pains has been using Remote Desktop.

Here’s a typical remote desktop session window:



With the server running at 1600×1200, and my desktop at 3840×2160, you can see that the remote desktop window takes up less than a quarter of the screen. On a 15″ screen, that makes the text almost unreadable!

Remote Desktop doesn’t support scaling, but fortunately Remote Desktop Connection Manager does. You can download it here: http://www.microsoft.com/en-us/download/details.aspx?id=21101

Now I can scale the Remote Desktop window, and save having to squint at the screen all day long!




Windows Power Plans

I recently received a new laptop at work – a 2.4 GHz i7 with 16 GB of memory.

That’s quite a beast of a laptop, so you’d think it would have no problem coping with Visual Studio 2013. But over the last few days it’s been frustrating me a little – there’s a noticeable lag when typing, and Visual Studio as well as various other programs were performing a little slow.

I immediately disabled all extensions, and removed the recently installed virus checker, but to no avail.

It wasn’t until I checked Task Manager that I noticed the problem: CPU usage was showing as 0% of 0.77 GHz – the CPU was being throttled to a fraction of its rated speed.

That can’t be right!

It turns out I had switched the power profile last week while running on battery, and changed to “Power Saver” mode. I thought these settings only took effect when running on battery? Apparently not. I’m now back on “High Performance”, and Visual Studio is once again responsive.
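As a side note, power plans can also be listed and switched from the command line using the built-in powercfg utility. SCHEME_MIN (minimum power savings) is the standard alias Windows defines for the High Performance plan:

```
rem List the available power plans - the active one is marked with an asterisk
powercfg /list

rem Switch to the High Performance plan
powercfg /setactive SCHEME_MIN
```

Handy if you want to script the switch rather than dig through the Control Panel each time.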


Changing The DPI Scaling In Windows 8

Screen real estate is important – so much so, when I was looking for a new laptop, it was one of the key factors I considered. I like to have enough space to have the Visual Studio Solution Explorer open, while still being able to see plenty of code, and ideally still have enough room to see at least part of the web browser next to it – the more the better.

Despite wanting a portable laptop with a 13″ screen, I wasn’t prepared to accept a screen resolution any less than 1920×1080. I eventually found one that met my needs.

On receiving the laptop however, I was disappointed to see that I didn’t have as much space as I had expected. I could barely view a single web page, and to say VS was crushed was an understatement. I double checked the screen resolution, and sure enough it was indeed 1920 x 1080. What was wrong?

It turns out Windows 8 will scale the DPI depending on the size of screen and resolution selected. While this gives a sharper image, it comes at the expense of real estate. It’s an easy enough fix if you’d rather have the space back, however.

How To Adjust Size In Windows 8

Right click anywhere on your desktop. In the popup menu that appears, select “Screen Resolution”


The Screen Resolution dialog will appear. At the bottom of this dialog, click “Make text and other items larger or smaller”


You can now adjust the slider to resize items on the screen. I’ve set mine to the smallest possible setting, but on a 13″ screen this may be a little too small. I’ll need to have a play around with this over the next few days.




Multi-Tenancy System With Separate Databases in MVC


With improvements in broadband and web technologies, we are seeing a shift away from traditional desktop applications towards web-based systems. The cloud is all the rage these days. Accounting packages such as Sage and QuickBooks are being replaced by online alternatives such as KashFlow and Wave Apps.

Rather than creating a unique software instance per customer, the likes of KashFlow and Wave Apps have developed their systems as multi-tenancy applications – a single instance of the software is shared by all customers, and each customer’s data is separated from everyone else’s by the architecture of the system.

Multitenancy refers to a principle in software architecture where a single instance of the software runs on a server, serving multiple tenants. A tenant is a group of users sharing the same view on the software they use. – Wikipedia

This can be achieved in two ways:

  • Single Database – A single database is created, and all data is stored here. Each record is assigned a tenant key, and only data belonging to that tenant is accessible. Access is restricted by the application software.
  • Multiple Databases – Alternatively, a separate database can be used to store each customer’s data. Access to the database can then be restricted by using SQL login credentials.
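As a rough sketch of the difference (note that SharedDataContext and the TenantId column are illustrative only – they are not part of the code later in this post):

```csharp
// Single database: every query must filter on the tenant key.
// Forgetting the filter would leak another tenant's data.
IList<Job> GetJobsSingleDb(SharedDataContext db, int tenantId)
{
    return db.Jobs.Where(j => j.TenantId == tenantId).ToList();
}

// Multiple databases: the tenant's account record tells us which
// database to connect to, so queries need no tenant filter at all.
IList<Job> GetJobsMultiDb(Account tenant)
{
    var db = new SystemDAL.DataContext(tenant.Database);
    return db.Jobs.ToList();
}
```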

While I have used the single database approach many times, when starting a recent project it became apparent that the multiple database approach may be more suitable.

Advantages of the Multi Database Approach

One of the main advantages of the multi-database approach is that it makes it possible to back up and restore an individual customer’s data. With a single database, restoring would wipe out changes for all customers, making it impossible to offer roll-back functionality in the event a single customer makes a mistake.

Additionally, should the site become extremely successful, multi-database systems allow data to be moved between servers very easily on an individual basis.

The main selling point in my case, however, was the anticipation that a number of clients might require customisation of the system beyond what can be achieved in a shared design. By using separate databases, an individual tenant can be moved to a new server and fully customised if needed. While this works against the point of a multi-tenancy system in the first place, it offers flexibility and future-proofing that a single-database system would not.

Architecture of A Multi-Database System

Architecture of a Multi-Tenancy Application

In a multi-database multi-tenancy system, each tenant’s data is stored in its own database. A separate database is therefore required to hold login details and record where each tenant’s data is stored. This could point to a database on the same server, or to a remote data location.

How to Create A Multi-Database System With MVC 5

In Visual Studio, create a new ASP.NET Web Application

Create a new project


Select MVC as the template type, and in “Change Authentication”, ensure “Individual User Accounts” is selected. We will use forms authentication for this example.

Select MVC as the type

Ensure Individual User Accounts is selected

First, create a folder called AccountDAL – we will use this to store all the code for accessing the Account data store.

Create a folder for the account DAL


Create a new class, and name it DataContext.cs. Add the following code:

public class DataContext : DbContext
{
   public DataContext() : base("accountContext") { }

   public DbSet<Account> Accounts { get; set; }
   public DbSet<User> Users { get; set; }
}

We will use Entity Framework Code First to generate a DataContext that represents the data stored in our Account database. There will be two tables:

Accounts – An account represents a single tenant. This data will contain the location of the tenant’s data store. Each account can have multiple users.

Users – contains the login username and password for all users of the system. Each user is tied to an account.

Add a connection string to web.config to connect to the account database:

<add name="accountContext"
connectionString="Server=desktop\SERVER2012; Database=Accounts;
Integrated Security=SSPI" />

While in web.config, we will also check that the auth mode is set to forms authentication:

<authentication mode="Forms">
  <forms loginUrl="/Account/Login" cookieless="UseCookies" />
</authentication>

Next, let’s create two classes to represent the tables in our database, User.cs and Account.cs:

public class User
{
   public int Id { get; set; }
   public string Email { get; set; }
   public string Password { get; set; }
   public string Name { get; set; }
   public int AccountId { get; set; }
   public virtual Account Account { get; set; }
}

public class Account
{
   public int Id { get; set; }
   public string Name { get; set; }
   public string Database { get; set; }
   public virtual ICollection<User> Users { get; set; }
}


We now need to create our database. Create a new database called Accounts, and add two tables called Users and Accounts, as follows:



Add the following test data to each:
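In case the screenshots don’t come through, here is a T-SQL sketch of the schema and test data. Only the table and column names, and the Company1/Company2 database names, come from the code in this post – the column types, company names, emails and passwords are my own illustrative choices:

```sql
CREATE TABLE Accounts (
    Id INT IDENTITY(1,1) PRIMARY KEY,
    Name NVARCHAR(100) NOT NULL,
    [Database] NVARCHAR(100) NOT NULL  -- name of the tenant's own database
);

CREATE TABLE Users (
    Id INT IDENTITY(1,1) PRIMARY KEY,
    Email NVARCHAR(100) NOT NULL,
    Password NVARCHAR(100) NOT NULL,
    Name NVARCHAR(100) NOT NULL,
    AccountId INT NOT NULL REFERENCES Accounts(Id)
);

INSERT INTO Accounts (Name, [Database])
VALUES ('Company One', 'Company1'),
       ('Company Two', 'Company2');

INSERT INTO Users (Email, Password, Name, AccountId)
VALUES ('user@company1.example', 'password1', 'User One', 1),
       ('user@company2.example', 'password2', 'User Two', 2);
```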

Finally, let’s make a couple of changes to our Login and Logout function to use FormsAuthentication:

public ActionResult Login(LoginViewModel model, string returnUrl)
{
   if (ModelState.IsValid)
   {
      var dataContext = new AccountDAL.DataContext();
      var user = dataContext.Users.FirstOrDefault(x => x.Email == model.UserName && x.Password == model.Password);

      if (user != null)
      {
         FormsAuthentication.SetAuthCookie(model.UserName, false);
         return RedirectToLocal(returnUrl);
      }

      ModelState.AddModelError("", "Invalid username or password.");
   }

   // If we got this far, something failed, redisplay form
   return View(model);
}


The above code creates a new instance of our Account DataContext, and checks that the username and password match an existing user. If so, we set an auth cookie, which logs the user in. (For simplicity the passwords here are stored and compared in plain text – in a real system they should always be hashed and salted.)

And logout:

public ActionResult LogOff()
{
   FormsAuthentication.SignOut();
   return RedirectToAction("Index", "Home");
}
The above code will clear the auth cookie we set earlier. This will have the effect of logging the user out of the system.

If we now run the project, we will be able to log in as either of the two companies we created in the test data. All very straightforward.


Now comes the multi-database part.

Let’s create two new databases, one for each of our companies. Call them “Company1” and “Company2”, as we specified in the “Account” table of our test data. In each, create a new table called Jobs, as follows:


Add a couple of test jobs in each database:

Company 1 test data:

Company 2 test data:
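Again, in case the screenshots don’t render, the Jobs table (matching the Job class defined in the SystemDAL code) and some test rows could be created like this – the job names are made up for illustration:

```sql
-- Run the CREATE TABLE against both the Company1 and Company2 databases:
CREATE TABLE Jobs (
    Id INT IDENTITY(1,1) PRIMARY KEY,
    JobName NVARCHAR(100) NOT NULL
);

-- In the Company1 database:
INSERT INTO Jobs (JobName) VALUES ('Company 1 - Job A'), ('Company 1 - Job B');

-- In the Company2 database:
INSERT INTO Jobs (JobName) VALUES ('Company 2 - Job A'), ('Company 2 - Job B');
```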


Now, back in Visual Studio, create a folder called SystemDAL to store all our data objects that relate to the actual system.

First, create a new class called DataContext.cs:

public class DataContext : DbContext
{
   public DataContext(string database)
      : base("Data Source=desktop\\Server2012;Initial Catalog=" + database + ";Integrated Security=True")
   { }

   public DbSet<Job> Jobs { get; set; }
}

This is where we implement our multi-database logic. Rather than pass in the name of a connection string to the DataContext base constructor, we will instead build our own, using a database name passed in to the DataContext constructor. This will be taken from the Account table in our database.

Create a second class to represent a job object:

public class Job
{
   public string JobName { get; set; }
   public int Id { get; set; }
}

We will now modify the Home\Index() action to load the current user’s data:

public ActionResult Index()
{
   // get the current user:
   var accountContext = new AccountDAL.DataContext();
   var user = accountContext.Users.FirstOrDefault(x => x.Email == User.Identity.Name);

   if (user != null)
   {
      // now we have the current user, we can use their Account to create a new DataContext to access system data:
      var systemContext = new SystemDAL.DataContext(user.Account.Database);
      return View(systemContext.Jobs);
   }

   return View();
}

The above code first creates an instance of our Account DataContext, and gets an object representing the currently logged-in user. From this, we can create a System DataContext instance, passing in the name of the database we wish to connect to.

Once connected, we can pass a list of all the company’s jobs to the View.

Modify the Index view as follows, replacing the existing code:

@model IQueryable<MultiTenancy.SystemDAL.Job>

@{
   ViewBag.Title = "Home Page";
}

@if (Model != null)
{
   foreach (var job in Model)
   {
      <p>@job.JobName</p>
   }
}

There we have it – a multi-tenancy web application that stores each tenant’s data in a separate database!


Always Hide Your Webstats Code On Your Dev Server

While Google Analytics may have been the bee’s knees of web stats in the past, it seems to have taken an almighty tumble from the top spot in recent years. The interface has gone from being one of the cleanest and easiest to use, to one of the most confusing out there.

I decided it was time to consider moving away from Google Analytics, so recently I have been playing around with a few different web stat packages, running multiple at once on the same page. In doing so, I began to notice that some packages were reporting a higher number of views than others.

When I started to dig deeper to see what was causing this, I discovered that some of the packages were recording a large number of hits to pages at `localhost:page_name`.

Yep, it seems that not all of the packages are clever enough to filter out page views from your development server.

To resolve the issue, I found a nice little hack that only displays the tracking code on the live server.

The ASP.NET property `Request.Url.Host` returns the host name by which the site was accessed – by simply checking this value, you can tell whether the site is live or running on a dev server, and hide the tracking code accordingly.

@if (Request.Url.Host != "localhost")
{
    // show tracking code - site is live
}
