Posts

Showing posts from 2011

Debugging the SSIS Script Component

I've been doing a lot of work with SSIS recently, and learning a lot on the way. One of my tasks over the last week was to convert each row of a result set into XML and post it to a web service. The web service only accepted one row at a time, and generating XML in SSIS is not the most intuitive of tasks, so I wrote a script component to do the conversion work. Anyone who's tried anything more advanced than a "Hello World" example in a script component will be able to tell you that you cannot use breakpoints in them - making it quite difficult to debug even the simplest of code. For a developer such as myself, it's like having an arm removed in a horrific train accident. I had a bit of a task ahead of me - I had to create an XML document with tags containing attributes and values, attributes and child elements, child elements and values, attributes, child elements and values, then self-closing tags with some of the above... whilst not exactly a daunting task, i
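To give a flavour of the approach (the column and element names below are invented for illustration - they're not the actual feed I was working with), a script component override along these lines builds the XML for each row with LINQ to XML and, in place of breakpoints, reports what it's doing through the component's information events:

```csharp
using System.Xml.Linq;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

public class ScriptMain : UserComponent
{
    // Called once per row in the input buffer; Row's properties are generated
    // from the input columns (OrderId, CustomerCode etc. are placeholders here).
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // Build the per-row XML with LINQ to XML rather than string concatenation.
        XElement order = new XElement("Order",
            new XAttribute("id", Row.OrderId),
            new XElement("Customer",
                new XAttribute("code", Row.CustomerCode),
                Row.CustomerName),
            new XElement("Total", Row.OrderTotal));

        // No breakpoints in script components, so surface the generated XML via an
        // information event - it shows up in the Progress / Execution Results tab.
        bool fireAgain = true;
        ComponentMetaData.FireInformation(0, "XmlBuilder",
            order.ToString(SaveOptions.DisableFormatting),
            string.Empty, 0, ref fireAgain);
    }
}
```

Dumping the XML out through FireInformation (or appending it to a text file) is crude, but it's about the quickest way to see exactly what the component produced for a given row.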

Tools Round-up

The development world has been coming up with ways to improve productivity for years. However, since Visual Studio 2010 was released, with its fully extensible model, things have seriously changed. Most tools have been modified to extend VS2010's already pretty awesome IDE. With just a few extensions, you can fill in almost all of the gaps the Visual Studio team have left in the VS experience. I'm not a massive fan of NuGet; I use it, but not for production sites. It's good for quickly implementing a library in a test project, but I just don't like the "bloat" you end up with. I like to know what every single file does and why it's there. The same goes for WebMatrix; although handy, I much prefer to manually create projects... I somehow feel like it makes it easier in the long run, even if it takes you a while to set them up in the first place. I recently purchased ReSharper (after trying the trial a number of times over the years and not getting along with it)

Getting Visual Studio 2008 to work with Team Foundation Server 2010

Recently I've been collaborating via Team Foundation Server 2010 - it's my first time, and I'll be honest... it was very gentle. It eased me in with its promise of improved Source Control (improved over Source Safe, that is) and toyed with me, sporting its ideals surrounding automated builds and work item collaboration. Whilst I've not been directly involved with the setup, using it on a daily basis has been a refreshing change. I'm one of those developers who obsessively refactors code, renames and moves files around until I'm happy everything is in the right place, so when using Source Safe, CVS or SVN, there's quite a bit of repeat work involved when it comes to making sure the source control server is also updated. Suffice it to say that TFS 2010 works beautifully in this area, keeping the local, server and project file structure the same as each other. I've been exclusively working in Visual Studio 2010 for a while now, as I've been creating n

Visual Studio 2010 defaults to an undesired browser when opening MVC projects

A while ago, I set my default browser in Visual Studio 2010 to Google Chrome by using the "Browse With..." trick, and I've been happy with it for some time. Recently I've found that I've needed to do quite a bit of testing in IE, so I wanted to change it back. I went down the "Browse With..." route again, which seemed to work... but every time I closed and re-opened Visual Studio, it reset itself back to Chrome. I've been living with it for a while, as I tend to do extended sessions anyway, so it's not been too much of a hassle... but today I've needed to close and restart Visual Studio a few times, so it's become a bit annoying. Anyway, to cut a long story short, it seems that when setting the "Browse With..." option in an MVC project, the settings aren't saved to the configuration file, probably because MVC projects don't actually have physical files to be "Browsed", unlike an ASP.Net project. I'm not ma

The ole' blog

I've just had a bit of a clean up with the blog - re-tagged most of my older posts and changed the layout slightly. I apologise if any of you have been spammed with updates - just ignore them if so :) I've also deleted some of my posts, most notably the vast majority of the posts relating to my bespoke nTier framework. Since I made them, I've made some significant changes, so the old posts don't really represent the end product any more, and will probably just confuse matters. I'm still working on it, and I've made some good progress. I'm also learning a hell of a lot in the process and getting to play with some cool tools like T4, MVC 3, Razor and unobtrusive jQuery. Instead of blogging while I'm writing it, I thought I would develop it as a standalone framework you can download and play with yourself. It's about 85% complete, and although other commitments are taking over at the moment, I'm trying my hardest to get it finished

Entity Framework 4.1 - POCO Generator: Function Import with empty return value doesn't generate

I'm in the middle of building a security model for one of my clients. They've specified that they want everything done in the ADO.Net Entity Framework 4.1, so I've been working with it quite a lot recently. Though it hasn't made a great first impression, I'll reserve my judgement until a later blog post. One of my main concerns with EF is separation of concerns / layer abstraction. It offers a fairly robust model (with its limits), but in an enterprise environment, you wouldn't want to expose that model to your presentation code. After a bit of digging (and being pointed in the right direction by a colleague), I found the ADO.Net POCO Entity Generator Visual Studio Extension, which isn't available by default... you have to download it via Extension Manager. I managed to get it to generate my POCOs (Plain Ole' CLR Objects) in a relatively short amount of time, but because I'm concerned with the dynamic SQL generated by EF, I've been writing sto
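As a rough sketch of the sort of workaround that fits here (the context and function import names are made up for the example), the generated context is a partial class, so a function import that returns nothing - which the POCO template skips - can be exposed by hand and routed straight through ExecuteFunction:

```csharp
using System.Data.Objects;

// The POCO Entity Generator emits the context as a partial class derived from
// ObjectContext, so the missing method can live in a second partial class file.
// "SecurityEntities" and "AuditLogin" are placeholder names for this sketch.
public partial class SecurityEntities
{
    public int AuditLogin(int userId)
    {
        // ExecuteFunction invokes a function import that has no result set;
        // the return value is the number of rows affected.
        return ExecuteFunction("AuditLogin",
            new ObjectParameter("userId", userId));
    }
}
```

Because it lives in its own file, the hand-written half survives the template being re-run whenever the model changes.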

Microsoft Community Contributor Award

I'm sure this isn't hard to get (although I'm not entirely sure how I managed it), but today I opened my inbox to find a lovely email from the Microsoft Community & Online Support department stating I'd won a Microsoft Community Contributor Award. I got a certificate and everything: I am now "officially" evil. I've not been that active on the ASP.Net forums for a while, but I did recently outline and troubleshoot a bug in Phil Haack's Route Debugger. After commenting, Phil sent me an email and I replied... but I've not heard anything else, so I don't really know :) Either way, it's a nice little addition to the CV and I received a few little benefits from Microsoft - time will tell if they're useful to me or not!

The benefit of T-SQL 3 part naming

So yesterday I made the second ever major technical boo-boo of my career - I accidentally ran a dynamic "DROP TABLE" script on an extremely large, high-availability system. The last time I did something of this magnitude was literally when I first started out with T-SQL, and I ran a load of DROP / CREATE PROC statements on a similar sized system, right in the middle of rush hour; instantly breaking the 500 or so clients that were connected to it. Obviously this kind of thing is very embarrassing, not to mention the massive inconvenience caused throughout the departments using the system. Luckily the database is backed up hourly (thank you Mr DBA), and although I made the mistake 6 minutes before the next backup, the impact was relatively minimal due to the main users being in a departmental meeting. I'm blogging to outline my mistake and document changes to my T-SQL script, which, up until 13:34 yesterday, made my life considerably easier and I thought was the dog

T-SQL CRUD Generation

Following on from my last post complaining about MSSQL UI code generation, I thought I would back it up with a few T-SQL scripts that generate code that I actually want to see. I'm currently developing a large-scale, real-life application using my new .Net 4.0 nTier framework, so I've decided to make my life a little bit easier by generating the boring old CRUD stored procedures with some funky T-SQL scripts. I'm also currently looking into Visual Studio's T4 code generation tools to help me create my business objects, as most of those will also be fairly standard - though I still need the flexibility to change them (and guarantee that a code generation tool isn't going to overwrite my changes) once the templates have all been created. Let me start by saying I'm usually against the use of the 'sp_executesql' stored procedure due to the additional overhead created by executing dynamic SQL. In this instance however, I'm happy to use it becaus

MSSQL UI code generation

Since the start of my career I've used SQL Management Studio (or Query Analyser in MSSQL 2000) to manually create my MSSQL database objects via code. This is a practice I have vehemently defended a number of times - being told that producing objects with the UI tools is "quicker" and "easier". Subjectively, I have to agree with this statement - any kind of well designed UI generation tool is almost certainly going to be quicker than writing a load of CREATE statements by hand... there is, of course, a caveat. I tend to find that developers who do not fully understand RDBMSs (which is the vast majority, unfortunately), or do not enjoy writing databases, always take the quick and easy approach. The problem with allowing a UI to generate code is that a large portion of it is unnecessary, and it's sometimes difficult to read. This is true of most generator engines, including those found in .Net. In terms of SQL generation, hopefully these code snippets will

Generating a self-signed SSL certificate for my QNAP NAS

I recently bought a new NAS (Network Attached Storage) to deal with some of my remote work requirements. Initially I was going to build a server to do the job from spare parts, but I was concerned about power consumption and of course the amount of time it would take to build, configure and administrate. In the end, I had a little look at a few reviews and I decided on the QNAP 219P+. I can honestly say without a doubt that it's one of the best pieces of kit I've bought for a while. It's very feature rich, (almost) everything seems to work perfectly, the interface is well organised and easy to use, and maybe the most important part - it's fast. I'm having a few issues getting the WebDAV service to accept connections, though I think it's Windows' fault, not the QNAP's. It's a little on the pricey side for home users, but I didn't mind paying what I did considering how good it is. One of the things I wanted to do was secure my remote communication w

MVC 3 Unobtrusive Validation with LINQ to SQL

Recently I've been working on a small project that uses LINQ to SQL as its data layer (it's a very simple model) and contains two presentation projects - a web forms application for the front end and an MVC 3 administration panel. When playing with the unobtrusive validation engine, you'll find many examples on the 'net telling you to decorate your model with various attributes from the System.ComponentModel.DataAnnotations namespace. Now, I ran into a bit of an issue with the LINQ to SQL objects - they're generated dynamically. I could add the attributes, but they would be overwritten any time I made a change to my DBML file. So, after a bit of searching I came across the MetadataTypeAttribute in the same namespace. This attribute allows you to create a completely separate, standard class and define it as a metadata container for another class. Neat, huh? Well, I quickly ran into my second issue - I could create my separate class, but I still had to
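To show the basic shape of the buddy-class arrangement (all of the class and property names below are invented for the example), the attribute sits on a hand-written partial class that merges with the one the DBML designer generates - which means it has to live in the same namespace as the generated class:

```csharp
using System.ComponentModel.DataAnnotations;

namespace MyApp.Data // must match the namespace of the LINQ to SQL generated classes
{
    // The DBML designer already declares Customer as partial, so this half only
    // exists to carry the MetadataType attribute and survives regeneration.
    [MetadataType(typeof(CustomerMetadata))]
    public partial class Customer { }

    // The "buddy" class mirrors the property names and holds the validation
    // attributes that MVC 3's unobtrusive validation reads.
    public class CustomerMetadata
    {
        [Required(ErrorMessage = "Please enter a name")]
        [StringLength(100)]
        public string Name { get; set; }

        [Required(ErrorMessage = "Please enter an email address")]
        public string Email { get; set; }
    }
}
```

The metadata class only needs to declare the properties you want to decorate; the framework matches them to the generated class by name.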