Thursday 24 December 2009

ASP.NET: Retrieve ID of inserted record

I needed to access the identity value returned by an Insert operation on my ASP.NET page, so that I could use it in other controls. You can do this by adding an event handler to your ObjectDataSource:

Protected Sub ObjectDataSource1_Inserted(ByVal sender As Object, ByVal e As System.Web.UI.WebControls.ObjectDataSourceStatusEventArgs) Handles ObjectDataSource1.Inserted
    ' The value returned by the underlying BLL Insert method
    Dim returnValue As Int32 = Convert.ToInt32(e.ReturnValue)
    ' Do what you need to do with the ID here
    Session.Add("CustomerID", returnValue)
End Sub

When your ObjectDataSource fires its Insert method, the value that the underlying BLL method returns is available for you to use in the above method. You can then set a session variable with this ID or do something else with it.

Just make sure that your BLL method actually returns the new ID (typically the INSERT stored procedure ends with SELECT SCOPE_IDENTITY(), and the BLL passes that value back), or this won't work.

Job done!

Wednesday 23 December 2009

ASP.NET: Uploading Documents to SQL Server and Retrieving them

I've just completed a new function within one of my applications that allows users to upload binary files (such as Word, Excel, PDF, PowerPoint and images) and store them within SQL Server itself, rather than loading the files onto the file system of the web server. This reduces the complexity of the system and means that a backup will take care of all user data, including their documents.

There are several steps in getting it set up, none of which are particularly complicated but the documentation I found online was a bit sketchy at best.

Aside from the database components, the whole lot was done in Visual Studio Express 2008 with Visual Basic.

SQL Server Database Layer
The requirement was to be able to upload documents and associate them with individual customers. In order for the browser to know what type of file is being returned, you have to store the MIME type as well as the other bits and bobs:

CREATE TABLE [dbo].[Client_Document](
    [Client_Document_ID] [int] IDENTITY(1,1) NOT NULL,
    [Filename] [varchar](128) NOT NULL,
    [Description] [varchar](2048) NULL,
    [Binary_Document] [varbinary](MAX) NOT NULL,
    [MIME_Type] [varchar](50) NULL,
    [Upload_Date] [datetime] NULL,
    [Uploaded_By] [varchar](255) NULL,
    [User_ID] [varchar](7) NULL,
    [Lob_ID] [numeric](38,0) NOT NULL
)

The actual file is in the column [Binary_Document] [varbinary](MAX). Previously, to store files in the database itself, you were encouraged to use the old 'Image' data type, but since SQL Server 2005 this has been deprecated. Instead, use varbinary(MAX), which automatically scales depending on the size of the file inserted (up to 2 GB; you may want to check the file size in code to keep it to a sensible level).

Then you just need the usual CRUD stored procedures to INSERT, UPDATE and DELETE.

Data Access Layer (DAL)
First, add your TableAdapter to your XSD file, then add your methods which tie up to your stored procedures. Check that the data type of the Binary_Document column has been set to System.Byte().

Business Logic Layer (BLL)
Create a BLL for the Client Document which defines an adapter based on the Client_Document datatable. Add your BLL methods which map to the methods defined in your DAL. You can add some validation in here or just call the DAL methods directly, passing the arguments which have come from the calling ASP page.

Presentation Layer
On my ASP page, I have a GridView which displays the current documents associated with the selected customer, as well as a FormView which allows new documents to be added and existing document details to be updated (or the file to be overwritten with a new version).

Passing the documents into the database and retrieving them is pretty much the same as anything else you've done, except that you need to capture the binary stream of the selected file and pass it into the database. The file selector control comes for free from the .NET framework:

<asp:FileUpload ID="FileUploadDocument" runat="server" Width="320px" />

Then, when you call your methods to save or update you can check the attributes of the selected file, extract its filename, size and MIME type, and save them into the database using your BLL methods, e.g:

' If a file has been selected, load it into the database with its details
If FileUploadDocument.HasFile Then
    ' Size the buffer to the stream length (VB array bounds are inclusive, hence the -1)
    Dim fileBytes(FileUploadDocument.PostedFile.InputStream.Length - 1) As Byte
    FileUploadDocument.PostedFile.InputStream.Read(fileBytes, 0, fileBytes.Length)

    ' Display the uploaded file's details
    UploadDetails.Text = String.Format( _
        "Uploaded file: {0}<br />" & _
        "File size (in bytes): {1:N0}<br />" & _
        "Content-type: {2}", _
        FileUploadDocument.FileName, _
        FileUploadDocument.FileBytes.Length, _
        FileUploadDocument.PostedFile.ContentType)

    ' Insert the document and its details via the BLL
    clientDocumentInfo.AddClientDocument(cloverClientId, FileUploadDocument.FileName, strDescription, fileBytes, FileUploadDocument.PostedFile.ContentType, strAddedBy, strAddedBySid, Session("lobId").ToString)
End If


Make sure that you not only size the byte array (the Dim line), but actually call the Read method of the InputStream to copy the data into it (I forgot to do this and got files in the database which were the right size but completely blank!). Note also that VB array bounds are inclusive, so declare the buffer with the stream length minus one, or you'll store one extra null byte per file.

UploadDetails is just a label which displays the attributes of the file which has been loaded (size, MIME etc).

Getting the files back is just a case of retrieving the file from the database using the stored proc, then setting the Response to match the MIME type and writing out the file's binary stream.

In this case, clicking on the filename in the GridView fires the below code and returns the file to the browser:

If e.CommandName.CompareTo("GetClientDocument") = 0 Then
    ' Since we are not using the Select command, we have to get the selected Client Document ID from the command arguments.
    Dim btn As LinkButton = DirectCast(e.CommandSource, LinkButton)
    Dim gvr As GridViewRow = DirectCast(btn.NamingContainer, GridViewRow)
    ' Get the selected document ID
    Dim clientDocumentId As Integer = Convert.ToInt32(GridViewClientDocuments.DataKeys(gvr.RowIndex)("Client_Document_ID"))
    ' Invoke the ClientDocumentBLL.GetClientDocumentById method to get the file from the database.
    Dim clientDocumentInfo As New ClientDocumentBLL()
    Dim clientDocuments As cleartrak.cleartrak.Client_DocumentDataTable
    Dim clientDocument As cleartrak.cleartrak.Client_DocumentRow

    ' Get a table with the appropriate row in it
    clientDocuments = clientDocumentInfo.GetClientDocumentById(clientDocumentId)
    clientDocument = clientDocuments(0)
    ' Set the response to the MIME type of the document and its binary contents.
    Response.ContentType = clientDocument.MIME_Type
    Response.BinaryWrite(clientDocument.Binary_Document)
End If


That's it - the files are now saved in the database and can be retrieved by a single click.

Friday 18 December 2009

Dell Studio 17: Resuming itself from Hibernate

I use my laptop for development, so I often have ten to fifteen programs open at once, each with various documents open within them. That, combined with the fact that development tools can often be heavy on CPU and memory means that restarting my PC is a pain in the neck.

So rather than restart or leave it on and waste power, I like to put it into Hibernate mode where it will quickly resume where I left off - no need to reload everything.

The problem was that if I left it overnight, I'd often come downstairs the next morning to find the Windows welcome screen staring back at me - the laptop had woken itself from Hibernate. This is not ideal, because it might cause the machine to overheat, and it's a monumental waste of power.

I could not figure out why this was happening so contacted Dell support to see what they could do. They were very keen to help, but didn't offer many useful suggestions. They suggested I flash the BIOS, which I did. No luck.

Then I devoted some time to googling the issue and discovered the problem.

The setting I had selected in Windows Update appeared to be the sensible option, for Windows to 'Automatically download and install updates for me'. Given the swiss cheese nature of Windows security, I thought this would afford me the best protection.

But it turns out that with this setting chosen, Windows Update was waking the system from Hibernate to look for updates. How stupid is that? If I wanted it to do that, I would have deliberately selected such a setting, had one been available. And surely if you did want it to resume to download updates, you would want it to go back into Hibernate automatically afterwards? But oh no.

There was no indication in the Windows Update settings that the system would resume from Hibernate.

Anyway, I now have it set to download updates automatically, but instead of installing them automatically it asks me to select which ones to install. In exchange for being able to hibernate my machine, I have to remember to install the available updates periodically once they have downloaded.

Not the best, Microsoft, not the best.

Review: Dell Studio 17

Being an I.T. boy, if you believe in stereotypes you might think I'd have loads of computers everywhere, whirring away, lights flashing. But since my PC blew up last year, I've only had my work laptop to use.

So the time came when I got sick of typing my password into the encryption program, then waiting five hours for it to boot up and then to get logged into the laptop, only to forget why I switched it on in the first place. Grudgingly at first, I decided that now was the time to get a new computer all of my own.

It took me ages to choose a laptop. I wanted a fairly high spec machine, since I'll be using it for software development as well as some image editing, and I didn't want to have to replace it after just a year. Plus, if I decide to buy a PC game or two, I'd like it to be able to run them (even at lower resolutions).

After hours poring over which machine to choose, it came down to a short list between the Sony Vaio VGN-AW11M/H, the HP HDX18 and the Dell Studio 17.


I was tempted by the Sony, but the Vaio brand comes at a premium and the reviews said it was too slow to be a proper desktop replacement machine (plus, it's just a big black slab of a thing). The HP machine was very nice to look at and very quick, but too expensive (a rip off, actually), and in the end I decided an 18 inch screen was a little too large anyway. So I spec'd up my machine at Dell:

CPU: Intel Core 2 Duo P8400 (2.26GHz, 1066MHz FSB, 3MB cache)
Display: 17in Widescreen WUXGA with Truelife - CCFL (1920x1200)
Memory: 4096MB (2x2048) 800MHz DDR2 Dual Channel
Storage: 500GB Serial ATA (5,400RPM) Dual Hard Drive (2x 250GB)
Graphics: 256 MB ATI Mobility RADEON HD 3650
Optical Drive: Slot loading Blu-ray Disc (DVD+/-RW + BD-ROM) Drive
Network: WiFi (802.11 a/b/g/n), Bluetooth, built in Vodafone mobile broadband
Other: Backlit keyboard, 2.0 megapixel webcam

I plumped for the spec above, and managed to get 16% off.

The spec is very similar to that of the HP HDX18, but cost me about £200 less. When it arrived on Friday, I was immediately impressed by it. To be honest, I wasn't expecting to like the appearance of the machine too much since Dell's previous offerings have been as dull as a day locked in a room with John Major. But it's nice. The very solid build quality, the nice plastics, the backlit keyboard, a glossy screen and a decent pattern on the lid come together to form a handsome machine.

The first thing I did was turn the dreaded User Account Control off, and delete all the crap which Dell preinstalled on the machine. And so far, I'm more than happy with the performance.

Most of the other Vista machines I've seen (none have been particularly high spec, to be fair) have shown signs of poor performance when loading up the Windows Sidebar, and some of the property screens in Control Panel, among other things. But this machine doesn't do that - it loads pretty much everything nice and quickly, fast enough to avoid an irritating wait for windows to render. One benchmark test I had concocted was to see how fast it could convert a 70x40cm GIMP XCF image I'd been working on from RGB to Grayscale. On my work machine (Dell D610, 1.8GHz CPU, 2GB RAM), it took about twenty seconds, but on the Studio 17 it took about half that.

I paid a bit more and got the upgraded screen (the 1920x1200 WUXGA with Truelife one), and I was not disappointed. It's incredible. Being an extremely high resolution panel, everything on the screen is small, so you can fit a lot on it. Some people hate that, but I like it since I have loads of windows open at once. When playing a video or looking at photos, it really comes into its own, and if you were considering a Studio 17, I'd upgrade the screen every day of the week. If you don't like the small text of the high resolution screens, you can increase the DPI settings to make everything a bit bigger (but this kind of defeats the purpose of having the high res panel!).

The battery life is alright, but I didn't expect much from a machine of this size. I think I'm getting about an hour and a half from the battery, which is acceptable to me since it'll be plugged in most of the time anyway.

It's heavy though. Heavier than Michelle McManus leaving McDonald's on a JCB. So it's not really all that portable, but I think if you get a decent messenger bag style case, you'd be able to carry it about without any problems. The weight doesn't really bother me since I don't intend to travel around with my laptop much, just the occasional trip and maybe the odd client visit.

My only gripe with it is that there's too much light leakage from underneath the keys; it gets a bit irritating when you're working on the laptop while it's on a desk. But the backlit keyboard is still worth the extra cash.

So in summary, it's a fast, well-built machine with a stunning screen and nice features. Even with my wallet several hundred pounds lighter, I'd recommend it to others.

Monday 7 December 2009

Debugging CSS in Internet Explorer: Firebug Lite

Firefox is absolutely brilliant for checking your CSS styles and layout when you've installed Firebug.

It allows you to see where styles have come from in your CSS, and to switch them on or off, or adjust them, to see what changes you need to make for your site to sit nicely.

But when trying to figure out why things look the way they do in the abysmal Internet Explorer, there isn't really an equivalent which is as good. You can try the IE Developer Toolbar which is ok, but like the browser it supports it's very clunky and pretty annoying.

I'd been waiting for a Firebug equivalent for IE for ages, and then I discovered that there is a version of Firebug itself which is designed for IE, Safari and Opera: Firebug Lite.

All you do is stick a reference to a small remote JavaScript file in your page, load it up, and you have most of the functions of the full Firefox version of Firebug!

Nice.

Friday 27 November 2009

ASP.NET: Validate email address input using a Regular Expression

A nice way to validate an email address which has been entered, using a Regular Expression and absolutely no additional code:

Add a textbox for the input:
<asp:TextBox ID="TextBoxEmail" runat="server" Text='<%# Bind("Email") %>' Width="200px" MaxLength="255"></asp:TextBox>

Then, add a RegularExpressionValidator:
<asp:RegularExpressionValidator ID="regexpName" runat="server"
ErrorMessage="Enter a valid email address" ControlToValidate="TextBoxEmail" ValidationExpression="^([0-9a-zA-Z]([-\.\w]*[0-9a-zA-Z])*@([0-9a-zA-Z][-\w]*[0-9a-zA-Z]\.)+[a-zA-Z]{2,9})$" />


Job done - no code, no fuss. It's pretty robust, too.
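If you're curious about what the expression will and won't accept, you can exercise the same pattern directly, for example in JavaScript, before wiring up the validator (just a quick sanity check; the helper function below is mine, not part of the validator):

```javascript
// The ValidationExpression from the markup above, as a JavaScript regex literal
var emailPattern = /^([0-9a-zA-Z]([-\.\w]*[0-9a-zA-Z])*@([0-9a-zA-Z][-\w]*[0-9a-zA-Z]\.)+[a-zA-Z]{2,9})$/;

function isValidEmail(address) {
    return emailPattern.test(address);
}

console.log(isValidEmail("john.smith@example.com")); // true
console.log(isValidEmail("not-an-email"));           // false
```

One quirk worth knowing: the pattern requires at least two characters in each domain label, so an address like a@b.com is rejected.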

Thursday 26 November 2009

ASP.NET: Setting the maximum number of characters in a multiline text box

Bizarrely, ASP.NET includes an out of the box setting which allows you to set the maximum number of characters on an ordinary text box, but not on a multiline text box.

Now there may be a good reason for this, and if there is I would love to be enlightened.

There is no less need for a multiline textbox to be properly validated than any other text box. Perhaps it is even more important, since the user might have spent ages typing something in, only to be rebuked for exceeding the maximum number of characters and punished by losing what they'd typed.

A friendly character on the ASP.NET forums offered this simple JavaScript solution:

<script type="text/javascript">
function Count(text, maxLength)
{
   // Truncate the input if it exceeds the maximum length
   var maxlength = new Number(maxLength);

   if (text.value.length > maxlength) {
      text.value = text.value.substring(0, maxlength);
      alert("Only " + maxLength + " characters are allowed");
   }
}
</script>


Then just add these to the multiline textbox's markup: onKeyUp="Count(this,200)" onChange="Count(this,200)".
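If you also want to tell the user how many characters they have left, rather than just truncating silently, the same idea extends easily. A sketch (this variant and its name are mine, not part of the forum solution):

```javascript
// Truncate the box's value at "max" characters and return how many remain.
// "box" is any object with a "value" property (e.g. the textarea element).
function enforceMaxLength(box, max) {
    if (box.value.length > max) {
        box.value = box.value.substring(0, max);
    }
    return max - box.value.length; // characters still available
}
```

You could call it from the same onKeyUp handler and write the return value into a label next to the textbox.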

Wednesday 25 November 2009

ASP.NET: Adding Error Handling With Email Notifications

If you're developing a new application, the temptation is often to get it completed and deployed in the shortest time possible. And sometimes that may come at the expense of proper error handling. So with the best will in the world, sometimes problems do occur in Production and the users see an error.

And with no error handling in your application, that can be a pretty poor experience for the user. It can make you look a bit unprofessional, and it can potentially expose your application to security issues if your settings allow the user to see the exception details 'Yellow Screen of Death'. So after I completed a deployment of a major release, I decided it was time to implement some error handling and to enable my application to automatically email me with details of any errors which occurred in Production.

One of the things I think is brilliant about .NET is the documentation. It's terrific. There are tutorials on every subject, and error handling within ASP.NET is very well covered. So basically, the steps to implement it are as follows:

1. Add a custom error page with the appropriate look and feel
Rather than me plagiarise the tutorials on the official ASP.NET web site, here's a link to a tutorial which explains the concepts involved and will get you up and running with a custom error page which, depending on the type of error, shows a nice message to your users, rather than the Yellow Screen of Death (YSOD):



2. Add a Global.asax file to enable application-wide error handling
If you followed the tutorial above, when an error occurs the user sees a nicely formatted error message which is in keeping with the rest of your application. But nothing is really done about the error. It's equally important that you are notified of the error, and that you can investigate it with a view to resolving it.

So the next step is to enable your ASP.NET application to catch any errors, application wide. This is done by adding a Global.asax file which has event handlers for various different application wide events, including trapping errors. The next tutorial from the ASP.NET website explains the concepts involved and helps you to get an error handler set up. It also shows you how to automatically notify you via email when an error occurs.

Remember, when pasting any code from the example into your project, that you need to import System.Net.Mail at the top of your Global.asax file in order to reference the appropriate classes for sending mail. You also need to add the appropriate mail server settings to your web.config file (you might need to specify a port number, username and password):

<system.net>
  <mailSettings>
    <smtp>
      <network host="mailhost.*****.net" />
    </smtp>
  </mailSettings>
</system.net>

So now if you've done things by the book, you should be set up to be automatically notified of the details of any errors which occur, and the users should see a nice friendly screen rather than the horrible YSOD.

Finally, I'm a big fan of using XML for configuration settings where possible, rather than hard-coding them into the application. So you can add some keys to your web.config file which contain the settings you want to use for your email notifications. You could add something like this to appSettings inside your web.config:

<!-- Settings used for email error notifications -->
<add key="ErrorToAddress" value="support@example.com"/>
<add key="ErrorFromAddress" value="aspApp@example.com"/>
<add key="ErrorEmailSubject" value="aspApp: An error has been logged"/>

Then, in your Global.asax file, replace the hard coded values with a call to extract these settings from your web.config:

' Get the appropriate values from the web.config file
Dim ToAddress As String = ConfigurationManager.AppSettings("ErrorToAddress")
Dim FromAddress As String = ConfigurationManager.AppSettings("ErrorFromAddress")
Dim Subject As String = ConfigurationManager.AppSettings("ErrorEmailSubject")

This all worked for me, and now my application is more robust, the users don't see any horrible error messages, and if any problems occur in Production I'll know about them pretty quickly.

Hopefully, this might help you to consolidate the necessary parts of what's required to implement some error handling within your application too. But of course, this is no replacement for using proper try-catch error handling where appropriate.

Friday 20 November 2009

After Upgrading to .NET Framework 3.5: ASP.NET Reconfiguration and Reporting Services 403 error

Due to the issues I had with the .NET Framework and SQL Server Integration Services, I had to roll back the version to 3.5 instead of 3.5 SP1.

When the upgrade from 3.0 to 3.5 was completed, I had to do a few things to get IIS working again (I was getting errors when attempting to browse to my ASP.NET web sites, and a 403 forbidden error when browsing to a Reporting Services page):

- Re-register ASP.NET (aspnet_regiis -i from the .NET Framework directory).
- Enable ASP.NET in the Web Service Extensions section of IIS Manager.
- Check that each application was set to use the correct version of ASP.NET.
- Check that each application was still set to the correct Directory Security settings.

At this point, the ASP.NET web sites started working again but Reporting Services was still giving me a 403 forbidden error.

To get this resolved, I had to delete the ReportServer Virtual Directory in IIS, and then recreate it using the Reporting Services Configuration manager.

All working again!

I Lost My iPhone!

Devastated.

I was at the Kasabian concert at the SECC last week (which was absolutely amazing). At the last song which of course was LSF, I got a text saying that the wife was waiting outside for me to give me a lift.

I took the phone out to text her back since otherwise she would have no idea how long she would be waiting, and someone knocked my arm hard. My much prized iPhone was launched into the blackness of the undulating crowd in front at what seemed like warp speed.

Immediately I went after it and created a space in the crowd with my flapping arms and panicked expressions. A few people around me noticed that I was looking for my phone and joined in the search, but it was never looking good.

The concert finished and after a few more minutes of scanning the floor for a stricken and stamped on iPhone, we were ushered out of the arena by the SECC staff. I reported it to the Control Room at the SECC, the police and O2.

Over the few days that followed, I realised how much I have grown to depend on my iPhone. Not only for the phone calls and text messages, but for the instant access to my email, Facebook and being able to find out the answer to just about any question I could think of just by firing up Safari. It's hugely sad, I know.

Thankfully, the phone is insured via my Lloyds TSB bank account. They have a deal with Lifestyle Services Group, and I've been using them to insure my phones for years. In the past when I had to make a claim, I'd just phone them up, tell them what happened, pay the excess and a new phone would be with me a couple of days later.

Not this time. What a palaver.

Everything is now done via mail and fax, which seems like a step back to me. I had to prove that I had reported it to the Police, O2 and them within 48 hours (thankfully I had) and provide proof of purchase. So the claim form is now with them and I'm hoping that I will be receiving a nice new, shiny iPhone sometime soon. If not, I'm gonna be in trouble.

The lesson for me? Don't take it to concerts, sign up to MobileMe (if I had done this, I might've been able to find the phone via the GPS tracker if it wasn't broken) and keep backing everything up.

Though I do have to admit, I have quite enjoyed using the ancient backup phone which has the ability to make and receive calls, and to send and receive texts - and that's it. It's a simpler way to be, no personal data to worry about, no time spent on checking Facebook or emails, and no notion of entertainment on the go.

That said, give me an iPhone any day. Please, Lifestyle Services Group, give me an iPhone.

Existing SSIS Packages Generating New Errors (.NET Framework)

I had been using an SSIS package to load in data from a remote database using the Sybase Adaptive Server Enterprise driver on my own machine and on two 64 bit servers.

The load was working fine, but then it suddenly stopped working, complaining of error code 0x80131937:

* Error 0xc0209029: Data Flow Task: The "component "Source - Query" (1)" failed because error code 0x80131937 occurred, and the error row disposition on "output column "WhseCode" (12)" specifies failure on error. An error occurred on the specified object of the specified component.
(SQL Server Import and Export Wizard)


But the weird thing was that the exact same DTSX package stopped working on my PC and on only one of the two servers - it kept working on one of them.

So I tried everything:

- Stopped the package from failing on error to see what rows would be loaded into the database: every row loaded, but every column was set to NULL.
- Rolled the Sybase driver back to ASE ODBC 12.5: this actually worked, but I could not roll the drivers back on the server machines, so I had to find another answer.
- Tried several different query formats, but it failed every time.

Then, I stumbled across this thread and the good advice on the .NET framework versions proved to be the answer.

On my local workstation machine and the server which stopped working, I had .NET 3.5 SP1 installed. But on the server which was still working, the .NET framework was 3.5 without SP1.

Basically, I removed all versions of the .NET Framework, reapplied .NET 2.0, 2.0 SP1, 3.0 and 3.5 in order and tested the package between each installation.

Now it's working and the data is loading in just as it should.

I can't believe that .NET 3.5 SP1 would just break ADO like this. I don't have a problem with compatibility issues between versions of .NET, but some explanation or notification would be useful.

Oh well, the things you learn.

Tuesday 17 November 2009

The Horrors of Internet Explorer 6: Detecting a Visitor's Web Browser

I tend to keep up with updates to browsers, so I left Internet Explorer 6 behind in a cloud of dust a long time ago. It's horrible, and craps all over CSS in a way which beggars belief.

So much so, that several online campaigns have been set up to try to get rid of Internet Explorer 6. After all, it is eight-year-old technology - and a LOT has changed since its release in August 2001.

So when I discovered that my own website looks absolutely horrible through IE6, I realised that I had to accommodate this somehow.

The market share of IE6 has been dropping steadily throughout 2009, and at the time of writing, accounts for roughly ten percent of browsers on the web.

Rather than try to accommodate such a terrible piece of browser technology, I have decided just to display a message to IE6 users telling them that the site doesn't work in their browser and that maybe it's time for an upgrade.

Now I realise that a lot of people would object to this, shouting claims of lost custom due to IE6 users just closing the site, and that not everyone can upgrade from IE6 (standardised corporate PC builds etc).

But I am willing to sacrifice that potential ten percent of visitors to save myself from a complete site redesign which would be out of date in a year when the last IE6 users finally upgrade to something a bit more robust. I also think that some of those ten percent would have another browser on their system, which they may just use to access sites which don't show up properly in IE6 (there must be a few).

Anyway, the point is that there is a really simple way to detect which version of IE is being used, and display custom messages to the user or to show/hide certain parts of a web page. This excellent blog post highlights how it can be done in detail, but basically detecting IE6 can be done using the following simple conditional comment (which is only parsed by Internet Explorer browsers):

<!--[if IE 6]>
Special instructions for IE 6 here
<![endif]-->

Now, a custom message can be displayed and problematic sections of the site can be hidden for IE6 users.
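For example, the warning block itself might look something like this (the wording and class name are just placeholders, not from my actual site):

```html
<!--[if IE 6]>
<div class="browser-warning">
    Sorry - this site doesn't display correctly in Internet Explorer 6.
    Please consider upgrading to a newer browser.
</div>
<![endif]-->
```

Everything between the conditional comments is ignored by all browsers except IE (here, IE6 specifically), so other visitors never see the message.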


Simples!

Monday 9 November 2009

Firefox: This address uses a network port which is normally used for purposes other than Web browsing.

When browsing to some sites using Firefox, I got the message:

"This address uses a network port which is normally used for purposes other than Web browsing. Firefox has cancelled the request for your protection"

When you see this message, Firefox completely blocks you from accessing the site you have requested. Since people often need to access non-standard ports for development purposes, this is stupid - the kind of 'nanny' behaviour you would expect from Microsoft, not a company who are normally very 'developer friendly'.

There is of course, a way around it:

1. Open Firefox
2. Type about:config in the address bar and press Enter
3. Right-click anywhere on the page
4. Click New > String
5. Enter the preference name: network.security.ports.banned.override
6. Enter the string value: 2049 (or whatever port number you want; multiple ports can be comma-separated)

When you attempt to visit the site again, Firefox will allow it since you have added an exception for the selected port.

Friday 23 October 2009

IIS, ASP.NET: Could not load file or assembly 'System.Core, Version=3.5.0.0, Culture=neutral' or one of its dependencies

If you receive this error when trying to access a newly deployed application on IIS:

"Could not load file or assembly 'System.Core, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' or one of its dependencies. The system cannot find the file specified"

Then check that you have the .NET framework version 3.5 installed. This was the LAST thing I checked after about a million other things - I wish I had checked that first!

I'll know for the next time though!

IIS: "Page cannot be found" error when browsing aspx pages in Windows Server 2003 with IIS 6.0

I was configuring a new application, and had re-registered the .NET framework.

I kept getting a page cannot be found message when I tried to browse aspx pages on Windows Server 2003 with IIS version 6.0 and .NET framework 3.5.

Apparently it's because in Windows Server 2003, all the web service extensions are "Prohibited" by default to ensure security. These settings were restored to default when I re-registered the .NET framework.

To resolve this, I did the following:

1. Opened IIS Manager (from a command shell, type inetmgr and press Enter).
2. Expanded the appropriate nodes in IIS to locate the "Web Service Extensions" node.
3. Found the list of "Prohibited" extensions on the right.
4. Clicked on ASP.NET and set it to "Allowed".

Worked! Well, I was now back to where I was before I started mucking about with .NET.

This article applies to Windows Server 2003, IIS 6.0 environment.

Microsoft Outlook: Attachment field does not appear in header

Being a heavy user of Outlook 2007, I found it really annoying when writing an email and adding an attachment that sometimes it would appear in the body of the email, rather than where it should be: on the attachments bar at the top of the message.


I thought it was a bug so I kept restarting Outlook but it had no effect. When I created a new message and added an attachment, it would behave the way I wanted.

Then I figured out that it's something really straightforward. If the message is in Rich Text Format the attachments will appear in the body of the message (which looks messy), but if you change it back to HTML or Plain Text format they will go back to the attachment bar in the header - where they should be!

Simples!

Thursday 22 October 2009

Simple Auto Refresh of ASP.NET Page

I have a 'log viewer' page in one of my projects which I need to refresh every ten seconds.

There are lots of ways to achieve this using JavaScript or nasty code-behind bodge jobs, but there's a much easier way that only requires one line of code.

<HEAD>
<title>Auto-Refresh</title>
<meta content="Microsoft Visual Studio .NET 7.1" name="GENERATOR">
<meta content="C#" name="CODE_LANGUAGE">
<!--The next line of code will refresh the page every 10 seconds-->
<meta http-equiv="refresh" content="10">
<meta content="JavaScript" name="vs_defaultClientScript">
<meta content="http://schemas.microsoft.com/intellisense/ie5"
name="vs_targetSchema">
</HEAD>

The content attribute specifies the refresh interval in seconds.
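As an aside, the content attribute can also take a URL if you want the refresh to send the browser somewhere else rather than reload the same page; the page name here is just an example:

```html
<meta http-equiv="refresh" content="10;url=logviewer.aspx">
```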

Tuesday 20 October 2009

Postbacks Not Working in ASP.NET

I was deploying an application to a Development server, and noticed that when I tested anything which caused a postback, the page would simply refresh and the postback would not work.

When accessing the site on the server itself using the localhost address, it worked fine.

Bizarre.

Then I checked if it was working in Firefox, and lo and behold - it was!

Even more bizarre.

But apparently, it's something to do with the validation scripts, which are specific to IE. ASP.NET detects the browser and, if it's IE, sends down specific JavaScript to handle validation on postback.

The solution? Copying the aspnet_client folder into the root of our IIS installation's Default Web Site and restarting IIS. Once I closed my IE session and accessed the site again, the postbacks started working.

Friday 16 October 2009

SQL Server Agent: Error When Running a DTSX Package to load Excel

When deploying a project onto a 64 bit Windows Server machine, I noticed an issue with scheduling SQL Server Agent to run a DTSX package to load an Excel file. The error is a bit like this:

An OLE DB record is available. Source: "Microsoft OLE DB Service Components" Hresult: 0x80040154 Description: "Class not registered"

I could run the package directly using SQL Server Management Studio without any problems, so it's obvious that the issue is being introduced by SQL Server Agent. After rooting around on the net looking for the answer, it seems that the agent is trying to run the package in 64 bit mode.

There are no 64 bit Excel drivers, so the package fails. For information on the impact that 64 bit architecture has on SSIS, refer to this blog post by Deniz Arkan, the SQL Server Integration Services Program Manager at Microsoft, and this Microsoft article on the subject.

Many forum posts suggest that the package should be run via the Agent using the 32 bit version of DTEXEC rather than the 64 bit version.

By default, a 64-bit computer that has both the 64-bit and 32-bit versions of an Integration Services command prompt utility installed will run the 32-bit version at the command prompt. The 32-bit version runs because the directory path for the 32-bit version appears in the PATH environment variable before the directory path for the 64-bit version. (Typically, the 32-bit directory path is :\Program Files(x86)\Microsoft SQL Server\90\DTS\Binn, while the 64-bit directory path is :\Program Files\Microsoft SQL Server\90\DTS\Binn.)

So if you have both 32 bit and 64 bit versions installed, you have to explicitly point to the 32 bit version of DTEXEC on the command line.
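On SQL Server 2005 that typically means something like the following from a command shell or an Operating System (CmdExec) job step. The package path is hypothetical, and the Program Files paths are just the defaults quoted above:

```
"C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\DTExec.exe" /F "D:\Packages\LoadExcelFile.dtsx"
```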

Alternatively, to run a package in 32-bit mode from a 64-bit version of SQL Server Agent, select Use 32 bit runtime on the Execution options tab of the New Job Step dialog box. This depends on your having the 32 bit runtime tools installed.

For a step by step description of how to run a package from an Agent job in 32 bit, refer to this Microsoft KB article.

But after all this reading, I bottled it. I can't be bothered having to get the 32 bit version of the runtime installed on the Dev, UAT and Production boxes so I decided to get the files I needed converted into a flat file format. This won't suit everyone as not everybody can get their input files changed to a different format, but for me it's the best route.

Anyway, clearly I am by no means an expert on this subject, but I reckon that Microsoft are trying to discourage people from using 32 bit architecture on 64 bit machines by making this kind of thing difficult to implement, and by not providing 64 bit drivers for their Office products.

Thursday 8 October 2009

Locating Duplicates in Excel

If you work with large Excel files regularly, it can be a pain in the neck to find duplicates in a column of data.

You could export the file to a text file and load it into a database, and use a query to find the duplicates - but that's probably the sledgehammer approach.

The problem is that Excel doesn't appear to offer a straightforward way of identifying duplicates, but it can be done using Conditional Formatting.

Rather than reinvent the wheel and write a detailed how-to here, it's a good idea to link to the definitive authority on the subject: Microsoft. So here's a link to an article which shows you how to locate duplicates in a matter of seconds. Hope it's useful.

Wednesday 7 October 2009

ASP.NET: "It is an error to use a section registered as allowDefinition='MachineToApplication' beyond application level"

You may get this error when trying to browse an asp.net application.

The debug information shows that "This error can be caused by a virtual directory not being configured as an application in IIS."

When you create a new web application using Visual Studio .NET, it automatically creates the virtual directory and configures it as an application. However, if you manually create the virtual directory and it is not configured as an application, you will not be able to browse the application and may get the above error. The debug information mentioned above applies to this scenario.

To resolve it, right-click on the virtual directory, select Properties, then click "Create" next to the "Application" label and textbox. It will automatically create the application using the virtual directory's name. Now the application can be accessed.

This solution 'borrowed' from this blog.

IIS 5.1: "MMC has detected an error in a snap-in"

When using IIS Manager on a Windows Server Enterprise Edition box, the error message "MMC has detected an error in a snap-in. It is recommended that you shut down and restart MMC." sometimes appears.

Googled around for a bit, and this seems like a bug that Microsoft know about, but don't care about. Odd! Suppose that's nothing new though:

http://support.microsoft.com/kb/915519

Thursday 3 September 2009

Visual Web Developer 2008: Failed to load dataset because of the following error: Unable to find connection

Somehow, I ended up with two connection strings in my ASP.NET project.

I don't know where the second one came from (I suppose it could've been user error), but VSWD called it connectionString1 and without my permission or instruction, modified a whole load of other files in my project to suit. So I just took the sledgehammer approach and removed the second connection string and did a project-wide find and replace on all occurrences of connectionString1.

I thought I was ok until I got to the point where I needed to make some changes to the DAL of the application, and the following error appeared:


The error message isn't exactly self-explanatory, and VSWD wasn't much help either.

So I started panicking, sweat coming from my brow at the prospect of having to spend hours figuring out the source of this problem, feeling stupid because I felt I had caused it by messing about with settings I didn't understand.

I could open the DAL .xsd file in Source mode, but could not open it in Design mode. Hmm... seems to be some issue with all of the stuff VSWD does in the background to render the visual portrayal of your DAL...

Anyway, lo and behold the answer came nice and easily. In my DAL's .xsd file, I had two references to exactly the same connection string. So when trying to load it up it got confused, spat the dummy and didn't bother telling me what the problem was.

So if you see this problem in your project, check to make sure that

1) You don't have multiple connection strings in your project where you don't explicitly need them.
2) You don't have duplicate references to your connection strings in your DAL's .xsd file.

I think that error can be caused by other things, but check those two things first and hopefully it'll help.

Monday 31 August 2009

ASP.NET: Server Application Unavailable

If you get this error when you try to access your ASP.NET application:

... it could be caused by a huge number of things.

It happened to me, and to get more information I went to the Event Viewer where some errors had showed up. Specifically these two errors:


and:


This page is quite useful in explaining the potential problems and gives a few ideas for resolution. I went through a lot of them but the answer ended up being very simple.

The ASPNET user account somehow became locked out, so all I had to do was go into Control Panel > Administrative Tools > Computer Management > Local Users and Groups and look at the properties for the ASPNET account.

The account was marked as locked out, so I cleared the checkbox and restarted IIS with an iisreset at the command line. Then I tried to access my application and it worked!

It was so straightforward, but I still have no idea why the account became locked out in the first place!

Thursday 27 August 2009

Apple Tablet: Will Steve's New Toy do the Job?

Many people apparently believe that Apple are about to announce a new gadget, the Apple Tablet.

Since it's such a work of art, I always thought it would be cool to take an iPod Touch and enlarge it several times over. It sounds as if the Tablet (if it actually appears) will look just like that.

I really hope that they do release such a gadget, because aside from an iPhone I've never owned a Mac. It looks like it could be the perfect first buy: portable, durable and running Mac OS X, with a user experience that beats the hell out of Windows (not that difficult).

Of course, it's going to be expensive and demand will undoubtedly reach fever pitch, so I'd better start saving!

Thursday 13 August 2009

Posting Code in Blog Posts

It's a nightmare trying to post any code in blog posts. The tags confuse Blogger (fair enough), so it won't let you post the code as-is.

I tried to install SyntaxHighlighter from Google, but it requires you to host its code on a webserver somewhere and I couldn't get it to work.

Stanley Shilov came up with a nice alternative online tool which you can paste your code into, and it will parse it and replace any characters which will confuse your blogging engine with HTML tags which it'll love. You can access the tool here, and it needs zero set up and zero hassle.

The drawback is that it doesn't offer any syntax highlighting, but it's still useful and I'll certainly be using it in future!

ASP.NET: Adding a simple Ajax Web Service with AutoCompleteExtender in Visual Studio 2008 Express

I'd been really struggling to get any kind of Ajax functionality in my ASP.NET project. The first problem was that I couldn't be sure if I had enabled the Ajax extensions in my project, so to verify it I decided to try to create a new 'ASP.NET Ajax Enabled Web Site' template in Visual Studio.

But I couldn't see that option, and wondered if I had the wrong version of the .NET framework installed. No - that was fine, I had 3.5. So it must be something else. Until I discovered that Visual Studio 2008 Express automatically adds the Ajax extensions to any new ASP.NET project you create in it. So that's good.

But when I went to add an AutoCompleteExtender to my ASPX page, I couldn't see the control in my Toolbox. This was odd. But EVENTUALLY I discovered that I was missing the AjaxControlToolkit.dll from my project's Bin folder. This comes as part of the Ajax Control Toolkit, which you can download from CodePlex:



So I downloaded the Binary package (to avoid all the extra crap that comes with it) and dropped the AjaxControlToolkit.dll into the Bin folder of my Project. Then I right-clicked on the Toolbox in VS2008 and selected Choose Items. Selecting the DLL file now meant that after a couple of seconds of chewing it over, the Ajax controls I needed appeared in the Toolbox.

Now I had to see if I could get an extremely simple Ajax control set up on my page, just to see if it was working. So I dragged a new textbox onto my page, gave it a name, and then dragged a ScriptManager control onto the page. Next, I had to add an AutoCompleteExtender control which would target the chosen text box and Ajax-enable it.

<cc1:AutoCompleteExtender ID="AutoCompleteExtender1" runat="server" MinimumPrefixLength="1" ServiceMethod="HelloWorld" ServicePath="WebService.asmx" TargetControlID="TextBoxAjax"> </cc1:AutoCompleteExtender>

So now I have an Ajax enabled control on my page, and finally I needed to set up a web service to feed it something. So I created a new web service called WebService.asmx and added the following code to it:

Imports System.Web.Services
Imports System.Web.Services.Protocols
Imports System.ComponentModel

' To allow this Web Service to be called from script, using ASP.NET AJAX, uncomment the following line.
<System.Web.Script.Services.ScriptService()> _
<System.Web.Services.WebService(Namespace:="http://tempuri.org/")> _
<System.Web.Services.WebServiceBinding(ConformsTo:=WsiProfiles.BasicProfile1_1)> _
<ToolboxItem(False)> _
Public Class WebService
Inherits System.Web.Services.WebService

<WebMethod()> _
Public Function HelloWorld(ByVal testStr As String) As String()
Dim returnString(2) As String

returnString(0) = "one"
returnString(1) = "two"
returnString(2) = "three"

Return returnString
End Function

End Class

It creates a WebService.asmx.vb file too, which actually contains the code above. So all the HelloWorld function does is return a string array to the Ajax textbox.

I built and started the app, and voila - when I type something into the box, the values 'One','Two','Three' appear below. It's not smart at all at this point, since the web service just returns those values no matter what is typed into the text box. But it's square one, and it's a good place to start.


Next step, add some proper code to the Web Service which returns the correct autocomplete values based on what's typed.
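For reference, the AutoCompleteExtender documentation suggests the service method should take a prefix string and a count, with those exact parameter names, something like the sketch below. The method name and customer names here are made up; in a real app you'd query the database, and the extender's ServiceMethod property would need to match the method name:

```vb
' Requires Imports System and Imports System.Collections.Generic at the top of the class
<WebMethod()> _
Public Function GetCustomerNames(ByVal prefixText As String, ByVal count As Integer) As String()
    ' Hypothetical data - replace with a database lookup in a real application
    Dim allNames() As String = {"Adams", "Anderson", "Baker", "Brown", "Clark"}
    Dim matches As New List(Of String)

    For Each name As String In allNames
        ' Return only values starting with what the user has typed so far
        If name.StartsWith(prefixText, StringComparison.OrdinalIgnoreCase) Then
            matches.Add(name)
            ' Stop once we've collected as many suggestions as the extender asked for
            If matches.Count >= count Then Exit For
        End If
    Next

    Return matches.ToArray()
End Function
```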

Tuesday 4 August 2009

Solution: IIS - You Are Not Authorised To View This Page

If you're setting up a new site in IIS, then occasionally (usually after you lock your workstation and log back in, or after leaving it idle for a long period) when you attempt to view a site on localhost or on another remote machine, you might see a page which says:

"You are not authorized to view this page (Error 401.1)"

If you're allowing anonymous access to your site using the IIS account, a temporary workaround is to run an iisreset at the command line. But a longer term (and less irritating) solution is to go into the Directory Security properties of your Virtual Directory in IIS configuration and uncheck the box which says "Allow IIS to control password".

Then do an iisreset and the error shouldn't appear again, unless there are deeper problems with your security set up.
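The same checkbox can reportedly be flipped from the command line via the AnonymousPasswordSync metabase property. This is a sketch assuming the default AdminScripts location and the first website (identifier 1); adjust the metabase path for your own virtual directory:

```
cscript C:\Inetpub\AdminScripts\adsutil.vbs SET w3svc/1/root/AnonymousPasswordSync false
```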

Thursday 30 July 2009

Solution: Blank Report Manager (SQL Server 2005 Reporting Services)

If, after installing Reporting Services you see a blank screen when trying to open Report Manager, you can try the below methods to get you moving.


If you have an Administrator account on the local machine:
· Enable Integrated Windows Authentication in the Reports (and ReportServer) virtual directories in IIS, and disable Anonymous Access.
· Restart IIS (iisreset on command line)
· Try to open Report Manager. If your account is recognised as an Administrator, you should see more content in Report Manager.

If you still can't see full content in the Report Manager, then you can try this:
· Add the local anonymous IIS account (IUSR_MACHINENAME) to the Administrators group for the local machine (this is just temporary, read on).
· Restart IIS (iisreset on command line)
· Try to open Report Manager - you should now see full content.
· Go to the Properties tab, and click New Role Assignment. Grant the IUSR_MACHINENAME account the appropriate privileges to access the Report Manager (be careful not to allow too much access since this means your users could change/delete reports).
· Remove the IIS account from the Administrators group for the local machine.
· Enable Anonymous Access for the Reporting Services virtual directories in IIS, and disable Integrated Windows Authentication.
· Restart IIS
· You should now be able to see the appropriate content within Report Manager.


Hope this helps! It worked for me, and I had been going mad with frustration

Wednesday 29 July 2009

SQL Server 2005 Installation: Issues and Resolution (Errors 29559 and 29515)

I was attempting to install SQL Server 2005 Developer Edition with the following components, and experienced some issues:

· Database Services
· Analysis Services
· Reporting Services
· Integration Services

I thought it might be useful to share my findings in case a Google search brings up these errors and the resolutions I found, since I eventually managed to get everything installed.

Some people might have opposing views on the steps I took, but it works now and that's the main thing.

  1. SQL Server 2005 Error 29559
    Full Error Text (From Windows Event Viewer): "Product: Microsoft SQL Server 2005 -- Error 29559. SQL Server Setup failed to modify security permissions on service MSSQLServerADHelper for user V024755. To proceed, verify that the account and domain running SQL Server Setup exist, that the account running SQL Server Setup has administrator privileges, and that exists on the destination drive."
    Background: When attempting to install SQL Server, this error appeared near the end of the installation, when it was attempting to configure Database Services. I was connected to the installation machine over Remote Desktop (mstsc). I had initially selected to install all of the desired SQL Server 2005 components, and all except Database Services installed successfully in this installation.
    Workaround: Logged onto the machine directly rather than over mstsc. Checked that the domain account was a member of the Administrators group. Removed all traces of SQL Server 2000 files from C:\Program Files\Microsoft SQL Server\80. Modified file permissions on all files and folders under that location to allow Full Control to the appropriate domain account.
    Result: The above error did not reappear and the installation proceeded past this point, though another error was encountered.
  2. SQL Server 2005 Error 29515
    Full Error Text (From Windows Event Viewer):
    "Product: Microsoft SQL Server 2005 -- Error 29515. SQL Server Setup could not connect to the database service for server configuration. The error was: [Microsoft][SQL Native Client]Encryption not supported on the client. Refer to server error logs and setup logs for more information. For details on how to view setup logs, see "How to View Setup Log Files" in SQL Server Books Online."
    Background: After setup had proceeded past the point of error 29559 above, the installer was attempting to start the database engine service when the above error occurred. The suggestion was that the problem was caused by some SQL Server 2005 components (Integration Services, Visual Studio 2005 IDE etc.) having installed successfully in the very first installation, creating some kind of conflict.
    Workaround: Removed all traces of all SQL Server (2000 and 2005) installations to begin from a 'clean slate'.
    NOTE: Everything except Microsoft SQL Server Setup Support Files - you need this to complete 2005 uninstallation. If this does not appear in Add/Remove Programs, you can reinstall it by running SqlSupport.msi, located in the installation folders.
    - Removed everything related to SQL Server which was visible in Add/Remove Programs dialog.
    - Followed uninstall steps for SQL Server 2000 from Microsoft website.
    - Followed uninstall steps for SQL Server 2005 from Microsoft website.
    - Deleted C:\Program Files\Microsoft SQL Server and all its contents. (You may experience file lock issues when attempting to delete; in this case you can use Process Explorer to identify which process has a lock on the file and kill it, at your own risk!)
    - Restarted the machine
    - Restarted the installation, to install Database Services only. This was done to minimise risk by not introducing the installation of other components at the same time.
    Result: Installation of Database Services was successful.
    - Started installation of remaining components (Analysis Services, Reporting Services, Integration Services), without Database Services.
    Result: Installation of remaining components was successful.

Conclusion
· Remove all instances of SQL Server 2000 before you do any installation of SQL Server 2005 whatsoever, do not attempt to upgrade if at all possible.
· Ensure you have administrator privileges on the machine where SQL Server is being installed.
· Install Database Services first, before any other components.
· Once installation of Database Services is successful, then install remaining components.

Tuesday 7 July 2009

Windows Remote Desktop (mstsc) can't connect: quits unexpectedly

I'm in London and I've been trying to establish a remote connection to another machine which I use for development.

I needed to access it so that I could work remotely while away, but when connecting using Remote Desktop (mstsc on the command line) I could see the log in dialog and enter my details, but as soon as I pressed 'OK' it just quit. No error messages, nothing!

I tried using the IP address, and the fully qualified hostname but no cigar. What drove me even more mad was that I previously tested that I could connect to the machine when the laptop and desktop were physically right next to each other.

But as soon as I moved away from the same location, I found I was unable to connect.

I thought it was something to do with trying to connect to it from another IP subnet, but I managed to connect remotely to a third machine which was on the same subnet, and from there I still could not connect to the target machine.

I thought it was odd that I could connect to one machine, but not the other. So I did a search of the Microsoft support website, and it turns out that the problem is a dodgy NVIDIA display driver! Apparently it doesn't like people connecting remotely, so it just decides to kill the connection with no error and no warning.

See this support article: http://support.microsoft.com/kb/886212/en-us which suggests rolling back your NVIDIA driver.

But in the short term, if you have an NVIDIA graphics card, this problem has been driving you mad and you're unable to roll the driver back, hopefully the following will help. I think the problem is down to the multiple-monitor (nView) functionality that comes with the NVIDIA driver software, so this applies if you have that enabled.

My resolution was:

- Get the machine restarted
- It *should* let you log in since there is no session to restore
- Disable the nView functionality

For me, that meant Bob was my Uncle. But it remains to be seen if I can disconnect and then reconnect to my session on the remote PC.

Friday 26 June 2009

SQL Server Job Error: Failed to decrypt protected XML node "DTS:Password" with error 0x8009000B

If you deploy some DTSX packages to SQL Server, and create an Agent job to run them, you might see this error:

Failed to decrypt protected XML node "DTS:Password" with error 0x8009000B

The workaround is probably simple. When deploying to your SQL Server, on the Specify Target SQL Server page of the wizard, select the 'Rely on server storage for encryption' option:



That should probably do it. If not, check out this thread on MSDN.

Thursday 25 June 2009

SQL Server Integration Services: Setting Input Filenames Dynamically from a SQL Server Table

Being a recent convert from SQL Server 2000, I was used to the DTS framework for importing files to the database. From recent blog posts, you will see that it's been a learning curve in getting completely up to speed with SQL Server 2005's Integration Services.

When loading files into a database, I found it important to set the file path dynamically so that values were not hard-coded; otherwise it becomes a maintenance nightmare, especially when promoting the packages to different environments.

I decided that for my application, the best way to dynamically set these properties was by creating a database table with the details on each file which was to be imported. Then I would extract the filename into the DTSX at runtime and use that to import the file. This way, the filenames would be picked up according to whatever environment the DTSX is running in, and there would be no hard-coded filenames to deal with.

So I thought I'd share a quick description of how I went about it, as much as a reminder for myself in future as for others who might stumble across this blog.

The table which contains the import file details is set up as follows:

CREATE TABLE [Import_File_Specifications](
[ImportFile] [nvarchar] (100) NOT NULL ,
[FileName] [nvarchar] (255) NULL ,
[FileExtension] [nvarchar] (30) NULL ,
[Location] [nvarchar] (255) NULL ,
CONSTRAINT [PK_Import_File_Specifications] PRIMARY KEY CLUSTERED ( [ImportFile] )
ON [PRIMARY] )
ON [PRIMARY]
GO
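A row in that table might look like this (the values are purely illustrative); note that Location, FileName and FileExtension are later concatenated to build the full path, so the Location value needs its trailing backslash:

```sql
INSERT INTO Import_File_Specifications (ImportFile, [FileName], FileExtension, Location)
VALUES ('CustomerFeed', 'customers_daily', '.csv', 'D:\Imports\')
```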


So I loaded the file specifications into there, and at runtime the DTSX would run some SQL against the table to build the filename based on the folder location, filename and file extension.

All I did was (this is assuming you're importing a flat file - tab delimited, csv etc):

1. Create a Package Variable to contain the filename of type String (right click on the Control Flow canvas and click 'Variables') .

2. Create an Execute SQL Task at the beginning of the DTSX package which flows into the rest of the process with an 'on success' connector.

3. On the General tab of the SQLStatement setting in the SQL Task, I put:

select Location + FileName + FileExtension as FileName from Import_File_Specifications where ImportFile = 'xxxx'

where 'xxxx' is the unique ImportFile key for the file you want to use. Set ResultSet to 'Single Row' and SQLSourceType to 'Direct Input'.

4. The next step is to set the Package Variable equal to the filename which comes from the SQL Task. So on the Result Set of the SQL Task, press Add and select Result Name to equal FileName (from the SQL Statement you just entered), and select the Package Variable you created at the beginning.

5. Now, when the DTSX runs the SQL Task, it should set the Package Variable to equal the filename which comes from the database. The final step is to set the ConnectionString of the file's connection manager equal to the Package Variable which contains the filename. Select the connection manager, and on its properties which appear on the right hand side, click the '...' button which appears when you click the 'Expressions' field.

6. When the Property Expressions Editor appears, Press the '...' button to add a new expression property and select 'ConnectionString' from the drop down on the left. Then choose your package variable from the drop down on the right hand side.
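Assuming the variable from step 1 is called FileName, the expression you end up with against the ConnectionString property is just a reference to the package variable, in SSIS expression syntax:

```
@[User::FileName]
```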

That's it! When you test your DTSX package, if you've set up everything correctly then the SQL Task should retrieve the filename from the database, set the Package Variable equal to it and pass that into the Connection Manager.

For an Excel file, you set the 'ExcelFilePath' at runtime rather than the 'ConnectionString' property because the ConnectionString tells the package to use an OLE connection to the file. Note that you must set your PackageVariable to have a default value otherwise the package will not compile.

I admit that this post could have done with more detail, and probably some screen shots but as I said, it's just really a note to remind me of this technique in future. Others have also found different ways of doing this which they might argue are better, but the way I have this set up it works for my application and will allow for different input filenames depending on whether the app is running in DEV, UAT or Production.

Tech Review: Apple iPhone 3GS

It had been a long time since I possessed what could be called a 'good phone'. This is because I always found myself willing to sacrifice having a decent handset for the sake of a free Xbox 360, PS3 or Wii with my phone contract.

I finally got out of contract with Orange, and was dying to get a new phone with unlimited net access because it was becoming more important that I have instant access to email on the move. The only two competitors were the Palm Pre...

...and the new iPhone.
Both have a touch screen, both have unlimited data access, both can be used as an iPod (the Pre tricks iTunes into thinking it's an iPod), both have GPS and Wi-Fi and just about everything else you can imagine (see here for a full side-by-side comparison).

But initially, when comparing the two online I couldn't help but be more impressed with the Pre. It looks very sleek, and as if a lot of thought has gone into designing both the hardware and the WebOS operating system on it. It has brilliant multitasking capabilities, lots of ways to intelligently organise your data and do cool stuff with GPS and social networking sites.

That, in addition to the fact that I've never liked the look of the iPhone, meant that for the past couple of months the Pre had been at the top of my wish list.

At the recent WWDC conference in San Francisco, Apple announced the iPhone 3GS, which is basically a pimped-out version of the iPhone 3G handset, with a few new features and a faster user experience. It appealed, but still sat second on my wish list behind the Pre.

Palm have released the Pre in the US already, but for some reason they have not released it in the UK. I think this is because they haven't completed the compatibility with our slightly different GSM networks.

In the incredibly competitive smartphone market where the Pre is undoubtedly Palm's 'do or die' handset, this is an unbelievable planning error.

There must be literally thousands of people like me, who wanted a new phone and couldn't decide between the Pre and the iPhone. These people would like to see what the Pre can do, but are not willing to hang around for months until Palm get their act together and set a release date. So I lost patience, and bought a new iPhone 3GS last week.

It was disappointing not to have the choice between the two, but if Palm want to compete with Apple, they needed to play the game.

And Palm's move should have been to release the Pre in the UK before the iPhone 3GS, or at least a matter of days or weeks after. Now of course I'm not saying that Palm will fail because I personally did not buy a Pre, but surely it's damaging to their prospective Pre sales that since launch, Apple have sold 2 million iPhone 3GS handsets in the European market alone.

Maybe they underestimated the importance of the European consumer electronics market. But they shouldn't have, because it could have made the difference between the Pre becoming their saviour, or the iPhone becoming their downfall.

The point of this post was really to review the iPhone. And I must say, I absolutely love it. It's a complete pleasure to use, and the hype is correct - the App Store is the killer. So many cool little applications to download, and a large portion of them are completely free. It's not perfect though: the email client is a little too simplistic and the battery life is horrible. But I don't regret for one second buying the iPhone, and by the time the Pre finally hits UK shores I'll be glad I didn't hang about for months with my old Nokia 6300 and chunky Orange contract in tow.

Tuesday 23 June 2009

Implementing Error Handling in SSIS

I've been trying to get to grips with SQL Server Integration Services, as opposed to the old DTS technology in SQL Server 2000. It's a task.

One of the things I've found difficult is to implement error handling, and after poring over much rubbish on the net, I found two useful blogs on the subject. Both of them have reusable code which is easily added to existing DTSX packages.

Step one is to read this blog by Jamie Thomson at SSIS Junkie.

Then, this additional blog adds a lot more detail and allows you to store errors in a database table which is very useful if your application has a lot of flat file data inputs (such as a reconciliation).

Now I just need to add error handling which will detect if the file is not there at all, and log that too!
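In case it's useful to anyone trying the same thing, the missing-file check can be done with a Script Task at the start of the control flow. Here's a minimal VB sketch of the idea - note the variable names (User::SourceFilePath, User::FileExists) are my own placeholders, not anything from the blogs linked above:

```vb
' SSIS Script Task (VB.NET): check whether the input file exists and
' set a package variable so downstream tasks can branch on the result.
' Assumes two package variables are mapped to the task:
'   User::SourceFilePath (String)  - read-only
'   User::FileExists     (Boolean) - read-write
Imports System.IO

Public Sub Main()
    Dim filePath As String = CStr(Dts.Variables("User::SourceFilePath").Value)

    If File.Exists(filePath) Then
        Dts.Variables("User::FileExists").Value = True
    Else
        Dts.Variables("User::FileExists").Value = False
        ' Raise a warning so the missing file gets picked up by the
        ' package's event handlers and logged like any other error
        Dts.Events.FireWarning(0, "File Check", _
            "Input file not found: " & filePath, String.Empty, 0)
    End If

    Dts.TaskResult = ScriptResults.Success
End Sub
```

A precedence constraint with the expression @[User::FileExists] == True can then gate the data flow, with the other branch writing a row to the error table.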

SQL Server Reporting Services and IIS: Failed to access IIS metabase - Guaranteed Fix

I spent ages trying to get SQL Server 2005 Reporting Services working, and kept getting the error "Failed to access IIS metabase" from IIS.

After trawling through documentation and Microsoft KB articles, messing about with security settings and adding Windows and domain users to the IIS metabase, I found the answer in this post, which worked perfectly and took less than a minute. If you're having a similar problem, this will solve it:

My MCP: Failed to access IIS metabase.

To summarise his post, you run this command, which re-registers ASP.NET 2.0 with IIS:

%windir%\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis.exe -i

What's the lesson? Ignore Microsoft and listen to the blogosphere.

Also: if you're just getting started with Reporting Services, this installation tutorial is the business, and this one is superb for showing you how to build and deploy reports.