# Friday, 10 May 2013

Every once in a while you find yourself on a project that doesn't have all the resources you'd like, and you start falling behind on the tooling and SDK versions being used.

In this case I'm working on a project that runs Azure SDK 1.6, and we're primarily developing the application with Visual Studio 2010 on Windows 7. The IT department has prompted an upgrade to Windows 8, and I was hoping to leapfrog several upgrades and go straight to Azure SDK 2.0 and Visual Studio 2012. Unfortunately this didn't work so well. After the long process of installing the new OS and tooling, and then attempting to upgrade the project through the breaking changes in the Azure storage SDKs, I find myself looking for a better way to test technology upgrades, one that leaves the team the flexibility to upgrade their current environments, or to rebuild them as quickly as possible, to minimize developer downtime.

Friday, 10 May 2013 09:32:10 (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
# Thursday, 07 June 2012

Have you ever wanted to load different reference data into your Visual Studio 2010 database project based on Project Configuration?

I found a great thread explaining how to do this (thank you, Kuris01!). I’ll add a couple of screenshots from my experience.

This is the UI for adding SQL Command Variables:

[Screenshot: the SQL Command Variables UI]

This is the bottom of the unloaded Database.dbproj file, where I added the ItemGroup:

[Screenshot: the ItemGroup at the bottom of the unloaded Database.dbproj]
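
For anyone who can't make out the screenshot, the ItemGroup looks roughly like this (a sketch; the variable name and values here are made up for illustration):

    <ItemGroup Condition=" '$(Configuration)' == 'Debug' ">
      <SqlCommandVariableOverride Include="ReferenceDataFolder=Debug" />
    </ItemGroup>
    <ItemGroup Condition=" '$(Configuration)' == 'Release' ">
      <SqlCommandVariableOverride Include="ReferenceDataFolder=Release" />
    </ItemGroup>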

Sure beats the heck out of what I used to do, which was to maintain a unique .sqlcmdvars file for each build configuration!
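
The post-deployment script can then consume the SQLCMD variable to pull in the right reference data, something like this (the folder and file names are hypothetical):

    :r .\ReferenceData\$(ReferenceDataFolder)\LoadReferenceData.sql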

Thursday, 07 June 2012 12:34:05 (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
# Thursday, 17 May 2012

I’ve been working on automating a build on my new project for a while now and came across an interesting problem just before I was able to get the build to succeed.

I got this error:

C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\Windows Azure Tools\1.6\Microsoft.WindowsAzure.targets (3356): Unable to copy file "ServiceDefinition.csdef" to "ServiceDefinition.build.csdef".  Access to the path 'ServiceDefinition.build.csdef' is denied. [C:\Builds\…..

[Screenshot: the ServiceDefinition.build.csdef copy error]

After digging around I found that the ServiceDefinition.build.csdef file shouldn’t have been checked into TFS, according to this thread, or that it needed to not be marked “read only”, according to this blog entry. I renamed the files to *.old in source control for the two cloud projects in my solution, and that did the trick. (I wanted to make sure this was the actual problem before deleting them.)
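
If you'd rather keep the files in source control, the other fix is to clear the read-only flag before the build runs, e.g. from a command prompt in the cloud project folder:

    attrib -r ServiceDefinition.build.csdef

Keep in mind that a TFS server workspace will mark the file read-only again on the next get unless it's checked out, so removing the file from source control is the more durable fix.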

Thursday, 17 May 2012 10:43:41 (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
# Tuesday, 08 May 2012

I came across a stored procedure that was doing the upsert (update/insert) pattern using XML as the source rather than a table-valued parameter (TVP). Since we're using Entity Framework and a TVP isn't really a viable option at this point, I decided to see if I could rewrite it as a T-SQL MERGE that works on SQL Azure.

The original looked something like this (schema and variable names changed to protect the innocent):

CREATE PROCEDURE [dbo].[UpdateOrAddOrder]
    @emailAddress NVARCHAR(255)
    ,@order XML
AS
BEGIN TRY
    DECLARE @customerId INT
    BEGIN TRAN

    IF NOT EXISTS(SELECT (1) FROM [dbo].[Customer] WHERE [emailAddress] = @emailAddress)
    BEGIN
        INSERT INTO [dbo].[Customer] VALUES(@emailAddress, GETUTCDATE())
        SELECT @customerId = SCOPE_IDENTITY()
    END
    ELSE
        SELECT @customerId = [customerId] FROM [dbo].[Customer] WHERE [emailAddress] = @emailAddress

    DECLARE @tmpOrder TABLE
        ([itemId] UNIQUEIDENTIFIER
        ,[itemName] NVARCHAR(25)
        ,[itemQuantity] INT
        ,[itemOrderDate] DATE)

    INSERT INTO @tmpOrder
    SELECT
        i.value('(Identifier)[1]', 'UNIQUEIDENTIFIER') AS 'itemId'
        ,i.value('(Name)[1]', 'VARCHAR (25)') AS 'itemName'
        ,i.value('(Quantity)[1]', 'INT') AS 'itemQuantity'
        ,i.value('(OrderDate)[1]', 'DATE') AS 'itemOrderDate'
    FROM
        @order.nodes('/Order/Item') AS o(i)

    UPDATE [dbo].[Order]
    SET
        [Order].[itemQuantity] = [t].[itemQuantity],
        [Order].[ModifiedOn] = GETUTCDATE()
    FROM
        @tmpOrder AS [t]
    WHERE
        [Order].[customerId] = @customerId AND
        [t].[itemId] = [Order].[itemId]

    INSERT INTO [dbo].[Order]([customerId], [itemId], [itemName], [itemQuantity], [itemOrderDate])
    SELECT
        @customerId
        ,[t].[itemId]
        ,[t].[itemName]
        ,[t].[itemQuantity]
        ,[t].[itemOrderDate]
    FROM
        @tmpOrder AS [t]
    WHERE [t].[itemId] NOT IN
        (SELECT [itemId] FROM [dbo].[Order] WHERE customerId = @customerId)

    COMMIT TRAN
END TRY

BEGIN CATCH
    ROLLBACK TRAN
    DECLARE @ErrMsg NVARCHAR(4000), @ErrSeverity INT
    SELECT @ErrMsg = ERROR_MESSAGE(), @ErrSeverity = ERROR_SEVERITY()
    RAISERROR(@ErrMsg, @ErrSeverity, 1)
    RETURN
END CATCH


After some digging around, I came up with this:

CREATE PROCEDURE [dbo].[UpdateOrAddOrder]
    (@emailAddress    NVARCHAR(255)
    ,@order    XML)
AS
BEGIN TRY
    BEGIN TRAN
   
        DECLARE @ErrMsg NVARCHAR (4000)
        DECLARE @ErrSeverity INT
        DECLARE @customerId INT

        IF NOT EXISTS(SELECT (1) FROM [dbo].[Customer] WHERE [emailAddress] = @emailAddress)
            BEGIN
                INSERT INTO [dbo].[Customer] VALUES(@emailAddress, GETUTCDATE())
                SELECT @customerId = SCOPE_IDENTITY()
            END
        ELSE
            SELECT @customerId = [customerId] FROM [dbo].[Customer] WHERE [emailAddress] = @emailAddress

        MERGE [dbo].[Order]    AS target
        USING
            (SELECT
                i.value('(Identifier)[1]',    'UNIQUEIDENTIFIER')    AS 'itemId'
                ,i.value('(Name)[1]',        'VARCHAR (25)')        AS 'itemName'
                ,i.value('(Quantity)[1]',    'INT')                AS 'itemQuantity'
                ,i.value('(OrderDate)[1]',    'DATE')                AS 'itemOrderDate'
            FROM
                @order.nodes('/Order/Item') AS o(i)) AS source
                 ([itemId]
                 ,[itemName]
                 ,[itemQuantity]
                 ,[itemOrderDate])
        ON (target.[itemId] = source.[itemId] AND target.[customerId] = @customerId)
        WHEN MATCHED THEN
            UPDATE SET
                 target.[itemQuantity]        = source.[itemQuantity]
                ,target.[ModifiedOn]        = GETUTCDATE()
        WHEN NOT MATCHED THEN
            INSERT
                ([customerId]
                ,[itemId]
                ,[itemName]
                ,[itemQuantity]
                ,[itemOrderDate])
            VALUES
                (@customerId
                ,source.[itemId]
                ,source.[itemName]
                ,source.[itemQuantity]
                ,source.[itemOrderDate]);   

    COMMIT TRAN
    RETURN
END TRY

BEGIN CATCH   
    ROLLBACK TRAN
    SELECT @ErrMsg = ERROR_MESSAGE(), @ErrSeverity = ERROR_SEVERITY()
    RAISERROR(@ErrMsg, @ErrSeverity, 1)
    RETURN
END CATCH
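
For reference, here's a hypothetical call showing the XML shape the .nodes('/Order/Item') query expects (element names come from the procedure; the values are made up):

    DECLARE @xml XML =
    N'<Order>
        <Item>
            <Identifier>6F9619FF-8B86-D011-B42D-00C04FC964FF</Identifier>
            <Name>Widget</Name>
            <Quantity>3</Quantity>
            <OrderDate>2012-05-08</OrderDate>
        </Item>
    </Order>'

    EXEC [dbo].[UpdateOrAddOrder]
        @emailAddress = N'someone@example.com'
        ,@order = @xml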

Tuesday, 08 May 2012 10:54:16 (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
# Tuesday, 01 May 2012

I’m working on a project that is deployed to Windows Azure; we’re using Visual Studio 2010 as our IDE and Team Foundation Server for ALM (Application Lifecycle Management). Unfortunately, the team didn’t have much time before I joined to automate unit tests or the deployment process. One of the best ways to get support, or better support, for products coming out of Redmond is to submit suggestions through uservoice.com, and I’d appreciate any support you can provide for deeper integration with Azure. Until then, I will be attempting to automate testing and deployment for this project with Visual Studio 2010 and migrating over to Visual Studio 11 in the future.

Tuesday, 01 May 2012 13:04:58 (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  | 
# Monday, 30 April 2012

I decided to go with dasBlog since it was supported by my hosting provider and didn't require a SQL database. For the next few weeks I anticipate tweaks to the configuration and the site. I don't anticipate very many users or comments, so any suggestions I receive are likely to get serious consideration.

I've been reading Scott Hanselman's blog recently and attempting to follow some of his suggestions on how to have a good blog.

I am modifying the blogroll to include only blogs that I actually follow. I left a few in from the default dasBlog install, but I removed the ones that looked inactive, and I may remove a few more as time passes. My intention is to keep only the ones that I actually read.

I set up Akismet to help with spam and disabled the captcha, so I'm hoping I won't have to do too much pruning of comments.

I installed Windows Live Writer so that I can do a better job of not messing up the markup (like I already did by forgetting to close a `<p>` tag).

Monday, 30 April 2012 10:43:46 (Pacific Daylight Time, UTC-07:00)  #    Disclaimer  |  Comments [0]  |