I missed this event. My friend Sajid attended it, and he felt it was great. Today the blog innovative singapore has released the slides that were presented that day. Of course, you could request a recording as well.
More on Microsoft Azure later.
Today I came across a term called Test Double, and I was wondering what it was! I had no idea, though obviously it had something to do with unit tests or FitNesse tests!
Test Double is a generic term for any case where you replace a production object for testing purposes.
There are different types of Test Doubles:
1. Dummy Object
2. Test Stub
3. Test Spy
4. Mock Object
5. Fake Object
One of the common cases where we use a Test Double is when we need to communicate with external services.
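To make the idea concrete, here is a minimal Python sketch of two of the doubles above, using the standard library's `unittest.mock`. The `WeatherReport` class and its client are hypothetical names invented for illustration; the point is that the external-service client is injected, so a test can swap it out.

```python
from unittest.mock import Mock

# Hypothetical production class: it depends on a client that talks
# to an external weather service over the network.
class WeatherReport:
    def __init__(self, client):
        self.client = client  # injected, so a test double can replace it

    def summary(self, city):
        temp = self.client.get_temperature(city)  # network call in production
        return f"{city}: {temp}C"

# Test Stub: returns a canned answer so no real network call is made.
stub = Mock()
stub.get_temperature.return_value = 30
print(WeatherReport(stub).summary("Singapore"))  # → Singapore: 30C

# Mock Object: additionally verifies HOW the collaborator was called.
mock = Mock()
mock.get_temperature.return_value = 30
WeatherReport(mock).summary("Singapore")
mock.get_temperature.assert_called_once_with("Singapore")
```

The stub only supplies data; the mock also asserts on the interaction, which is the usual distinction drawn between the two.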
More information here!
Once in a while I get this query: do you know any tool to generate scripts for the data in an existing table, of course for SQL Server?
Well, there are quite a few options.
If your company is rich enough to sponsor a tool, you could go for Redgate SQLToolBelt. It has quite a nice set of tools to generate scripts from existing or new data, to compare scripts, and so on.
You could also try out the free tool http://www.ssmstoolspack.com, but this works only with SQL Server Management Studio 2005 SP2 or above.
If you are using SQL Server 2008, you could use the built-in option in SSMS 2008. This is a quick run-through to generate INSERT statements for all of the data in your table, using no scripts or add-ins to SQL Server Management Studio 2008:
- Right-click the DATABASE NAME
- Choose TASKS: GENERATE SCRIPTS
- Under Table/View Options, set SCRIPT DATA = TRUE
You will then get the CREATE TABLE statement and all of the INSERT statements for the data straight out of SSMS.
I have been using it for quite some time now. Works perfectly for my requirements!
We changed the existing service account we were using for the TFS build agent to a new service account. After this change, our continuous integration build failed with the following error:
C:\Program Files\MSBuild\Microsoft\VisualStudio\TeamBuild\Microsoft.TeamFoundation.Build.targets(699,5,699,5): error : The working folder [WorkingFolder] is already in use by the workspace [workspace];[domain]\[user] on computer [buildmachine]
Part of the Team Build build process involves creating a workspace that can be used to get sources for the build. This workspace is typically deleted and then created during the course of the build, meaning that after the build the workspace hangs around. So – when you changed the service account, the delete for the next build had nothing to do (since workspaces are owned and the current user didn’t have a workspace to delete) and the create failed, since a workspace already existed in the same location. You’ll just need to delete the old workspace, owned by DEV\tfssetup
I found the above excellent tip on this page.
And the correct command to delete the workspace is as follows:
C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE>tf workspace /delete /server:http://[servername]:8080/ [workspacename];[domainname]\[utcode]
You will need workspace deletion permission to execute the above command.
More information here!
The February UG meeting will be held on 4 February 2010, at Level 22 CF12, NTUC Building, from 1845 till late. Light refreshments will be served before the session.
Software Testing in Visual Studio 2010 and Team Foundation Server 2010
Maung Maung Phyo, MVP (Security)
1900 – late
Learn about how Team Lab Management in TFS 2010 enables better collaboration between software developers and testers. We will show you how testers can use Test and Lab Manager to plan tests, associate requirements, execute test cases, and report bugs with invaluable information, ensuring that developers are able to fix the reported bugs.
This event gives you a valuable opportunity to learn more about how the Microsoft Application Lifecycle Management (ALM) solution can help you reduce cost, reduce development time, increase quality, and improve project manageability.
Our local MVPs (Justin and Michael) are organizing a social gathering on 10 Feb 2010 from 7pm to 10pm @HackerspaceSG (70A Bussorah Street). See Silverlight 4.0 in action and discuss or brainstorm Silverlight 4 projects.
This is an informal social gathering to bring together like-minded developers.
Michael Sync (Silverlight MVP) will be sharing with you "What’s New in Silverlight 4.0" in 15 minutes.
This will be followed by a roundtable session to get to know everyone. We will also share projects and ideas that developers are working on.
All are welcome to just show up! No registration required!
Actually, we were looking for a course like this: Visual Studio Team System Hands-On-Lab. But unfortunately we missed it. Now I have subscribed to the feed, so I won't miss any future courses.
If you guys have any info regarding these kinds of social/technical gatherings or hands-on labs in Singapore, please share it with me.
Of course, when choosing a primary key, most of our decisions would be to use INT (or BIGINT) as the data type and set IDENTITY to true. There is another solution: using a GUID [NEWID() in SQL Server].
When comparing INT and GUID as primary keys, we can list a few important differences:
Pros of using GUID
- A GUID will be unique throughout the database (and across databases), while an INT is unique only within its table
- @@IDENTITY can be a problem when doing INSERTs through triggers: it returns the last identity created on the current connection, which may be the one created by the trigger rather than by your own statement. [This can be solved using SCOPE_IDENTITY (see here for an excellent explanation of this!)]
- A GUID can be created as the primary key before the row is inserted into the database
- An INT may not be big enough for some scenarios, even if you start from the smallest negative value; a BIGINT can be a solution here, though
- Using INT can be a real nightmare when doing a manual merge of tables
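The third point above, generating the key before the insert, is easy to show. A minimal Python sketch (the table/column names are made up for illustration): with an IDENTITY INT you must insert first and then ask the database what value it assigned, whereas a GUID can be minted client-side and shared immediately.

```python
import uuid

# Generate the primary key on the client, before the row ever
# reaches the database.
order_id = uuid.uuid4()

# The key can now be used for related rows (foreign keys) or returned
# to the caller at once; no round trip to ask the database which
# identity it assigned, as you would need with an IDENTITY column.
order_row = {"id": str(order_id), "item": "book"}
line_row = {"order_id": str(order_id), "qty": 2}  # FK known up front
```

`uuid.uuid4()` produces the same kind of random (version 4) GUID that SQL Server's NEWID() does.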
Cons of using GUID
- In theory GUIDs present a risk of duplicates, but in practice it is negligible. In recent Windows versions (from Windows 2000 onwards), the algorithm to generate GUIDs no longer uses the MAC address of the network card; instead the value is essentially random.
- A GUID is four times larger than an INT (16 bytes versus 4), which is a significant storage and index overhead.
- There are portability problems; not all databases have GUID as a data type.
Evolution of the COMB GUID
So apparently the overhead is the main problem here. When Jimmy Nilsson did real inserts using GUIDs as primary keys, he found that inserts with GUIDs took about 30 times longer than inserts with INTs. That happened for two reasons:
1. The generated GUIDs lacked order; they were random. As you can see in the GUIDs below, only the '4' (the version digit) is common:
C87FC84A-EE47-47EE-842C-29E969AC5131
2A734AE4-E0EF-4D77-9F84-51A8365AC5A0
70E2E8DE-500E-4630-B3CB-166131D35C21
15ED815C-921C-4011-8667-7158982951EA
56B3BF2D-BDB0-4AFE-A26B-C8F59C4B5103
2. Each SQL INSERT forces the index to be reordered, and this takes a lot of time; SQL Server's ordering of GUIDs depends on the last bytes.
Therefore Jimmy Nilsson arrived at a new algorithm, and his new GUIDs are called COMB GUIDs: GUIDs that combine a random part with a sequential, time-based part.
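The idea can be sketched in a few lines of Python. This is not Nilsson's exact algorithm (he packs a SQL Server DATETIME into the tail); it is a simplified illustration under the assumption that SQL Server compares uniqueidentifier values starting from the last byte group, so putting a timestamp there makes new keys sort roughly in creation order.

```python
import uuid
import time
import struct

def comb_guid():
    """A COMB-style GUID: random bytes up front, a millisecond
    timestamp in the last 6 bytes.

    Because SQL Server weighs the last byte group most heavily when
    ordering uniqueidentifiers, new COMB values land near the end of
    the index instead of forcing random page splits.
    """
    rand = uuid.uuid4().bytes[:10]         # 10 random bytes for uniqueness
    millis = int(time.time() * 1000)       # current time in milliseconds
    stamp = struct.pack(">q", millis)[2:]  # low 6 bytes, big-endian
    return uuid.UUID(bytes=rand + stamp)

a = comb_guid()
b = comb_guid()
# b's timestamp tail is >= a's, so in the tail they sort in creation order.
```

In SQL Server 2005 and later, NEWSEQUENTIALID() offers a built-in server-side alternative with the same goal.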
His presentation in Belgium!
I sent the above photo as a mail to some of my colleagues; surprisingly, most of them didn't know why everyone was wearing a red shirt! Gu always wears a red shirt during presentations.
Anyway, there are loads of presentation slides and demos on Visual Studio 2010 and the .NET Framework 4.0 at the above link!
PS: I wear contact lenses every day; but man, no one gives a damn about that!
PS2: Man, he is just 34?