Processing The Same Row More Than Once
By Allan Mitchell

In his August 2002 SQL Server Magazine article, Itzik Ben-Gan shows us how to use Transact-SQL to normalise lab test results from a fictional wine maker, using either user-defined functions in SQL Server 2000 or Transact-SQL gymnastics in version 7.0. In this article we're going to show you how to do the exact same thing, but using DTS. The added advantage of using DTS is that it is not tied to any one connection type.

The Situation

The scenario revolves around taking temperature readings from sites. The data is entered into the source system in a highly denormalised form, and it's our job, using DTS, to get it into our normalised structure. Listing 1 shows the source table structure with sample data and Listing 2 shows the normalised target table design.

Listing 1 Source.

CREATE TABLE [TempSamplesBad] (
	[siteID] [int] IDENTITY (1, 1) NOT NULL ,
	[Readings] [varchar] (50)
)
INSERT TempSamplesBad(Readings) VALUES('1,9,-6,3,5,7')
INSERT TempSamplesBad(Readings) VALUES('34,39,10,13,51,27')
INSERT TempSamplesBad(Readings) VALUES('1,21,2,9,7,78')

Listing 2 Destination.

CREATE TABLE [TempSamplesGood] (
	[siteID] [int] NOT NULL ,
	[ReadingID] [int] NOT NULL ,
	[Reading] [int] NULL
)

The way we process the same row more than once in a DTS package is by setting the return status of the transformation to DTSTransformStat_SkipFetch. Listing 3 shows how we do this in an ActiveX script transformation in a data pump.

Listing 3 Code.

Option Explicit 
Function Main()

'This will form an array which we use to determine when to keep on the same row or 
'move to the next 

DIM arr

'Populate the array 

arr = SPLIT(DTSSource("Readings"),",")

'We create a Global variable to hold our position within the array and for each processing 
'of the row we need to check the value.   

IF Cint(DTSGlobalVariables("ReadingID").Value) <= Ubound(arr)  THEN 

'Here is the insert of an element in the array based on the global variable value.
DTSDestination("siteID") = DTSSource("siteID")
DTSDestination("ReadingID") = Cint(DTSGlobalVariables("ReadingID").Value) + 1
DTSDestination("Reading") = arr(Cint(DTSGlobalVariables("ReadingID").Value))

'This is the key to the whole thing. Setting the return value to DTSTransformStat_SkipFetch 
'forces DTS to reprocess the same row.

Main = DTSTransformStat_SkipFetch

'Increase the global variable value so we can pick up the next array element the 
'next time we process the row

DTSGlobalVariables("ReadingID").Value = Cint(DTSGlobalVariables("ReadingID").Value) + 1 


ELSE

'If our global variable has passed the highest element in the array then we reset it, skip
'the insert and move on to the next row. We set our return status to
'DTSTransformStat_SkipInsert otherwise DTS would insert a NULL row in our table

DTSGlobalVariables("ReadingID").Value = 0
Main = DTSTransformStat_SkipInsert

END IF


End Function
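To make the control flow of Listing 3 easier to follow outside of DTS, here is a small Python sketch (purely an illustration, not part of the package) that mimics the skip-fetch loop over the sample data from Listing 1. The `reading_id` variable plays the role of the ReadingID global variable, and the `if`/`else` mirrors the IF in the ActiveX script.

```python
# Simulate the DTS skip-fetch loop over the sample rows from Listing 1.
# Each source row is "re-fetched" until every element of its Readings
# array has produced one normalised destination row.

source_rows = [
    (1, "1,9,-6,3,5,7"),
    (2, "34,39,10,13,51,27"),
    (3, "1,21,2,9,7,78"),
]

def normalise(rows):
    destination = []
    reading_id = 0                  # plays the role of DTSGlobalVariables("ReadingID")
    row_iter = iter(rows)
    row = next(row_iter, None)      # first fetch
    while row is not None:
        site_id, readings = row
        arr = readings.split(",")   # same as SPLIT(DTSSource("Readings"), ",")
        if reading_id <= len(arr) - 1:
            # Insert one element of the array, then "SkipFetch":
            # stay on the same row and bump the position.
            destination.append((site_id, reading_id + 1, int(arr[reading_id])))
            reading_id += 1
        else:
            # "SkipInsert": reset the position and fetch the next row.
            reading_id = 0
            row = next(row_iter, None)
    return destination

print(normalise(source_rows)[:3])
# → [(1, 1, 1), (1, 2, 9), (1, 3, -6)]
```

Each of the three source rows yields six destination rows, exactly as the data pump would populate TempSamplesGood.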

We could extend this model further to process a different column on each iteration of the transformation, which would come in handy if you had a table with columns such as Product1, Product2 and Product3.
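That column-per-iteration variant can be sketched the same way. The column names (orderID, Product1, Product2, Product3) are hypothetical, and this Python is again only an illustration of the row-multiplying idea, not DTS code: on each "re-fetch" of the same source row we copy a different source column into the destination.

```python
# Sketch of the column-per-iteration variant: instead of splitting one
# delimited column, each pass over the row emits the value of a
# different source column. Column names here are hypothetical.

source_rows = [
    {"orderID": 10, "Product1": "apples", "Product2": "pears", "Product3": "plums"},
    {"orderID": 11, "Product1": "malbec", "Product2": "shiraz", "Product3": "merlot"},
]

def unpivot(rows, columns=("Product1", "Product2", "Product3")):
    destination = []
    for row in rows:                        # fetch each source row once...
        for i, col in enumerate(columns):   # ...but emit one row per column
            destination.append((row["orderID"], i + 1, row[col]))
    return destination

print(unpivot(source_rows))
# → [(10, 1, 'apples'), (10, 2, 'pears'), (10, 3, 'plums'),
#    (11, 1, 'malbec'), (11, 2, 'shiraz'), (11, 3, 'merlot')]
```

In the DTS version the inner loop would again be driven by a global variable and DTSTransformStat_SkipFetch, with the variable selecting which column to read on each pass.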