Posts

SSIS Expression Language and Dynamic Property Expressions

Introduction

This post is part of a series titled An Introduction to the SSIS Expression Language.

In this post, I demonstrate the SSIS Expression Language and Dynamic Property Expressions.

To build the demo project described in this article you will need SSIS 2005 and the AdventureWorks sample database (which can be downloaded at http://www.codeplex.com/MSFTDBProdSamples/Release/ProjectReleases.aspx?ReleaseId=4004).

Build The Demo Project

If you have not already done so, create an SSIS project named ExpressionLanguageDemo.

Once the project is created, open Solution Explorer and rename Package.dtsx – change the name to DynamicProperties.dtsx. If you’ve already created the project, right-click the SSIS Packages folder in Solution Explorer and click New Package – and rename that package to DynamicProperties.dtsx. When prompted, rename the package object as well. (I think you should always do this – I always answer this message box by clicking the Yes button.)

A New Source

Before we begin, connect to your instance of SQL Server – I’m using (local) – and execute the following T-SQL statement:


use master
go

if not exists(select name
              from sys.databases
              where name = 'TestSource')
 Create Database TestSource
go

use TestSource
go

if not exists(select name
              from sys.schemas
              where name = 'Person')
 begin
  declare @Sql varchar(100)
  set @Sql = 'create schema Person'
  exec(@Sql)
 end
go

if not exists(select name
              from sys.tables
              where name = 'Contact')
 begin
  select top 1000
   ContactID
  ,Title
  ,FirstName
  ,MiddleName
  ,LastName
  ,EmailAddress
  into Person.Contact
  from AdventureWorks.Person.Contact
 end
else
 begin
  truncate table Person.Contact
  insert into Person.Contact
  select top 1000
   ContactID
  ,Title
  ,FirstName
  ,MiddleName
  ,LastName
  ,EmailAddress
  from AdventureWorks.Person.Contact
 end

This script creates a new database with a table named Person.Contact, which it populates from the AdventureWorks database. We’ll use this second data source to demonstrate a cool use of dynamic property expressions.

Manage the Connection

If one doesn’t already exist, add an OLE DB Connection Manager to the AdventureWorks database as in previous posts in this series. Rename the (local).AdventureWorks Connection Manager to MySource:

Click MySource and press the F4 key to display Properties. Highlight the ConnectionString property value and copy it to the clipboard.

Right-click anywhere in the white-space of the Control Flow and click Variables. Create a new variable named ConStr. Check to make sure the Scope of the variable is DynamicProperties (the package) and set the Data Type to String. In the Value textbox, paste the contents of the clipboard (the connection string of the MySource connection manager).

Back To Our Regularly Scheduled Flow…

Drag an Execute SQL Task onto the Control Flow and double-click the Execute SQL Task to open the editor. Set the Connection property to MySource.

Leave the SQLSourceType property set to Direct input and enter the following T-SQL statement into the SQLStatement property:


Select Count(*)
From Person.Contact

 

Set the ResultSet property to Single Row:

Click the Result Set page and then click the Add button. Set the Result Name to 0. Drop down the Variable Name column and click <New Variable>:

When the Add Variable form displays, set the Name to CountVal, Value Type to Int32, and the Value to 0:

Click OK to close the Add Variable form:

Click OK again to close the editor.

Follow the Script

Drag a Script Task onto the Control Flow and connect a precedence constraint (green arrow) from the Execute SQL Task to the Script Task. Double-click the Script Task to open the editor. Click the Script page and set the ReadOnlyVariables property to CountVal. Set the ReadWriteVariables property to ConStr:

Click the Design Script button to open the Visual Studio for Applications (VSA) editor.

Replace the code with the following:


Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain

 Public Sub Main()

  Dim sConStr As String = Dts.Variables("ConStr").Value.ToString
  Dim iCountVal As Integer = Convert.ToInt32(Dts.Variables("CountVal").Value)
  Dim sMsg As String

  sMsg = "ConnectionString: " & sConStr
  sMsg = sMsg & vbCrLf & "Count: " & iCountVal.ToString

  MsgBox(sMsg)

  Dts.Variables("ConStr").Value = Strings.Replace(sConStr, "AdventureWorks", "TestSource")

  Dts.TaskResult = Dts.Results.Success

 End Sub

End Class

 

This script reads the value of the ConStr and CountVal variables, then displays them in a message box. Finally, the script changes the value of the ConStr variable, replacing the AdventureWorks database with the TestSource database. We’ll use this change in a bit.

Close the VSA editor and click the OK button to close the Script Task editor.

Go Back, Jack, Do It Again…

Copy the Execute SQL Task and paste it in the Control Flow. Connect the Script Task to the pasted Execute SQL Task (Execute SQL Task 1).

Drag another Script Task onto the Control Flow. Connect a precedence constraint from Execute SQL Task 1 to Script Task 1 and double-click the new Script Task to open the editor. As before, add CountVal to the ReadOnlyVariables property and ConStr to the ReadWriteVariables property. Click the Design Script button and paste the same code as before into this Script Task.

I can hear you thinking: “Andy, why not just copy and paste the Script Task?” That’s an excellent question! Script Tasks are very fickle on the clipboard. On 32-bit systems, you get better results. On x64, it gets ugly. The task fails with the error:

Error: Precompiled script failed to load. The computer that is running the Script Task does not have a compiler capable of recompiling the script. To recompile the script, see the Microsoft Knowledge Base article, KB931846 (http://go.microsoft.com/fwlink/?LinkId=81885).

Ugly.

One Last Thing

Ok. We’re counting the rows in the AdventureWorks.Person.Contact table and displaying that value in a message box. Then, we’re altering the connection string contained in the ConStr variable to point from AdventureWorks to TestSource, and then re-executing the Count query. Then we’re displaying the ConStr variable value and the Count value again.

We’re almost done. Here’s where we use dynamic property expressions.

Click the MySource Connection Manager and press the F4 key to display the properties. Click the Expressions property, then the ellipsis (…) button in the value textbox. The Property Expressions Editor form displays. Click the Property dropdown and select the ConnectionString property. In the Expression textbox, type @ConStr.

This maps the value contained in the ConStr variable into the ConnectionString property of the MySource Connection Manager – dynamically. Change the variable value and the ConnectionString changes. It’s that simple – and that cool.
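To see what’s happening under the hood, here’s a sketch of the swap the Script Task performs on the ConStr variable. The exact provider, server name, and other keywords depend on your environment – these values are illustrative:

```
Before: Data Source=(local);Initial Catalog=AdventureWorks;Provider=SQLNCLI.1;Integrated Security=SSPI;
After:  Data Source=(local);Initial Catalog=TestSource;Provider=SQLNCLI.1;Integrated Security=SSPI;
```

Because the ConnectionString property is now mapped to the ConStr variable, the second Execute SQL Task counts rows in TestSource instead of AdventureWorks.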

Test It!

Execute the package in debug mode to view the results:

Conclusion

Changing a connection string dynamically is but one use of Dynamic Property Expressions, albeit a very powerful use.

:{> Andy

SSIS Expression Language and the Conditional Split Transformation

Introduction

This post is part of a series titled An Introduction to the SSIS Expression Language.

In this post, I demonstrate the SSIS Expression Language and the Conditional Split Transformation.

To build the demo project described in this article you will need SSIS 2005 and the AdventureWorks sample database (which can be downloaded at http://www.codeplex.com/MSFTDBProdSamples/Release/ProjectReleases.aspx?ReleaseId=4004).

Build The Demo Project

If you have not already done so, create an SSIS project named ExpressionLanguageDemo.

Once the project is created, open Solution Explorer and rename Package.dtsx – change the name to ConditionalSplit.dtsx. If you’ve already created the project, right-click the SSIS Packages folder in Solution Explorer and click New Package – and rename that package to ConditionalSplit.dtsx. When prompted, rename the package object as well. (I think you should always do this – I always answer this message box by clicking the Yes button.)

Drag a Data Flow Task onto the Control Flow and click the Data Flow tab to edit. Drag an OLE DB Source onto the canvas. Double-click the OLE DB Source to open the editor. As in the post entitled SSIS Expression Language and Variables, click the New button to create a new Connection Manager to the AdventureWorks database.

Set the Data Access Mode to SQL Command and enter the following T-SQL statement into the SQL Command Text textbox:


Select
ContactID
 ,Title
 ,FirstName
 ,MiddleName
 ,LastName
 ,EmailAddress
From Person.Contact

 

Click OK to close the editor.

On One Condition…

Drag a Conditional Split transformation onto the Data Flow canvas and connect the OLE DB Source Adapter to it:

Double-click the Conditional Split to open the editor. Expand the NULL Functions folder in the Operations list (upper right). Drag an ISNULL function into the Condition column of the first row in the Conditions grid. Expand the Columns folder in the Values list (upper left). Click and drag the Title column from the list onto the <<expression>> placeholder in the ISNULL expression:

The default name for a Condition is “Case n” where n is the row number of the Condition in the Condition grid.

Next, drag and drop the ContactID column into the second row’s Condition column. Complete the expression so that it reads: ContactID <= 5000. Rename the Outputs to NullTitles and SmallContactIDs respectively:

What we’ve done here is define a couple of outputs. One of the outputs will contain rows where the Title is NULL. The other will contain – this is important, pay attention – rows where the Title is not NULL and the ContactID is less than or equal to 5000. Why is this? It’s because rows with NULL Titles are redirected to the NullTitles output first. The ContactID value of these rows is never evaluated to see if it’s less than or equal to 5000. If neither condition applies to a row, that row is sent to the Conditional Split Default Output. This operates a lot like a Switch statement in C# or a Select Case statement in VB, with the Default Output acting like the Else branch. You can adjust the order of condition evaluation using the spinner buttons on the right:
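Summarized in the SSIS Expression Language, the two conditions as configured above are (evaluated top to bottom):

```
NullTitles:       ISNULL(Title)
SmallContactIDs:  ContactID <= 5000
```

Any row matching neither expression lands on the Conditional Split Default Output.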

Click OK to close the Conditional Split editor.

Terminate It!

Drag a Multicast transformation onto the Data Flow task surface. We’ll use the Multicast to terminate a Data Flow Path. Drag a Data Flow Path from the Conditional Split transformation to the Multicast. When you do this, you’ll be prompted to select an output from the Conditional Split to connect to the Multicast input:

After selecting an output, click OK to close the dialog:

The title of the output you selected appears in the label.

Conclusion

Expressions are used to branch data row flow inside the Data Flow Task with the Conditional Split transformation.

:{> Andy

SSIS Expression Language and the Derived Column Transformation

Introduction

This post is part of a series titled An Introduction to the SSIS Expression Language.

The SSIS Expression Language is one of the steeper slopes on the SSIS learning curve. Many database professionals have never been exposed to the syntax. In this series, I provide examples that demonstrate common uses for the SSIS Expression Language.

The examples are built in SSIS 2005. Why? Not everyone has access to SSIS 2008 at this time, and the examples work in both platforms.

Syntax

Books Online has some good information about the SSIS Expression Language.  This is an excellent place to start.

For starters, the syntax shares similarities with what I call “curly-brace” languages: C++, C#, Java, etc. It’s obvious (to me at least) from the “=” operators:

  • == for comparison
  • = for assignment

Literals

Literals often prove challenging for folks unfamiliar with the syntax. Of particular interest are escape characters. The two escape sequences I’m most often asked about are double quotes and the backslash. As in curly-brace languages, the backslash is the escape indicator. This makes Windows paths loads of fun in SSIS.

For example, you think you’re storing the path C:\results\new\andysdata.csv, but what you’re telling SSIS is C: [Carriage Return] esults [New line] ew [Alert] ndysdata.csv. Yeah. That’s gonna make for some interesting error messages! So how to address it? C:\\results\\new\\andysdata.csv works, as two backslashes together are the escape sequence for a single backslash.

If you want to include double-quotes in an expression, the escape sequence is a backslash followed by a double quote: \".
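Putting the two escape sequences together, here are a couple of string literals and what they evaluate to (sketches, consistent with the escape rules above):

```
"C:\\results\\new\\andysdata.csv"     evaluates to:  C:\results\new\andysdata.csv
"He said, \"Use SSIS!\""              evaluates to:  He said, "Use SSIS!"
```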

Casting

Casting is also unintuitive to the uninitiated. SSIS implicit conversions throw me every time. SSIS has its own set of data types, and the different data providers have their set of data types. Some interesting things happen where these data types meet.

To cast to a string (or SQL Server varchar) data type, preface the value with (DT_STR, [len], [code page]). For example, if you want to cast the integer 42 to a 2-character, en-us string; the expression is (DT_STR, 2, 1252) 42.
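A few more cast sketches in the same spirit – the lengths, precision, scale, and code page here are illustrative, so adjust them to your data:

```
(DT_STR, 2, 1252) 42           integer 42 as a 2-character ANSI string (code page 1252)
(DT_WSTR, 30) GETDATE()        current date/time as a 30-character Unicode string
(DT_I4) "42"                   the string "42" as a 4-byte signed integer
(DT_NUMERIC, 10, 2) "1234.56"  the string "1234.56" as a numeric with precision 10, scale 2
```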

Conclusion

There's more to come in this series. I hope you enjoy it!

:{> Andy

SSIS Expression Language and Variables

Introduction

This post is part of a series titled An Introduction to the SSIS Expression Language.

In this post, I demonstrate the SSIS Expression Language and Variables.

To build the demo project described in this article you will need SSIS 2005 and the AdventureWorks sample database (which can be downloaded at http://www.codeplex.com/MSFTDBProdSamples/Release/ProjectReleases.aspx?ReleaseId=4004).

Build The Demo Project

If you have not already done so, create an SSIS project named ExpressionLanguageDemo.

Once the project is created, open Solution Explorer and rename Package.dtsx – change the name to Variables.dtsx. When prompted, rename the package object as well. (I think you should always do this – I always answer this message box by clicking the Yes button.)

Make Some Variables 

Click the SSIS dropdown menu and select Variables.

When the Variables window displays, click the Add Variable button (first button on the left) to create a new variable with the following properties:

  • Name: SQL
  • Scope: Variables
  • Data Type: String
  • Value: Select 1 As One 

 

I included a syntactically-correct Transact-SQL statement as the default value. Why? Later I plan to use this statement in an Execute SQL Task. I chose to include a valid default to avoid validation warnings and errors.

Validation: SSIS provides design-time and run-time validation of components and settings. In general, this is a good thing as it catches real and potential errors before the SSIS package is executed. SSIS also provides a means of deferring validation until run-time via the DelayValidation property.

Add Some Tasks

Drag a Script Task onto the Control Flow and double-click to open the editor. If you’re using SSIS 2008, set the ScriptLanguage property on the Script page to Microsoft Visual Basic 2008. (If you’re using SSIS 2005 you have no other option.) Add SQL to the list of ReadOnlyVariables and click the Design Script (Edit Script in SSIS 2008) button to open the script engine editor.

Replace the code in Public Sub Main() with the following:

Public Sub Main()
  Dim sSQL As String = Dts.Variables("SQL").Value.ToString
  MsgBox("SQL: " & sSQL)
  Dts.TaskResult = Dts.Results.Success
End Sub


Close the Visual Studio for Applications editor and click OK to close the Script Task editor.

Execute the package and observe the result. You should see a message box displaying the value of the SQL variable:

Just the SQL, Ma’am

Next, add an Execute SQL Task to the Control Flow canvas and connect a Precedence Constraint from the Script Task to the Execute SQL Task. Double-click the Execute SQL Task to open the editor. Leave the ConnectionType property set to OLE DB. Click the dropdown for the Connection property and click “<New connection…>”:

 

When the Configure OLE DB Connection Manager form displays, select a connection to the AdventureWorks database if one exists in your Data Connections list. If not, click the New button. Configure the connection to your server – I use (local) for my default local instance – and the AdventureWorks database:

Click OK until you return to the Execute SQL Task editor. You have three options for the SQLSourceType property: Direct input, File Connection, and Variable. There is a limit to the number of characters you can enter using direct input. I’m not sure but I think it’s around 4,000 or so. I’ve hit this limit once – and in a big way: the client required 4 MB of dynamic SQL. I would add italics to that statement if it didn’t make it look so funny. After unsuccessfully lobbying for a better approach, I managed the dynamic SQL in a script task that wrote it to a file, and used a file connection SQLSourceType to execute it.

For our example, set the SQLSourceType to Variable. Then set the SourceVariable property (which was hidden until you selected the SQLSourceType Variable) to User::SQL.

Click OK to close the editor and execute the package to test. Click OK when the message box displays. The Execute SQL Task should succeed:

 

Make it Dynamic

First, we’ll add some more variables and build a dynamic SQL statement the old school way.

Create the following package-scoped String data type variables [with default values]:

  • SelectClause [SELECT Title, FirstName, LastName, EmailAddress]
  • FromClause [FROM Person.Contact]
  • WhereClause [WHERE LastName IN ('Smith','Jones')]

Open the Script Task editor and navigate to the Script page. Move the SQL variable from the ReadOnlyVariables property to the ReadWriteVariables property. Add SelectClause, FromClause, and WhereClause to the ReadOnlyVariables property:

Click the Design Script button and replace the code in Public Sub Main() with the following:


Public Sub Main()
  Dim sSelect As String = Dts.Variables("SelectClause").Value.ToString
  Dim sFrom As String = Dts.Variables("FromClause").Value.ToString
  Dim sWhere As String = Dts.Variables("WhereClause").Value.ToString
  Dim sSQL As String = sSelect & " " & sFrom & " " & sWhere
  Dts.Variables("SQL").Value = sSQL
  MsgBox("SQL: " & sSQL)
  Dts.TaskResult = Dts.Results.Success
End Sub

Close the VSA editor and click OK to close the Script Task editor. Execute the package to examine the results:

Express Yourself!

Let’s look at another way to accomplish the same result, this time using expressions.

First, edit the Script Task Public Sub Main() code to read:

Public Sub Main()
  Dim sSQL As String = Dts.Variables("SQL").Value.ToString
  MsgBox("SQL: " & sSQL)
  Dts.TaskResult = Dts.Results.Success
End Sub

This returns the script task functionality to simply displaying the value of the SQL variable.

In the Variables window, click on the SQL variable and press the F4 key to display the properties of the SQL variable. Change the EvaluateAsExpression property to True and enter the following expression in the Expression property:

@SelectClause + " " + @FromClause + " " + @WhereClause

This changes the way the SQL variable works. It no longer contains the value specified in the Value column of the Variables window. Instead, the value of the SQL variable is determined by the expression, which contains the SelectClause, FromClause, and WhereClause variables.
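With the default values assigned to the three variables earlier, the expression evaluates to:

```
SELECT Title, FirstName, LastName, EmailAddress FROM Person.Contact WHERE LastName IN ('Smith','Jones')
```

Change any one of the clause variables and the SQL variable picks up the new value the next time it is read.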

Conclusion

The SSIS Expression Language can be used with variables to dynamically set the value of one variable from one or more other variables.

:{> Andy

An Introduction to the SSIS Expression Language

Introduction

The SSIS Expression Language is one of the steeper slopes on the SSIS learning curve. Many database professionals have never been exposed to the syntax. In this series, I provide examples that demonstrate common uses for the SSIS Expression Language.

The examples are built in SSIS 2005. Why? Not everyone has access to SSIS 2008 at this time, and the examples work in both platforms.

Syntax

Books Online has some good information about the SSIS Expression Language.  This is an excellent place to start.

For starters, the syntax shares similarities with what I call “curly-brace” languages: C++, C#, Java, etc. It’s obvious (to me at least) from the “=” operators:

  • == for comparison
  • = for assignment

Literals

Literals often prove challenging for folks unfamiliar with the syntax. Of particular interest are escape characters. The two escape sequences I’m most often asked about are double quotes and the backslash. As in curly-brace languages, the backslash is the escape indicator. This makes Windows paths loads of fun in SSIS.

For example, you think you’re storing the path C:\results\new\andysdata.csv, but what you’re telling SSIS is C: [Carriage Return] esults [New line] ew [Alert] ndysdata.csv. Yeah. That’s gonna make for some interesting error messages! So how to address it? C:\\results\\new\\andysdata.csv works, as two backslashes together are the escape sequence for a single backslash.

If you want to include double-quotes in an expression, the escape sequence is a backslash followed by a double quote: \".
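Putting the two escape sequences together, here are a couple of string literals and what they evaluate to (sketches, consistent with the escape rules above):

```
"C:\\results\\new\\andysdata.csv"     evaluates to:  C:\results\new\andysdata.csv
"He said, \"Use SSIS!\""              evaluates to:  He said, "Use SSIS!"
```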

Casting

Casting is also unintuitive to the uninitiated. SSIS implicit conversions throw me every time. SSIS has its own set of data types, and the different data providers have their set of data types. Some interesting things happen where these data types meet.

To cast to a string (or SQL Server varchar) data type, preface the value with (DT_STR, [len], [code page]). For example, if you want to cast the integer 42 to a 2-character, en-us string; the expression is (DT_STR, 2, 1252) 42.
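A few more cast sketches in the same spirit – the lengths, precision, scale, and code page here are illustrative, so adjust them to your data:

```
(DT_STR, 2, 1252) 42           integer 42 as a 2-character ANSI string (code page 1252)
(DT_WSTR, 30) GETDATE()        current date/time as a 30-character Unicode string
(DT_I4) "42"                   the string "42" as a 4-byte signed integer
(DT_NUMERIC, 10, 2) "1234.56"  the string "1234.56" as a numeric with precision 10, scale 2
```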

Conclusion

There's more to come in this series. I hope you enjoy it!

:{> Andy

SSIS Or Log Shipping?

SSIS or Log Shipping? A New Analysis

Brian Moran (@briancmoran) and I were meeting with a Linchpin People (@LinchpinPeople) client recently. They had a common complaint. “Our Production system performance is being impacted by reporting queries,” they told us. The client had done their research and concluded a data warehouse is the long-term solution. We agreed.

“But what can we do right now?” they asked. As we talked through possible short-term options, it became clear that Brian and I were advocating different solutions. We were doing so while considering the same constraint – time. Brian was advocating Log Shipping. I was advocating an SSIS solution. But we were both thinking, “How do we get this client out of hot water as soon as possible?”

Considerations and Constraints…

Log shipping can be set up quickly. It allows us to set up a second “warm” (minimally latent) copy of the data that the reporting folks can utilize for a source. There are several other facets to consider but our primary thought was time, and so log shipping made sense.

“What about SSIS?” I thought. We could use SSIS to maintain a warm copy of the data. Further, utilizing the incremental load pattern we could load vital tables every five minutes. The reporting database fed via incremental loaders would also make an excellent staging database for the client’s eventual enterprise data warehouse.

Traditionally it takes roughly two hours per table to write (and test) an incremental load SSIS package. I like to build one SSIS package per table. If the client has 50 tables, I’m quoting 200 hours for SSIS development. We can set up and test log shipping in a matter of days. When time is your top concern, log shipping wins over traditional SSIS hands down.

But we’re not doing traditional SSIS.

SSIS and Business Intelligence Markup Language (Biml)

Biml

Linchpin People’s Data Integration Group uses Mist from Varigence to compose Business Intelligence Markup Language (Biml). Our solution – originally developed by Varigence as a demo – allows us to condense hundreds of hours of development into a few days of work. As Brian and I talked with our customer we realized we could deliver either log shipping or an SSIS solution in the same amount of time.

With the time difference off the table, we could compare the advantages and disadvantages of each solution. Our customer agreed with us – options are good!

:{>

Little Data Remains Important in Healthcare IT

In his article Healthcare’s Big Problem With Little Data, author Dan Munro raises salient points about the state of health-related data. Electronic Health Records (EHR) were promoted as the end-all-be-all solution for the industry – a standardization that, I suppose, many thought would organically and naturally occur, stabilize, and be maintained.

It hasn’t. At least not yet.

My doctor and I speak about this almost every time I visit him. The corporation that operates his practice nowadays seems endlessly locked in cycles of changing billing and EHR systems in search of low-cost compliance and integration. They’ve (literally) spent millions of dollars, and my doctor hates the interfaces forced upon him, and his patients (well, one of them, at least) hate the complexity of the billing and patient records systems. Can’t these systems all just get along?

The result? Higher medical data management costs. I’ll give you one guess who pays these costs.

Munro posits the following in his article:

By at least one estimate (here) there are now about 500 independent EHR vendors.  Out of that large group is a subset of about 400 with at least one customer that has applied for Federal stimulus dollars through the labyrinthine process of meaningful use attestation. That would suggest a “first-cut” of about 100 vendors who made some commitment around certification – but have no reported customers (at least to date). That’s a staggering number of single-purpose software vendors for any industry to support – even bloated healthcare. The simple fact is it can’t. While there have been a few high-profile cases of EHR vendors shutting down, this last week was the first high-profile example of a vendor that was effectively decertified by the Feds for both their “ambulatory” and their “inpatient” EHR products. From the HHS.gov website last Thursday:

“We and our certification bodies take complaints and our follow-up seriously. By revoking the certification of these EHR products, we are making sure that certified electronic health record products meet the requirements to protect patients and providers,” said Dr. Mostashari. “Because EHRMagic was unable to show that their EHR products met ONC’s certification requirements, their EHRs will no longer be certified under the ONC HIT Certification Program.”

You may ask yourself, well, how did we get here? This, folks, is a mess. What’s missing? Applied standards.

“But Andy, you’ve told us standards slow down development!”

And I stand by that statement; standards do slow down development…unless you’re building interfaces. And then standards become the means for decoupled snippets, functions, methods, applications, and even platforms to communicate with each other. In some cases, we simply cannot be productive without standards – like TCP/IP. What would happen if everyone coded their own version of internet traffic? If that were the case, very few of you would be reading this post.

Yes, standards slow things down. And yes, they are necessary to ensure base functionality. In my humble opinion, we have to get this right with healthcare data. We simply must. While we see similar issues of data management across many fields, medical data is too important to mess around with; it’s (often literally) life and death. And the costs are certainly high.

More to Consider

Standards exist. Administering and certifying 400-500 vendor solutions is hard.

Part of the Solution

From the actions of the Department of Health and Human Services last week, one can ascertain HHS is taking steps to address the matter. But will all 400-500 companies voluntarily congeal their schemas? Possibly, but doubtful.

My experience delivering US state Medicaid ETL solutions informs me there will be a need for data integration – regardless of the existence of standards and in spite of certification. Why? Standards are not static. The idea of de facto standards emerges from the life cycle of software because software is organic. Even if everyone agreed on the same interpretation of rigid standards (and they won’t), versions 2.0 through n.n will – at a minimum – add fields to the schema. And with additional fields comes additional data.

Standards will be revised when enough product schemas adopt the de facto, and this will drive the need for yet more integration. Don’t take my word for it, examine the entropic history of ICD-9 and ICD-10 codes – the direction of progress is more data, not less.

Learn More

This is one reason we at Linchpin People are focusing on Medical Data Integration. The recording of our first (free!) webinar about Medical Data Integration with SSIS 2012 is available here. Kent Bradshaw and I continue the series tomorrow, presenting Medical Data Integration with SSIS 2012, Part 2, in which we focus on loading Provider and Drug data.

I hope to see you there!

:{>

From Zero To SSIS Training: 4-8 Mar 2013 in Reston Virginia

Want to learn more about SSIS? I can help. In my course – From Zero To SSIS – I teach you how to effectively use SSIS to deliver data integration solutions.

This is not a market-y survey of the bells and whistles contained in SSIS. I focus on the tasks data integration and ETL developers will use to produce SSIS packages in the enterprise.

How do I know which tasks are used most?

SSIS Design Patterns, Apress, 2012

I have been using SSIS since the early beta versions. For over two years, I managed a team of ETL developers at Unisys as we built Medicaid solutions for two state governments and maintained the Medicaid solution for another state. When we needed more people, I had a hard time finding experienced SSIS developers. I needed a way to bring less-experienced developers up to speed quickly. I found a way to train data integration developers to use SSIS to build enterprise-ready, metadata-framework-driven SSIS applications in less than one week. The experience served as the basis for this course which has been delivered publicly and privately to hundreds of students over the past two years.

You can learn more about – and register for – the course here.

Advance registration is available until 1 Feb 2013: $2,749 USD

Registration after 1 Feb 2013: $2,999 USD

I hope to see you there!

:{>

Using Enterprise Data Integration Dashboards

In this video session, Kent Bradshaw and Andy Leonard demonstrate and discuss Enterprise Data Integration Dashboards designed to capture runtime metrics from an SSIS Framework.

Topics include:

  • Data Integration Architecture
  • SSIS Development
  • Operational Intelligence
  • SSIS Design Patterns
  • Data Integration Performance
  • SSIS Frameworks
  • SSIS Performance Historical Analysis

Watch and learn more about using data integration dashboards to improve visibility into the status of data integration in your enterprise:

Creating SSIS Packages with the SQL Server Import and Export Wizard

Sorry for the Inconvenience

We realize that the images in this post are not displaying properly, and we are working to resolve the issue. Please email info@linchpinpeople.com if you would like to receive a notification or white paper when this blog series has been updated.

In this post, I demonstrate how to use the SQL Server 2012 Import and Export Wizard to create a SQL Server 2012 Integration Services (SSIS 2012) package. The demo is created on a virtual machine running Windows Server 2012.

You can use the SQL Server Import and Export Wizard to learn more about SQL Server Integration Services. Often, one will reach the “Save and Run Package” page just before executing the Wizard. The “Run immediately” checkbox is checked and the “Save SSIS Package” checkbox is unchecked by default. If you check the “Save SSIS Package” checkbox, you can choose to store the SSIS package in the MSDB database (the “SQL Server” option) or the file system. You may also select the Package Protection Level:

If you select these options, the next step is selecting the target location of the package:

The next page of the Import and Export Wizard displays a summary of the selected options:

Clicking the Finish button creates, saves, and executes the SSIS package as configured, and the results are displayed:

Cool. Now what?

Now you have an SSIS package saved in the file system. In this case, the file is named ImportExportWizard.dtsx. It can be opened in SQL Server Data Tools by right-clicking the file and selecting “Edit”:

Windows Server 2012 prompts for which program to use to perform the Edit. I select Microsoft Visual Studio Version Selector:

SQL Server 2012 Data Tools uses the Visual Studio 2010 Integrated Development Environment (IDE) at the time of this writing. Note the “Run” (Play) button is disabled on the toolbar:

We have an SSIS package created with the Import and Export Wizard, but we cannot re-execute it.

Bummer. But all is not lost.

Visual Studio needs a debugger to execute packages in debug mode. When we open an SSIS (*.dtsx) file all by itself, Visual Studio doesn’t load a debugger. To have Visual Studio load the SSIS debugger, we can create a Solution containing a Project. Begin by clicking File, New, Project:

Name the new solution and project, and select a folder:

Once the solution and project are created, right-click the SSIS Packages virtual folder in Solution Explorer, and select “Add Existing Package”:

Navigate to the location where you stored the SSIS package in the final steps of the Import and Export Wizard:

When you click the OK button, the SSIS package is copied into the project folder and added to the new project:

When you open the SSIS package in the context of a Visual Studio project and solution, the Debug button is enabled on the toolbar and the SSIS package can be executed in debug mode:

You can use your knowledge of the Import and Export Wizard to learn more about SSIS package development. Happy Integrating!

:{>