<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:wfw="http://wellformedweb.org/CommentAPI/" xmlns:dc="http://purl.org/dc/elements/1.1/" >

<channel><title><![CDATA[Microsoft Data & AI - SSIS]]></title><link><![CDATA[https://www.delorabradish.com/ssis]]></link><description><![CDATA[SSIS]]></description><pubDate>Fri, 13 Jun 2025 06:49:34 -0700</pubDate><generator>Weebly</generator><item><title><![CDATA[SSIS 2016: An efficient way to handle transform and load of SCD2 (type 2 slowly changing dimensions)]]></title><link><![CDATA[https://www.delorabradish.com/ssis/ssis-2016-an-efficient-way-to-handle-transform-and-load-of-scd2-type-2-slowly-changing-dimensions]]></link><comments><![CDATA[https://www.delorabradish.com/ssis/ssis-2016-an-efficient-way-to-handle-transform-and-load-of-scd2-type-2-slowly-changing-dimensions#comments]]></comments><pubDate>Thu, 24 Aug 2017 01:12:25 GMT</pubDate><category><![CDATA[SCD2]]></category><category><![CDATA[SSIS]]></category><guid isPermaLink="false">https://www.delorabradish.com/ssis/ssis-2016-an-efficient-way-to-handle-transform-and-load-of-scd2-type-2-slowly-changing-dimensions</guid><description><![CDATA[Quick Review:This blog post is about type two slowly changing dimensions (SCD2).&nbsp; This is when an attribute change in row 1 results in SSIS expiring the current row and inserting a new dimension table row like this --&gt;         &nbsp;SSIS comes packaged with a SCD2 task, but just because it works, does not mean that we should use it.&nbsp; Think of the pre-packaged Microsoft supplied SCD2 task as a suggestion.&nbsp; If we really want to get the job done, we will want to use Cozy Roc or Pr [...] 
]]></description><content:encoded><![CDATA[<div class="paragraph"><strong><font color="#c2743b">Quick Review:</font></strong><br /><font color="#2a2a2a">This blog post is about type two slowly changing dimensions (SCD2).&nbsp; This is when an attribute change in row 1 results in SSIS expiring the current row and inserting a new dimension table row like this --&gt;</font></div>  <div><div class="wsite-image wsite-image-border-none " style="padding-top:10px;padding-bottom:10px;margin-left:0;margin-right:0;text-align:center"> <a> <img src="https://www.delorabradish.com/uploads/5/3/4/3/53431729/published/scd2.png?1503537298" alt="Picture" style="width:568;max-width:100%" /> </a> <div style="display:block;font-size:90%"></div> </div></div>  <div class="paragraph"><font color="#2a2a2a">&nbsp;SSIS comes packaged with a SCD2 task, but just because it works, does not mean that we should use it.&nbsp; Think of the pre-packaged Microsoft supplied SCD2 task as a suggestion.&nbsp; If we really want to get the job done, we will want to use Cozy Roc or <u><a href="http://pragmaticworks.com/" target="_blank">Pragmatic Works</a> <a href="https://pragmaticworks.com/Products/Task-Factory" target="_blank">Task Factory</a>&nbsp;</u>(TF). 
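For readers new to the pattern, the expire-and-insert logic that any SCD2 tool must implement can be sketched in T-SQL. The table and column names below (edw.DimCustomer, CustomerBK, the Row*DateTime columns) are illustrative assumptions, not the TF component's internals:

```sql
-- SCD2 expire-and-insert for one changed business key (@CustomerBK).
-- Step 1: expire the current version of the row.
UPDATE edw.DimCustomer
SET    RowExpirationDateTime = GETDATE()
WHERE  CustomerBK = @CustomerBK
  AND  RowExpirationDateTime IS NULL;      -- NULL expiration = current row

-- Step 2: insert the new version with the changed attribute values.
INSERT INTO edw.DimCustomer
       (CustomerBK, CustomerName, RowEffectiveDateTime, RowExpirationDateTime)
VALUES (@CustomerBK, @NewCustomerName, GETDATE(), NULL);
```

The screenshots that follow show how the TF data flow distributes this same logic across its insert, upsert, and expire outputs.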
&nbsp; I strongly suggest Task Factory&rsquo;s [<a href="http://help.pragmaticworks.com/taskfactory/Index.html?DimensionMergeSCD.html" target="_blank">Dimension Merge Slowly Changing Dimension</a>] add-in to SSIS for the following reasons:</font><ol><li><font color="#2a2a2a">Performance</font></li><li><font color="#2a2a2a">Ease of use</font></li><li><font color="#2a2a2a">Rich features</font></li></ol> <font color="#2a2a2a">&nbsp;</font><br /><font color="#2a2a2a">The </font><strong><font color="#c2743b">key to success</font></strong><font color="#2a2a2a">, in my case, was two-fold:</font><ol><li><font color="#2a2a2a">Sort the two OleDB source components (#1 and #3) both in the SELECT statement and under the Advanced tab of the OleDB source task.&nbsp;</font></li><li><font color="#2a2a2a">Be sure to choose the right keys in the TF Upsert destinations (#6 and #8).</font></li></ol><br /><font color="#2a2a2a">You can read about additional performance tips in Pragmatic Works' </font><u style="color:rgb(42, 42, 42)"><a href="http://help.pragmaticworks.com/taskfactory/Index.html?DimensionMergeSCD.html" target="_blank">online help</a></u><font color="#2a2a2a">&nbsp;(</font><u style="color:rgb(42, 42, 42)"><a href="http://help.pragmaticworks.com/taskfactory/Index.html?DimensionMergeSCD.html" target="_blank">performance tab</a></u><font color="#2a2a2a">), and they have an instructional video </font><u style="color:rgb(42, 42, 42)"><a href="https://www.youtube.com/watch?v=pBBBNgxcqds" target="_blank">here</a></u><font color="#2a2a2a">.&nbsp; The point of this blog post is to share a screen print that would have been helpful to me the first time I set up this component.</font></div>  <div><div class="wsite-image wsite-image-border-none " style="padding-top:10px;padding-bottom:10px;margin-left:0;margin-right:0;text-align:center"> <a> <img src="https://www.delorabradish.com/uploads/5/3/4/3/53431729/scd2-gui_orig.png" alt="Picture" style="width:auto;max-width:100%" /> </a> <div 
style="display:block;font-size:90%"></div> </div></div>  <div class="paragraph"><font color="#2a2a2a">Now for </font><strong><font color="#c2743b">the dissection</font></strong><font color="#2a2a2a">...</font><ol><li><font color="#2a2a2a">This is your SELECT transform statement from your source system. &nbsp;It must contain an ORDER BY clause. &nbsp;The [OleDB SRC vwDimCustomer] task then must be sorted. &nbsp;</font><span style="color:rgb(42, 42, 42)">Right mouse click --&gt; Show Advanced Editor --&gt; &nbsp;Input and Output Properties tab --&gt; Ole DB Source Output --&gt; IsSorted property = True. &nbsp;On this same tab click on Ole DB Source Output --&gt; Output Columns --&gt; [Your Business Key Column Name] --&gt; SortKeyPosition = 1</span></li><li><font color="#2a2a2a">This is an optional TF component, but best practice is to not allow NULL values, assuming your data warehouse is modeled and optimized for analytics. &nbsp;Although all of the columns in vwDimCustomer have IsNull() functions, all the lookup values will come into the SCD2 task as NULL, so this step replaces NULL values with 'Unknown', '-1', 'N/A', or whatever your preference may be.</font></li><li><font color="#2a2a2a">In order to know if something is new or changed, we need a comparison table, and that is the function of #3. &nbsp;Just like #1, this source data must be sorted. &nbsp;Example: </font><em><font color="#818181">SELECT * FROM edw.DimCustomer ORDER&nbsp;BY CustomerBK.</font></em></li><li><font color="#2a2a2a">This is the linchpin of the SCD2 process. &nbsp;I will leave you to <u><a href="http://help.pragmaticworks.com/taskfactory/Index.html?DimensionMergeSCD.html" target="_blank">read up</a></u> on how to configure the component as I don't wish to rewrite Pragmatic Works Help. &nbsp;One word of advice: Do not rest until all warnings have been corrected. 
&nbsp;Where it is easy to get lost is in configuring the outputs.</font></li><li><font color="#2a2a2a">The easiest of the four: a BK (business key) does not exist. &nbsp;Insert it. &nbsp;This is a traditional OleDB insert, and we don't map the PK (primary key) or SCD2 ExpirationDateTime column. &nbsp;The PK is an identity seed and is taken care of by SQL Server. &nbsp;The SCD2 expiration date should be NULL. &nbsp;If you want an audacious&nbsp;default, like 12/31/9999, then this column must, of course, be mapped.</font></li><li><font color="#2a2a2a">This <u><a href="http://help.pragmaticworks.com/taskfactory/Index.html?Upsert.html" target="_blank">TF Upsert</a></u> takes care of SCD1 rows: rows that do not have a SCD2 change and are therefore updated completely, including historical columns. &nbsp;This is a standard T-SQL UPDATE. &nbsp;My personal preference is to use the TF Upsert Column Compare update method so rows that have no actual change do not get an unnecessary update and a meaningless&nbsp;[RowLastUpdated] timestamp. &nbsp;TF Upsert Column Compare works much like a hash value in many ETL methodologies. &nbsp;"If the row exists, go down the SCD1 pipeline, but don't update the row unless there is an actual change." &nbsp;<strong>Critical&nbsp;success point:</strong> key on the BK, not the PK!</font></li><li><font color="#2a2a2a">We finally get to SCD2 results with #7. &nbsp;This is where new rows are created because of an identified SCD2 column change.</font></li><li><font color="#2a2a2a">When a new row is added, the old row must be expired. &nbsp;This is the functionality of #8. &nbsp;Because we definitely have a change in the row, there is no point in spending time on a TF Upsert Column Compare. 
&nbsp;Set this TF Upsert to bulk update.&nbsp;&nbsp;<strong>Critical success point</strong>: key on the PK, not the BK!</font></li></ol><br /><strong><font color="#c2743b">Conclusion: </font></strong><font color="#2a2a2a">&nbsp;Even having fully understood SCD2 concepts, this TF component took me a little while to configure -- I had to actually think about things vs the rhythmic repetition&nbsp;of common data flow tasks. &nbsp;On the first pass I skipped past the sort of the two source components and, out of habit, picked up the PK in both TF Upsert components. &nbsp;I didn't pay attention to the &lt;New&gt; OleDB mapping and tried to insert my edw.DimCustomer.PK (hello?!). &nbsp;My advice is to SLOW DOWN and get the first SCD2 dimension SSIS package built and aggressively&nbsp;tested, then fall back into the rinse-and-repeat rhythm&nbsp;of SSIS package development.</font><br /><br /><font color="#2a2a2a">If you get stuck, reach out to </font><u style="color:rgb(42, 42, 42)"><a href="http://pragmaticworks.com/Support" target="_blank">Pragmatic Works Product Support</a>.</u><font color="#2a2a2a"> &nbsp;I highly recommend their online chat.</font></div>]]></content:encoded></item><item><title><![CDATA[SSIS 2016, An efficient way to mark deleted source system rows in a persisted data warehouse table]]></title><link><![CDATA[https://www.delorabradish.com/ssis/ssis-2016-an-efficient-way-to-mark-deleted-source-system-rows-in-a-persisted-data-warehouse-table]]></link><comments><![CDATA[https://www.delorabradish.com/ssis/ssis-2016-an-efficient-way-to-mark-deleted-source-system-rows-in-a-persisted-data-warehouse-table#comments]]></comments><pubDate>Wed, 23 Aug 2017 23:44:20 GMT</pubDate><category><![CDATA[SSIS]]></category><guid isPermaLink="false">https://www.delorabradish.com/ssis/ssis-2016-an-efficient-way-to-mark-deleted-source-system-rows-in-a-persisted-data-warehouse-table</guid><description><![CDATA[Even though many people think data warehouse ETLs (extract, transform and load) 
should contain insert data flows only, the vast majority of people I work with also have to deal with updates.&nbsp; Many also have to handle marking data warehouse rows as IsDeleted = "Y"&nbsp;in their ODS and EDW data repositories.&#8203;If you are working with a dimension table with fewer than 500K rows (an estimate), a traditional lookup task might work just fine.&nbsp; The data flow would look like this:        [...] ]]></description><content:encoded><![CDATA[<div class="paragraph"><font color="#2a2a2a">Even though many people think data warehouse ETLs (extract, transform and load) should contain insert data flows only, the vast majority of people I work with also have to deal with updates.&nbsp; Many also have to handle marking data warehouse rows as </font><em style=""><font color="#626262">IsDeleted = "Y"</font><font color="#2a2a2a">&nbsp;</font></em><font color="#2a2a2a">in their ODS and EDW data repositories.</font><br /><font color="#2a2a2a">&#8203;</font><br /><font color="#2a2a2a">If you are working with a dimension table with fewer than 500K rows (an estimate), a traditional lookup task might work just fine.&nbsp; The data flow would look like this:</font></div>  <div><div class="wsite-image wsite-image-border-none " style="padding-top:10px;padding-bottom:10px;margin-left:0;margin-right:0;text-align:center"> <a> <img src="https://www.delorabradish.com/uploads/5/3/4/3/53431729/delete-with-original-dft-1_orig.png" alt="Picture" style="width:auto;max-width:100%" /> </a> <div style="display:block;font-size:90%"></div> </div></div>  <div class="paragraph"><font color="#2a2a2a">(TF = <u><a href="https://pragmaticworks.com/Products/Task-Factory" target="_blank">Task Factory</a></u>. 
&nbsp;This is a great SSIS add-in for a nominal cost available from <u><a href="http://pragmaticworks.com/" target="_blank">Pragmatic Works</a></u>.)<br /><br />The problem with the above methodology is that although it may work, <em>it is not efficient</em>.&nbsp; I am a firm believer in performance tuning SSIS packages attempting to shave off minutes, and often seconds.&nbsp; Consequently, just because we get a green check mark, that is NOT indicative of &ldquo;all is well&rdquo;. &nbsp;In the above data flow, every ods.Customer has to be looked up in the src.Customer table. &nbsp;<em>P-a-i-n-f-u-l!</em><br /><br />As with everything SSIS, there are multiple ways to get the same data flow accomplished, and I tip my hat to those of you who like to write C#.&nbsp; In my experience, C# seems to be able to complete a data flow task faster than many comparable SSIS components, but C# is not my go-to solution. &nbsp;An entry-level developer will probably be maintaining and enhancing the package, so I try to find an alternative. &nbsp;Keeping with OOP (object oriented programming) techniques, I tested two alternative options this week that I thought were worth a share.</font><ol><li><font color="#2a2a2a">MergeJoin</font></li><li><font color="#2a2a2a">Temp (stage) table</font></li></ol><br /><font color="#2a2a2a">Working with a dimension table containing 5 million rows (it was &ldquo;people&rdquo; and there really were that many people so there was no trimming down the dimension row count), MergeJoin took 3 minutes. 
&nbsp;The standard lookup took over 20 minutes.</font><br /><br /><strong><font color="#c2743b">Assumptions:</font></strong><ol><li><font color="#2a2a2a">Using a JOIN between tables located on two different servers is not considered a solution.&nbsp; Although a linked server in SQL Server would allow you to do this, I consider it bad form.</font></li><li><font color="#2a2a2a">The source system is not keeping a change data capture or log file of deleted rows.</font></li><li><font color="#2a2a2a">Truncating and reloading our data warehouse table is not an option.&nbsp; This is another ETL methodology that I see frequently, but personally consider bad form.</font></li></ol><br /><strong><font color="#c2743b">MergeJoin Solution:</font></strong><br /><font color="#2a2a2a">It isn&rsquo;t that complex.&nbsp; Add two OleDB source tasks to a data flow: one for your actual source that is deleting records (and in my opinion, behaving badly, but sometimes that just cannot be helped) and a second source component for the data warehouse.&nbsp; Use a MergeJoin to bring the two together, then use a conditional split to send (now) missing PKs to an upsert or stage destination.&nbsp; I like to use the <a href="http://help.pragmaticworks.com/taskfactory/Index.html?Upsert.html" target="_blank"><u>TF (Task Factory&rsquo;s) Upsert</u> </a>component as it is super easy to configure. 
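The set logic behind the MergeJoin plus conditional split is a classic anti-join. As a sketch of the same comparison in T-SQL (table and column names here are hypothetical, and this assumes the source keys have already been staged next to the ODS, which is exactly the cross-server JOIN the assumptions above rule out for the SSIS solution itself):

```sql
-- Anti-join sketch: flag ODS rows whose business key no longer exists in the source.
-- In the SSIS data flow this is the MergeJoin (left outer) feeding a conditional
-- split on a NULL source key. stg.Customer is a hypothetical staged copy of the
-- source business keys.
UPDATE ods
SET    ods.IsDeleted = 'Y'
FROM   ods.Customer AS ods
LEFT JOIN stg.Customer AS src
       ON src.CustomerBK = ods.CustomerBK
WHERE  src.CustomerBK IS NULL   -- no match: the source row was deleted
  AND  ods.IsDeleted <> 'Y';    -- skip rows already flagged
```

The `WHERE src.CustomerBK IS NULL` predicate is the T-SQL twin of the conditional split's ISNULL test on the source-side key.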
&nbsp;</font><br /><font color="#2a2a2a">A MergeJoin solution for marking data warehouse records as deleted in the source will look something like this:</font></div>  <div><div class="wsite-image wsite-image-border-none " style="padding-top:10px;padding-bottom:10px;margin-left:0;margin-right:0;text-align:center"> <a> <img src="https://www.delorabradish.com/uploads/5/3/4/3/53431729/delete-with-merge-join-dft_orig.png" alt="Picture" style="width:auto;max-width:100%" /> </a> <div style="display:block;font-size:90%"></div> </div></div>  <div class="paragraph"><font color="#2a2a2a">Key to performance success in the above data flow is sorting both the ods.Customer and src.Customer tables including setting the sort properties of the Advanced Editor. &nbsp; Right mouse click on the OleDB source task --&gt; Show Advanced Editor --&gt; &nbsp;Input and Output Properties tab --&gt; Ole DB Source Output --&gt; IsSorted property = True. &nbsp;On this same tab click on Ole DB Source Output --&gt; Output Columns --&gt; [Your Business Key Column Name]--&gt; SortKeyPosition = 1<br /><br />For those new to SSIS, the Merge Join and Conditional Split components are pictured below to fill in the blanks of the MergeJoin data flow.</font></div>  <div><div class="wsite-multicol"><div class="wsite-multicol-table-wrap" style="margin:0 -15px;"> 	<table class="wsite-multicol-table"> 		<tbody class="wsite-multicol-tbody"> 			<tr class="wsite-multicol-tr"> 				<td class="wsite-multicol-col" style="width:50%; padding:0 15px;"> 					 						  <div><div class="wsite-image wsite-image-border-none " style="padding-top:10px;padding-bottom:10px;margin-left:0;margin-right:0;text-align:center"> <a> <img src="https://www.delorabradish.com/uploads/5/3/4/3/53431729/published/merge-join-gui.png?1503536762" alt="Picture" style="width:385;max-width:100%" /> </a> <div style="display:block;font-size:90%"></div> </div></div>   					 				</td>				<td class="wsite-multicol-col" style="width:50%; padding:0 15px;"> 			
		 						  <div><div class="wsite-image wsite-image-border-none " style="padding-top:10px;padding-bottom:10px;margin-left:0;margin-right:0;text-align:center"> <a> <img src="https://www.delorabradish.com/uploads/5/3/4/3/53431729/published/conditional-split-task.png?1503536770" alt="Picture" style="width:370;max-width:100%" /> </a> <div style="display:block;font-size:90%"></div> </div></div>   					 				</td>			</tr> 		</tbody> 	</table> </div></div></div>  <div class="paragraph"><strong><font color="#c2743b">Temp Table Solution:</font></strong><br /><font color="#2a2a2a">I won&rsquo;t spend too much time here because this option was a bit slower than the MergeJoin, and requires a stage table in the same database as the ODS.&nbsp; The basic concept is to stage the unique BKs (business keys) of the source into your ODS database using a TRUNCATE and full reINSERT.&nbsp; Then perform the lookup between your ODS database and the newly loaded stage table.&nbsp; This does work, and avoids a lookup to a 2nd server or source, but it is not my personal preference.<br />&#8203;</font><br /><font color="#2a2a2a">A temp table solution (stg.Customer) will look something like this:</font></div>  <div><div class="wsite-image wsite-image-border-none " style="padding-top:10px;padding-bottom:10px;margin-left:0;margin-right:0;text-align:center"> <a> <img src="https://www.delorabradish.com/uploads/5/3/4/3/53431729/delete-with-temp-table-dft-1_orig.png" alt="Picture" style="width:auto;max-width:100%" /> </a> <div style="display:block;font-size:90%"></div> </div></div>  <div class="paragraph"><strong><font color="#c2743b">Conclusion: </font></strong><font color="#2a2a2a">We might not like it, but sometimes we must make a full comparison between our incrementally loaded data warehouse table and a source system to find business keys that no longer exist in the source.&nbsp; SSIS is built to handle this, but we still need to try several options to find the best performing solutions for our 
unique environments.&nbsp;<br /><br />This blog post just dealt&nbsp;with the ODS; deleted source records still have to be handled in the EDW. &nbsp;I loathe seeing IsDeleted columns in true data warehouses. &nbsp;</font><span style="color:rgb(42, 42, 42)">Is there a reporting requirement for deleted records? &nbsp;</span><font color="#2a2a2a">Are you going to put an index on that column? &nbsp;Will you have two sets of views, one with deleted records and one without? &nbsp;A much better way to handle deleted source records already persisted to an EDW is to create a DEL (deleted) schema and <u>move</u> the edw.Customer rows to del.Customer. &nbsp;It takes more ETL effort, but once developed, always done. &nbsp;ROI (return on investment) is not having to wade through deleted rows in the EDW. &nbsp;I feel another blog post coming ...</font></div>]]></content:encoded></item><item><title><![CDATA[Error Handling in SSIS, Redirecting Error Rows, Report and Pass]]></title><link><![CDATA[https://www.delorabradish.com/ssis/error-handling-in-ssis-redirecting-error-rows-report-and-pass]]></link><comments><![CDATA[https://www.delorabradish.com/ssis/error-handling-in-ssis-redirecting-error-rows-report-and-pass#comments]]></comments><pubDate>Wed, 12 Jul 2017 20:23:27 GMT</pubDate><category><![CDATA[Error Handling]]></category><category><![CDATA[SSIS]]></category><guid isPermaLink="false">https://www.delorabradish.com/ssis/error-handling-in-ssis-redirecting-error-rows-report-and-pass</guid><description><![CDATA[I classify SSIS errors into two groups:Report and pass -- non-fatal errors that should be reported but not stop the ETL process like (my opinion) truncation or data type mismatch from a text file.Report and fail -- fatal errors that should fail a package, like "cannot acquire connection from connection manager". &nbsp;(Honestly, no package is going anywhere after this one!)This blog post is about "Report and Pass" which I first talked about in Top 5 Best Practices for SSIS. 
&nbsp;There are hundr [...] ]]></description><content:encoded><![CDATA[<div class="paragraph"><font color="#2a2a2a">I classify SSIS errors into two groups:</font><ol><li><font color="#2a2a2a">Report and pass -- non-fatal errors that should be reported but not stop the ETL process, like (my opinion) truncation or a data type mismatch from a text file.</font></li><li><font color="#2a2a2a">Report and fail -- fatal errors that should fail a package, like "cannot acquire connection from connection manager". &nbsp;(Honestly, no package is going anywhere after this one!)</font></li></ol><br /><font color="#2a2a2a">This blog post is about "Report and Pass", which I first talked about in</font> <a href="http://www.delorabradish.com/ssis/best-practices-for-ssis" target="_blank">Top 5 Best Practices for SSIS</a>. &nbsp;<font color="#2a2a2a">There are hundreds of potential SSIS errors, and to start with, I STRONGLY recommend</font> <a href="http://pragmaticworks.com/Products/BI-xPress" target="_blank">BI xPress</a> <font color="#2a2a2a">from Pragmatic Works. &nbsp;SSIS package deployment and monitoring aside, however, what I'd like to talk to you about here is error handling inside a data flow so that we are not sacrificing the permanent, the completion of your master-master package, on the altar of the immediate, a dateTime data type that came through last night in a text file's integer column space.<br /><br />Last month I dusted off an old SQL 2008 SSIS package written about that same time. &nbsp;I was hoping that error handling in SSIS through VS 2015&nbsp;had some cool new task to "report and pass". &nbsp;Not finding anything, I was hoping that </font><a href="http://pragmaticworks.com/Products/Task-Factory" target="_blank">Task Factory</a>, <font color="#2a2a2a">a second "don't deploy without it" SSIS tool, had picked up the slack. &nbsp;Nothing doing, as some say. 
&nbsp;H</font><font color="#2a2a2a">ere then is a review of an old proven SSIS methodology for recording data of redirected errors in a SSIS data flow.&nbsp;<br />&#8203;</font><br /><strong><font color="#a85f2e">Process Overview</font></strong><br /><font color="#2a2a2a">First we need a SQL Server data table to hold our error rows. &nbsp;You might consider something like this:</font></div>  <div><div class="wsite-multicol"><div class="wsite-multicol-table-wrap" style="margin:0 -15px;"> 	<table class="wsite-multicol-table"> 		<tbody class="wsite-multicol-tbody"> 			<tr class="wsite-multicol-tr"> 				<td class="wsite-multicol-col" style="width:50%; padding:0 15px;"> 					 						  <div class="paragraph"><font size="2"><span style="color:rgb(194, 116, 59)">CREATE TABLE [etl].[SSISErrorHandlingLog](</span><br /><span style="color:rgb(194, 116, 59)">[RowID] [int] IDENTITY(1,1) NOT NULL,</span><br /><span style="color:rgb(194, 116, 59)">[SSISPackageId] [varchar](50) NOT NULL,</span><br /><span style="color:rgb(194, 116, 59)">[SSISTaskName] [varchar](50) NOT NULL,</span><br /><span style="color:rgb(194, 116, 59)">[SSISTableName] [varchar](50) NULL,</span><br /><span style="color:rgb(194, 116, 59)">[ErrorCode] [varchar](25) NULL,</span><br /><span style="color:rgb(194, 116, 59)">[ErrorColumnID] [varchar](10) NULL,</span><br /><span style="color:rgb(194, 116, 59)">[ErrorColumnName] [varchar](250) NULL,</span><br /><span style="color:rgb(194, 116, 59)">[ErrorDescription] [varchar](1200) NOT NULL,</span><br /><span style="color:rgb(194, 116, 59)">[RowData] [varchar](1500) NULL,</span><br /><span style="color:rgb(194, 116, 59)">[InsertedDateTime] [datetime] NOT NULL,</span></font></div>   					 				</td>				<td class="wsite-multicol-col" style="width:50%; padding:0 15px;"> 					 						  <div><div class="wsite-image wsite-image-border-none " style="padding-top:10px;padding-bottom:10px;margin-left:0;margin-right:0;text-align:center"> <a> <img 
src="https://www.delorabradish.com/uploads/5/3/4/3/53431729/published/ssis-error-handling-01_2.png?1499906088" alt="Picture" style="width:296;max-width:100%" /> </a> <div style="display:block;font-size:90%"></div> </div></div>   					 				</td>			</tr> 		</tbody> 	</table> </div></div></div>  <div class="paragraph"><font size="2"><span style="color:rgb(194, 116, 59)">CONSTRAINT [PK_SSISErrorHandlingLog] PRIMARY KEY CLUSTERED&nbsp;</span><br /><span style="color:rgb(194, 116, 59)">([RowID] ASC) ON [PRIMARY]) ON [PRIMARY]</span><br /><span style="color:rgb(194, 116, 59)">GO<br /></span><font color="#c2743b">ALTER TABLE [etl].[SSISErrorHandlingLog] ADD &nbsp;CONSTRAINT [DF_SSISErrorHandlingLog_InsertedDateTime] &nbsp;DEFAULT (getdate()) FOR [InsertedDateTime]<br />GO</font></font></div>  <div><div class="wsite-multicol"><div class="wsite-multicol-table-wrap" style="margin:0 -15px;"> 	<table class="wsite-multicol-table"> 		<tbody class="wsite-multicol-tbody"> 			<tr class="wsite-multicol-tr"> 				<td class="wsite-multicol-col" style="width:35.515151515152%; padding:0 15px;"> 					 						  <div><div class="wsite-image wsite-image-border-none " style="padding-top:0px;padding-bottom:0px;margin-left:0px;margin-right:0px;text-align:left"> <a> <img src="https://www.delorabradish.com/uploads/5/3/4/3/53431729/published/ssis-error-handling-02_4.png?1499906006" alt="Picture" style="width:255;max-width:100%" /> </a> <div style="display:block;font-size:90%"></div> </div></div>   					 				</td>				<td class="wsite-multicol-col" style="width:64.484848484848%; padding:0 15px;"> 					 						  <div class="paragraph">&#8203;<font color="#2a2a2a">We then need to add three or four tasks to each data flow: a union all (optional), a derived column, a script task (optional), and finally an OLE DB destination to our etl.SSISErrorHandlingLog table. 
&nbsp;All things completed, a data flow will look something like this (on the left):</font><br /><br /><font color="#2a2a2a">&#8203;The script component has two inputs: ErrorCode and ErrorColumn. &nbsp;It has one output: ErrorDescription. &nbsp;(You can also add additional C# code for ErrorColumnName.) &nbsp;Other than that, the script component simply exposes the error description.</font><br /><span style="color:rgb(153, 153, 153)">&nbsp; &nbsp;&nbsp;</span><font color="#c2743b" size="2">public override void Input0_ProcessInputRow(Input0Buffer Row)<br />&nbsp;&nbsp;&nbsp; {<br />Row.ErrorDescription = ComponentMetaData.GetErrorDescription(Row.ErrorCode);<br />&nbsp;&nbsp;&nbsp; }</font><br /><br /><font color="#2a2a2a">The DC (derived column) task has an expression for [RowData].</font><br /><font color="#c2743b" size="2">"My Table's PK=\"" + REPLACENULL((DT_WSTR,50)IntKey," ") + "\"" + " My User BK=\"" + REPLACENULL(MyStringColumn," ") + "\""</font></div>   					 				</td>			</tr> 		</tbody> 	</table> </div></div></div>  <div class="paragraph"><font color="#2a2a2a">Additional derived columns might be as shown below: (I did not do myself any favors by making my table's string data types VARCHAR() instead of NVARCHAR(), so please follow your own DB data type standards.)</font></div>  <div><div class="wsite-image wsite-image-border-none " style="padding-top:0px;padding-bottom:0px;margin-left:0px;margin-right:0px;text-align:center"> <a> <img src="https://www.delorabradish.com/uploads/5/3/4/3/53431729/ssis-error-handling-03_orig.png" alt="Picture" style="width:auto;max-width:100%" /> </a> <div style="display:block;font-size:90%"></div> </div></div>  <div class="paragraph"><font color="#2a2a2a">If you are redirecting errors from multiple places, like both your SRC and DST, use a Union All task before the script task, making sure to only include needed columns -- you may get an SSIS error when trying to union large string values, so only union the needed PKs, FKs and BKs consumed by the RowData 
expression.</font><br /><br /><strong><font color="#a85f2e">IMPORTANT: </font></strong><font color="#2a2a2a">&nbsp;All things in moderation! &nbsp;Here is an error handling example of what NOT to do.</font></div>  <div><div class="wsite-image wsite-image-border-none " style="padding-top:10px;padding-bottom:10px;margin-left:0px;margin-right:0px;text-align:left"> <a> <img src="https://www.delorabradish.com/uploads/5/3/4/3/53431729/editor/ssis-error-handling-04.png?1499905119" alt="Picture" style="width:355;max-width:100%" /> </a> <div style="display:block;font-size:90%"></div> </div></div>  <div class="paragraph"><strong><font color="#a85f2e">Where Things Go Wrong:</font></strong><ol><li><font color="#2a2a2a">Your error handling was NOT done in moderation.</font></li><li><font color="#2a2a2a">The RowData expression is incorrect. &nbsp;Work in small pieces, building the expression one small part at a time.</font></li><li><font color="#2a2a2a">The data types of the derived columns or script output do not match etl.SSISErrorHandlingLog. &nbsp;Either convert your SSIS output columns or change your destination data types.</font></li><li><font color="#2a2a2a">Your script component follows a DOT NET component and there is no error description to be had. &nbsp;The script task will "hang".</font><br /></li><li><font color="#2a2a2a">The binary code for the script is not found. &nbsp;I'm still trying to figure this out as it seems intermittent. &nbsp;Open the script component, edit the script, save the script, save the package, open and close the package. &nbsp;If you have a more technical solution, I'd love to hear from you.</font></li></ol><br /><strong><font color="#a85f2e">Conclusion:</font></strong><br /><font color="#2a2a2a">There is more than one way to trap and report SSIS errors, but in the end, we are all traditionally using a package event handler or redirecting errors in the data flow. 
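Once error rows are flowing into the log table, a small query makes the data actionable. A sketch against the etl.SSISErrorHandlingLog table defined earlier (the 24-hour window is my own assumption, not part of the original design):

```sql
-- Redirected-error rows from the last day, newest first.
-- Suitable as the dataset behind a daily error report or subscription.
SELECT SSISPackageId,
       SSISTaskName,
       SSISTableName,
       ErrorCode,
       ErrorColumnName,
       ErrorDescription,
       RowData,
       InsertedDateTime
FROM   etl.SSISErrorHandlingLog
WHERE  InsertedDateTime >= DATEADD(HOUR, -24, GETDATE())
ORDER BY InsertedDateTime DESC;
```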
&nbsp;I like to see the data that caused the problem, so I tend to go a little overboard with the information concatenated&nbsp;for the etl.SSISErrorHandlingLog.RowData column. &nbsp;I also hook up my error table to a SSRS report and SSRS subscription in an effort to actually act on the data collected, not just store it.</font></div>]]></content:encoded></item><item><title><![CDATA[Performance Testing OLE DB vs ADO.NET in SSIS]]></title><link><![CDATA[https://www.delorabradish.com/ssis/ole-db-vs-adonet-in-ssis]]></link><comments><![CDATA[https://www.delorabradish.com/ssis/ole-db-vs-adonet-in-ssis#comments]]></comments><pubDate>Wed, 12 Jul 2017 20:23:03 GMT</pubDate><category><![CDATA[SSIS]]></category><guid isPermaLink="false">https://www.delorabradish.com/ssis/ole-db-vs-adonet-in-ssis</guid><description><![CDATA[I always "knew" that ADO.NET was slower than OLE DB destination SSIS components, but I had never tested it. &nbsp;This month I was testing a 3rd party SSIS destination component that had been built on ADO.NET and, oh my (!!). &nbsp;The performance was substantially slower than what I could get done with a traditional OLE DB upsert process. &nbsp;However, before I told the client, "Abandon ship! &nbsp;Every SSIS package for itself!", I decided to try a few simple tests. &nbsp;Here are my results  [...] ]]></description><content:encoded><![CDATA[<div class="paragraph"><font color="#2a2a2a">I always "knew" that ADO.NET was slower than OLE DB destination SSIS components, but I had never tested it. &nbsp;This month I was testing a 3rd party SSIS destination component that had been built on ADO.NET and, oh my (!!). &nbsp;The performance was substantially slower than what I could get done with a traditional OLE DB upsert process. &nbsp;However, before I told the client, "Abandon ship! &nbsp;Every SSIS package for itself!", I decided to try a few simple tests. 
&nbsp;Here are my results of my second test, an upsert (insert new rows and update existing rows):</font><br />&#8203;<br /><strong><font color="#a85f2e" size="4">Environment</font></strong><ol><li><font color="#2a2a2a">(local) SQL Server 2016 v.13.0.1782.2</font></li><li><font color="#2a2a2a">(local) Visual Studio 2015 v.14.0.25431.01</font></li><li><font color="#2a2a2a">Source table contains 12,243,275 rows and was 31.1 MB on disc</font></li><li><font color="#2a2a2a">Source and destination tables had identical column structures</font><ul><li><font color="#2a2a2a">Tables contained 33 columns with total max row length of 181 bytes (information taken from sys.columns)</font></li><li><font color="#2a2a2a">There was a PK in both the dbo.SourceTable and dbo.DestinationTable.&nbsp; dbo.DestinationTableForUpdate contained no PK</font></li><li><font color="#2a2a2a">There was no FK, additional constraints or indexes on any table</font></li></ul></li><li><font color="#2a2a2a">dbo.DestinationTable was manually truncated (test #1) or half the rows were deleted (test #2) before each test and was not part of the test time results</font></li><li><font color="#2a2a2a">All tests were executed through Visual Studio 2015 unless noted otherwise</font></li></ol><br /><strong><font color="#a85f2e" size="4">Test: Insert and Bulk Update, or Merge()</font></strong><br /><strong><font color="#e0915c">Setup:</font></strong> <font color="#2a2a2a">6,121,637 rows were removed from dbo.DestinationTable prior to each test.&nbsp; Consequently, these tests inserted 6,121,637 rows and updated 6,121,638 existing rows.</font><br /><br /><strong><font color="#e0915c">Results:</font></strong></div>  <div><div class="wsite-image wsite-image-border-none " style="padding-top:0px;padding-bottom:0px;margin-left:0px;margin-right:0px;text-align:center"> <a> <img src="https://www.delorabradish.com/uploads/5/3/4/3/53431729/oledb-vs-ado-net-01_1_orig.png" alt="Picture" style="width:auto;max-width:100%" /> </a> 
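For reference, the MERGE() upsert pattern used for the T-SQL tests might be sketched as follows; the actual statement was not published, so the table and column names below are illustrative assumptions:

```sql
-- Hedged sketch of the upsert MERGE() pattern (no DELETE clause, no OUTPUT clause).
-- Table and column names are illustrative, not the actual test objects.
MERGE dbo.DestinationTable AS d
USING dbo.SourceTable AS s
    ON d.PKColumn = s.PKColumn
WHEN MATCHED THEN
    UPDATE SET d.Col1 = s.Col1,
               d.Col2 = s.Col2
WHEN NOT MATCHED BY TARGET THEN
    INSERT (PKColumn, Col1, Col2)
    VALUES (s.PKColumn, s.Col1, s.Col2);
```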
<div style="display:block;font-size:90%"></div> </div></div>  <div class="paragraph"><font color="#a1a1a1">1. &nbsp;This was a typical T-SQL MERGE() statement, executed through SSMS, with no DELETE and no output into a result table.<br />2. &nbsp;The same MERGE() statement from #1 was executed through an SSIS Execute SQL task.<br />3. An old-fashioned OLE DB upsert process: &nbsp;if the key was found, the row was inserted into DestinationTableForUpdate; if new, into DestinationTable. &nbsp;An Execute SQL task followed the data flow to UPDATE DestinationTable with values found in DestinationTableForUpdate.<br />4. &nbsp;Rinse and repeat #3 using ADO.NET components.<br />5. Third party upsert component which handles the insert and update in one beautiful simple ... easy ... low maintenance ...&nbsp;<em>any-entry-level-SSIS-developer-can-do-it</em>&nbsp;task. &nbsp;(I really wanted to use this SSIS add-in!)</font><br /><br /><strong style="color:rgb(153, 153, 153)"><font color="#a85f2e" size="4">Conclusion:<br /></font></strong><font color="#2a2a2a">OLE DB has better performance, and because the nifty 3rd party upsert task is built on ADO.NET, I won't be using it or recommending it for use in anything other than a small to mid-size company with small MB incremental loads. &nbsp;Bummer! &nbsp;The good news is that I now have a benchmark, and if Microsoft improves ADO.NET in future releases of SQL Server, I'll be pulling out my POC database and hoping for better results. &nbsp;(p.s. If you are wondering why I will not recommend sacrificing the permanent daily ETL load on the altar of immediate, faster ETL development, please see my <a href="http://www.delorabradish.com/modeling-for-bi/the-big-picture-what-is-in-the-center-of-your-bi-wheel" target="_blank">BI Wheel</a>.
&nbsp;ETL is not the center of the wheel.)</font><br /></div>]]></content:encoded></item><item><title><![CDATA[Top 5 Best Practices for SSIS Performance]]></title><link><![CDATA[https://www.delorabradish.com/ssis/top-5-best-practices-for-ssis-performance]]></link><comments><![CDATA[https://www.delorabradish.com/ssis/top-5-best-practices-for-ssis-performance#comments]]></comments><pubDate>Sun, 24 May 2015 17:33:08 GMT</pubDate><category><![CDATA[Best Practices]]></category><category><![CDATA[SSIS]]></category><guid isPermaLink="false">https://www.delorabradish.com/ssis/top-5-best-practices-for-ssis-performance</guid><description><![CDATA[ Microsoft has built in multiple data flow performance features that you can read about here (http://msdn.microsoft.com/en-us/library/ms141031.aspx). &nbsp;The following list is not all-inclusive, but the following best practices will help you to avoid the majority of common SSIS oversights and mistakes.1.	Give your SSIS process its own server. &nbsp;The resources needed for data integration, primary memory and lots of it, are different than for data storage. &nbsp;Granted, if your entire ETL pr [...] 
]]></description><content:encoded><![CDATA[<span class='imgPusher' style='float:right;height:729px'></span><span style='display: table;width:215px;position:relative;float:right;max-width:100%;;clear:right;margin-top:20px;*margin-top:40px'><a><img src="https://www.delorabradish.com/uploads/5/3/4/3/53431729/8195457.png?197" style="margin-top: 5px; margin-bottom: 10px; margin-left: 0px; margin-right: 10px; border-width:1px;padding:3px; max-width:100%" alt="Picture" class="galleryImageBorder wsite-image" /></a><span style="display: table-caption; caption-side: bottom; font-size: 90%; margin-top: -10px; margin-bottom: 10px; text-align: center;" class="wsite-caption"></span></span> <div class="paragraph" style="text-align:justify;display:block;">Microsoft has built in multiple data flow performance features that you can read about here (<font color="#24678d">http://msdn.microsoft.com/en-us/library/ms141031.aspx</font>). &nbsp;The following list is not all-inclusive, but the following best practices will help you to avoid the majority of common SSIS oversights and mistakes.<br /><br />1.<span style="">	</span><font color="#8d2424"><strong>Give your SSIS process its own server. </strong></font>&nbsp;The resources needed for data integration, primary memory and lots of it, are different than for data storage. &nbsp;Granted, if your entire ETL process runs in just a few hours during the night when no end users are connecting, the case can be made to share servers; however, more often, real-time requirements and on-demand transformation are reasons to give your ETL process dedicated hardware.<br /><br />2.<span style="">	</span><font color="#8d2424"><strong>Only update rows in your data warehouse that have been changed or deleted from your source system(s).</strong></font><br /><ul><li><span style="line-height: 1.5; background-color: initial;">Use SQL Server&rsquo;s change data capture abilities whenever possible if working with many updates and deletes in the source system. 
&nbsp;This works well in a one-source-to-one-destination transform, but can become quite complex in a multi-source-to-one-destination ETL package</span></li><li><span style="line-height: 1.5; background-color: initial;">Consider the use of hash values when your source system does not record a last update datetime, or when your transforms involve multi-source-to-one-destination transforms.</span></li></ul><br />3.<span style="">	</span><font color="#8d2424"><strong>Install and test for adequate RAM on the SSIS server. &nbsp;</strong></font>Sorting 2TB of data requires 2TB of RAM, and SSIS will start to write the data to disk when all available memory is taken. &nbsp;As part of your Test / QA methodology, you should use Performance Monitor and have your network team notify you whenever Buffers Spooled goes above zero or Avg. Disk sec/Transfer gets above 10 ms (0.010 seconds). &nbsp;Test and monitor the following PerfMon counters:&nbsp;<br /><ul><li><span style="line-height: 1.5; background-color: initial;">% Processor Time</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">Avg Disk sec/Transfer</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">Available Mbytes</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">Buffer memory</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">Buffers in use</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">Buffers spooled</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">Private buffer memory</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">Private buffers in use</span><br /></li></ul><br />4.<span style="">	</span><font color="#8d2424"><strong>Take note of your MaxConcurrentExecutables package property.</strong></font> &nbsp;&nbsp;This defines how many tasks can run simultaneously.
&nbsp;The default value is negative one (-1), which means the number of physical or logical processors plus two (2) is the number of control flow items that can be executed in parallel. &nbsp;It is generally recommended that you leave this setting at the default unless you are absolutely sure that parallelism is causing an issue.<br /><span style=""></span></div> <hr style="width:100%;clear:both;visibility:hidden;"></hr>  <span class='imgPusher' style='float:right;height:39px'></span><span style='display: table;width:209px;position:relative;float:right;max-width:100%;;clear:right;margin-top:20px;*margin-top:40px'><a><img src="https://www.delorabradish.com/uploads/5/3/4/3/53431729/7566427.png?191" style="margin-top: 5px; margin-bottom: 10px; margin-left: 0px; margin-right: 10px; border-width:1px;padding:3px; max-width:100%" alt="Picture" class="galleryImageBorder wsite-image" /></a><span style="display: table-caption; caption-side: bottom; font-size: 90%; margin-top: -10px; margin-bottom: 10px; text-align: center;" class="wsite-caption"></span></span> <div class="paragraph" style="text-align:justify;display:block;"><span style="">5.</span><span style="">	</span><font color="#8d2424"><strong>Adequately test and update data flow properties that impact performance.</strong></font><br /><ul style=""><li style=""><span style="">EngineThreads</span><br /></li><li style=""><span style="">DefaultBufferMaxRows</span><br /></li><li style=""><span style="">DefaultBufferSize</span><br /></li></ul><span style="">You can find a plethora of information on these package properties on the Internet, and so their definitions will not be reiterated here; however, the important thing is not to push a large ETL project into production with the SSIS defaults. &nbsp;Best practice is to develop small, mid-sized and large data transforms with the SSIS package defaults and then take the time to test changes to each of these properties based on volume.
&nbsp;Unfortunately, there is no exact science.</span></div> <hr style="width:100%;clear:both;visibility:hidden;"></hr>]]></content:encoded></item><item><title><![CDATA[Top 5 Best Practices for SSIS Design]]></title><link><![CDATA[https://www.delorabradish.com/ssis/best-practices-for-ssis]]></link><comments><![CDATA[https://www.delorabradish.com/ssis/best-practices-for-ssis#comments]]></comments><pubDate>Sun, 24 May 2015 05:50:40 GMT</pubDate><category><![CDATA[Best Practices]]></category><category><![CDATA[Design]]></category><guid isPermaLink="false">https://www.delorabradish.com/ssis/best-practices-for-ssis</guid><description><![CDATA[1.	A good SSIS package design will be repeatable. &nbsp;If you find yourself adding new tasks and data flow exceptions to your packages, you need to stop and reevaluate the original layout. &nbsp;One SSIS project will have several &ldquo;templates&rdquo; that are reused without a single design change for the life of the data warehouse. &nbsp;You should only have one or two template variations for each of these areas:Staging packageFact packagedimension packageSlowly changing dimension type 2 pac [...] ]]></description><content:encoded><![CDATA[<div class="paragraph" style="text-align:left;">1.<span style="">	</span><strong><font color="#8d2424">A good SSIS package design will be repeatable.</font> </strong>&nbsp;If you find yourself adding new tasks and data flow exceptions to your packages, you need to stop and reevaluate the original layout. &nbsp;One SSIS project will have several &ldquo;templates&rdquo; that are reused without a single design change for the life of the data warehouse. 
&nbsp;You should only have one or two template variations for each of these areas:<br /><ul><li><span style="line-height: 1.5; background-color: initial;">Staging package</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">Fact package</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">dimension package</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">Slowly changing dimension type 2 package</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">Control package</span><br /></li></ul>The key to manageable QA and the longevity of your ETL process is using a repeatable process. &nbsp;The same thing should happen over and over again as data is moved from staging to your IDS (Information Data Store) to your EDW (Enterprise Data Warehouse). &nbsp;Consider using t-SQL in USPs to work out complex business logic.<br /><br />2.<span style="">	</span><strong><font color="#8d2424">Plan for restartability. </font></strong>&nbsp;As of SQL 2014, SSIS checkpoint files still did not work with sequence containers. &nbsp;(The whole sequence container will restart including successfully completed tasks.) &nbsp;The solution is to build Restartability into your ABC framework.<br /><br />3.<span style="">	</span><strong><font color="#8d2424">Verify your ETL process. </font></strong>&nbsp;Just because your ETL finished without error &ndash; or you successfully handled your errors, doesn&rsquo;t necessarily mean the SUM() of SourceSystemNet equals the SUM() of SSAScubeNet. 
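A reconciliation check along these lines might be sketched as follows; all database, table, and column names here are illustrative assumptions, not objects from a real warehouse:

```sql
-- Hedged sketch of a SUM()/COUNT() verification between source and warehouse.
-- Object names (SourceSystem, EDW, Orders, FactOrders, NetAmount) are illustrative.
SELECT 'NetAmount' AS Measure,
       (SELECT SUM(NetAmount) FROM SourceSystem.dbo.Orders) AS SourceValue,
       (SELECT SUM(NetAmount) FROM EDW.dbo.FactOrders)      AS WarehouseValue
UNION ALL
SELECT 'RowCount',
       (SELECT COUNT(*) FROM SourceSystem.dbo.Orders),
       (SELECT COUNT(*) FROM EDW.dbo.FactOrders);
```

A mismatch between the two columns flags rows dropped by an INNER JOIN or WHERE clause somewhere in the ETL.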
&nbsp;Use your verification process this way.<br />a.<span style="">	</span>It should be the final step of your ETL / ELT process<br />b.<span style="">	</span>It should confirm that strategic SUM() and row COUNT() are accurate<br />c.<span style="">	</span>It should report on dropped rows discarded during the ETL from an INNER JOIN or WHERE clause<br />d.<span style="">	</span>It should automatically send emails of errors that have been allowed to &ldquo;report and pass&rdquo;.<br /><br />4.<span style="">	</span><font color="#8d2424"><strong>Collect Metadata!</strong> &nbsp;</font>Audit, balance and control (ABC) should be planned for and implemented from the very first day. &nbsp;You should store this ABC data in a separate SQL Server database, and at any point in time be able to determine the following:<br /><ul><li><span style="line-height: 1.5; background-color: initial;">What packages are currently running? &nbsp;(Your SQL Agent will kick off a master package but won&rsquo;t tell you what child packages / sequence containers are in process.)</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">When did a package last successfully execute?</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">How many records were selected vs. 
inserted, updated or deleted from any given task?</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">At what data flow process did a package fail, and where should it restart?</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">How long did each task take to run?</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">How long did each package take to run?</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">What tasks are taking the longest to execute?</span><br /></li></ul><br />5.<span style="">	</span><strong><font color="#8d2424">Trap for Errors</font></strong> both through On Error events and through precedence constraints. &nbsp;There are two types of errors to successfully handle in an ETL / ELT process:<br /><ul><li><span style="line-height: 1.5; background-color: initial;">Report and fail. &nbsp;Use Event Handlers and your package properties for this.</span><br /></li><li><span style="line-height: 1.5; background-color: initial;">Report and pass. &nbsp;Use your precedence constraints to allow for errors, but always keep row-level reporting so that someone can be notified and the problem can be researched and corrected.</span></li></ul>As with everything else, there must be a balance. &nbsp;It is not appropriate to have every single task with an Error Output. &nbsp;Choose wisely. &nbsp;It is often appropriate to &lsquo;report and fail&rsquo;, but when there are hundreds of packages and tables churning through your ETL system, you cannot have it breaking constantly. (This is often the case when source systems keep sending through data quality &ldquo;surprises&rdquo;.) &nbsp;Don&rsquo;t jeopardize the reliability of your data warehouse by not handling a source system that continues to behave badly. &nbsp;The DW is your responsibility.
&nbsp;The source system is often outside of your control.<br /><br /><font color="#8d2424">For a full list of MS BI Best practices, download the following file:</font><span style=""></span></div>  <div><div style="margin: 10px 0 0 -10px"> <a href="https://www.delorabradish.com/uploads/5/3/4/3/53431729/ms_bi_best_practices.docx"><img src="//www.weebly.com/weebly/images/file_icons/rtf.png" width="36" height="36" style="float: left; position: relative; left: 0px; top: 0px; margin: 0 15px 15px 0; border: 0;" /></a><div style="float: left; text-align: left; position: relative;"><table style="font-size: 12px; font-family: tahoma; line-height: .9;"><tr><td colspan="2"><b> ms_bi_best_practices.docx</b></td></tr><tr style="display: none;"><td>File Size:  </td><td>2137 kb</td></tr><tr style="display: none;"><td>File Type:  </td><td> docx</td></tr></table><a href="https://www.delorabradish.com/uploads/5/3/4/3/53431729/ms_bi_best_practices.docx" style="font-weight: bold;">Download File</a></div> </div>  <hr style="clear: both; width: 100%; visibility: hidden"></hr></div>]]></content:encoded></item></channel></rss>