<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Export CSV Task Archives | ZappySys Blog</title>
	<atom:link href="https://zappysys.com/blog/tag/export-csv-task/feed/" rel="self" type="application/rss+xml" />
	<link>https://zappysys.com/blog/tag/export-csv-task/</link>
	<description>SSIS / ODBC Drivers / API Connectors for JSON, XML, Azure, Amazon AWS, Salesforce, MongoDB and more</description>
	<lastBuildDate>Mon, 01 Jul 2024 20:47:45 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.4.4</generator>

<image>
	<url>https://zappysys.com/blog/wp-content/uploads/2023/01/cropped-zappysys-symbol-large-32x32.png</url>
	<title>Export CSV Task Archives | ZappySys Blog</title>
	<link>https://zappysys.com/blog/tag/export-csv-task/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Load 10M rows from SQL Server to Snowflake in 3 minutes</title>
		<link>https://zappysys.com/blog/load-10-million-rows-from-sql-server-to-snowflake/</link>
		
		<dc:creator><![CDATA[ZappySys]]></dc:creator>
		<pubDate>Tue, 17 Dec 2019 16:55:54 +0000</pubDate>
				<category><![CDATA[AWS (Amazon Web Services)]]></category>
		<category><![CDATA[Cloud Computing]]></category>
		<category><![CDATA[S3 (Simple Storage Service)]]></category>
		<category><![CDATA[SSIS CSV Export Task]]></category>
		<category><![CDATA[SSIS PowerPack]]></category>
		<category><![CDATA[SSIS Tasks]]></category>
		<category><![CDATA[CSV]]></category>
		<category><![CDATA[export]]></category>
		<category><![CDATA[Export CSV Task]]></category>
		<category><![CDATA[snowflake]]></category>
		<category><![CDATA[sql server]]></category>
		<category><![CDATA[zip]]></category>
		<guid isPermaLink="false">https://zappysys.com/blog/?p=8538</guid>

					<description><![CDATA[<p>Introduction In this article, we will show how to load 10 million rows from SQL Server to Snowflake in just 3 minutes. Snowflake is a cloud-based data warehousing platform. Essentially, it is data warehouse software exposed as a service. It allows integrating many data sources via internal [&#8230;]</p>
<p>The post <a href="https://zappysys.com/blog/load-10-million-rows-from-sql-server-to-snowflake/">Load 10M rows from SQL Server to Snowflake in 3 minutes</a> appeared first on <a href="https://zappysys.com/blog">ZappySys Blog</a>.</p>
]]></description>
										<content:encoded><![CDATA[<h2>Introduction</h2>
<p style="text-align: justify;"><img loading="lazy" decoding="async" class="size-thumbnail wp-image-8580 alignleft" src="https://zappysys.com/blog/wp-content/uploads/2019/12/sql-server-to-snowflake-150x150.png" alt="" width="150" height="150" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/sql-server-to-snowflake-150x150.png 150w, https://zappysys.com/blog/wp-content/uploads/2019/12/sql-server-to-snowflake-300x300.png 300w, https://zappysys.com/blog/wp-content/uploads/2019/12/sql-server-to-snowflake.png 400w" sizes="(max-width: 150px) 100vw, 150px" />In this article, we will show how to load 10 million rows from SQL Server to Snowflake in just 3 minutes.</p>
<p style="text-align: justify;">Snowflake is a cloud-based data warehousing platform. Essentially, it is data warehouse software exposed as a service. It allows you to integrate many data sources via Snowflake Partner apps and load them into the Snowflake storage engine. Another part of Snowflake is a computing engine that is responsible for serving your SQL queries. Both engines work independently, so users querying the data warehouse are not affected by a data load happening at the same time. Snowflake is an elastic service, which means you pay only for the resources you use; specifically, you pay only for each second of processing time.</p>
<p style="text-align: justify;">To show how data loading works in Snowflake, we will take 10 million rows from SQL Server and load them into Snowflake using SSIS and <a href="https://zappysys.com/products/ssis-powerpack/" target="_blank" rel="noopener">ZappySys SSIS PowerPack</a>. The following SSIS PowerPack connector will be used to achieve the task:</p>
<div class="content_block" id="custom_post_widget-8706"><div style="display: table-row; background: #f7f7f7;">
<div style="display: table-cell; padding: 1em; border: 1px solid #ccc;"><img loading="lazy" decoding="async" style="vertical-align: middle; width: 50px; height: 50px; max-width: 50px;" src="//zappysys.com/images/SSIS-PowerPack/ssis-export-csv-file-task.png" alt="Export CSV File Task" width="50" height="50" /></div>
<div style="display: table-cell; padding: 1em; border: 1px solid #ccc; border-left: none; width: 100%;"><a href="//zappysys.com/products/ssis-powerpack/ssis-export-csv-file-task/" target="_blank" rel="noopener">Export CSV File Task</a></div>
</div></div>
<div class="content_block" id="custom_post_widget-2523"><h2><span id="Prerequisites">Prerequisites</span></h2>
Before we perform the steps listed in this article, you will need to make sure the following prerequisites are met:
<ol style="margin-left: 1.5em;">
 	<li><abbr title="SQL Server Integration Services">SSIS</abbr> designer installed. Sometimes it is referred to as <abbr title="Business Intelligence Development Studio">BIDS</abbr> or <abbr title="SQL Server Data Tools">SSDT</abbr> (<a href="https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt" target="_blank" rel="noopener">download it from the Microsoft site</a>).</li>
 	<li>Basic knowledge of SSIS package development using <em>Microsoft SQL Server Integration Services</em>.</li>
 	<li>Make sure <span style="text-decoration: underline;"><a href="https://zappysys.com/products/ssis-powerpack/" target="_blank" rel="noopener">ZappySys SSIS PowerPack</a></span> is installed (<a href="https://zappysys.com/products/ssis-powerpack/download/" target="_blank" rel="noopener">download it</a>, if you haven't already).</li>
 	<li>(<em>Optional step</em>) <a href="https://zappysys.zendesk.com/hc/en-us/articles/360035974593" target="_blank" rel="noopener">Read this article</a> if you are planning to deploy packages to a server and schedule their execution later.</li>
</ol></div>
<h2>Step-by-step &#8211; How to load 10 million rows from SQL Server to Snowflake in 3 minutes</h2>
<h3>Getting started</h3>
<p style="text-align: justify;">To achieve the goal, we will use a slightly modified Northwind database (an example database from Microsoft) and SnowSQL &#8211; a command-line tool provided by Snowflake. In the first steps, you will need to install the Northwind database and SnowSQL. After that, we will create a table in Snowflake &#8211; the table we will load the data into &#8211; and a file format, which will be used to load data from the Snowflake staging area into the destination table. Once that&#8217;s complete, we will create an SSIS package, add and configure the connectors, and finally run it to get the results. Let&#8217;s proceed!</p>
<h3>Install a Northwind database</h3>
<p>Download and run the creation script of a modified <a href="https://zappysys.com/blog/wp-content/uploads/2019/12/Northwind.zip">Northwind</a> database. The only modification is an added &#8220;CustomersForSnowflake&#8221; view, which returns 10 million rows.</p>
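<p>The exact view definition ships with the download above, but conceptually a view like the following can multiply Northwind&#8217;s 91 customers into roughly 10 million rows (a hypothetical sketch, not the definition from the download):</p><pre class="crayon-plain-tag">-- Hypothetical sketch: cross join Customers with a number generator
-- (91 customers x ~110,000 copies = ~10M rows)
create view CustomersForSnowflake as
select row_number() over (order by (select null)) as ID,
       c.CustomerID, c.CompanyName, c.ContactName, c.ContactTitle,
       c.[Address], c.City, c.Region, c.PostalCode, c.Country, c.Phone, c.Fax
from Customers c
cross join (select top 110000 0 as n
            from sys.all_objects a cross join sys.all_objects b) t</pre>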
<h3>Install and configure SnowSQL command-line tool</h3>
<p>Download and install SnowSQL; installation instructions are available at <a href="https://docs.snowflake.net/manuals/user-guide/snowsql-install-config.html" target="_blank" rel="noopener">https://docs.snowflake.net/manuals/user-guide/snowsql-install-config.html</a>.</p>
<p>Once installed, you will need to configure the default Snowflake account name, user name, and password:</p>
<ol>
<li>Go to Windows File Explorer and enter this path:<br />
<code>%USERPROFILE%\.snowsql\</code><br />
Once entered, it resolves to a path similar to this one:<br />
<code>C:\Users\myUserName\.snowsql</code></li>
<li>Find a file named <strong>config</strong> and open it.</li>
<li>Then configure your credentials:</li>
</ol>
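<p>After these steps, the <strong>config</strong> file will contain a section similar to this sketch (the account name, user name, and password below are placeholders for your own values):</p><pre class="crayon-plain-tag"># %USERPROFILE%\.snowsql\config
[connections]
accountname = myaccount.us-east-1
username = myUserName
password = myPassword</pre>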
<p><img loading="lazy" decoding="async" class="alignnone size-full wp-image-8587" src="https://zappysys.com/blog/wp-content/uploads/2019/12/016-sql-server-to-snowflake-configure-snowsql-username.png" alt="" width="712" height="427" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/016-sql-server-to-snowflake-configure-snowsql-username.png 712w, https://zappysys.com/blog/wp-content/uploads/2019/12/016-sql-server-to-snowflake-configure-snowsql-username-300x180.png 300w" sizes="(max-width: 712px) 100vw, 712px" /><br />
<div class="su-note"  style="border-color:#e5de9d;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;"><div class="su-note-inner su-u-clearfix su-u-trim" style="background-color:#FFF8B7;border-color:#ffffff;color:#333333;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;"><strong>NOTE:</strong> When you deploy the package to a production server, instead of using your own user name, enter the username under which you run SSIS packages, e.g. SQL Server Agent&#8217;s account. Username and password can be specified <a href="https://docs.snowflake.net/manuals/user-guide/snowsql-start.html#connection-syntax" target="_blank" rel="noopener">as arguments</a> in a command-line when executing SnowSQL. A password can also be specified in <a href="https://docs.snowflake.net/manuals/user-guide/snowsql-start.html#specifying-passwords-when-connecting" target="_blank" rel="noopener">Environment variables</a>.</div></div>
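<p>For example, the account and user name can be passed as command-line arguments, with the password supplied via the <code>SNOWSQL_PWD</code> environment variable (the names below are placeholders):</p><pre class="crayon-plain-tag">set SNOWSQL_PWD=myPassword
snowsql -a myaccount.us-east-1 -u myUserName -q "SELECT CURRENT_VERSION()"</pre>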
<h3>Create a table in Snowflake</h3>
<p>Log in to your Snowflake account, open a Worksheet, and execute this query:</p><pre class="crayon-plain-tag">create or replace table Customers(
ID number,
CustomerID nchar(5),
CompanyName nvarchar(40),
ContactName nvarchar(30),
ContactTitle nvarchar(30),
Address nvarchar(60),
City nvarchar(15),
Region nvarchar(15),
PostalCode nvarchar(10),
Country nvarchar(15),
Phone nvarchar(24),
Fax nvarchar(24)
)</pre>
<p>We will take 10 million customer rows from the Northwind database and load them into this table.</p>
<h3>Create a file format in Snowflake</h3>
<p>Then, in the same Worksheet, create a file format for gzipped CSV files by executing this query:</p><pre class="crayon-plain-tag">create or replace file format GzipCsvFormat
type = csv
field_delimiter = ','
null_if = ('NULL', 'null')
empty_field_as_null = true
compression = gzip
field_optionally_enclosed_by = '"'</pre>
<p>We will use this file format when loading data from the Snowflake stage into the destination table. It matches the CSV format produced by the ZappySys Export CSV Task in SSIS.</p>
<h3>Create an SSIS package</h3>
<p>We are ready to create a new SSIS package and load some data into Snowflake. We will demonstrate two methods of loading data into Snowflake: in one, we stage data in Snowflake&#8217;s local storage; in the other, we stage data in an Amazon S3 bucket. Decide how you want to stage files, and then choose the appropriate workflow when creating a new package:</p>
<div id="attachment_8554" style="width: 640px" class="wp-caption alignnone"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-8554" class="wp-image-8554 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/12/001-sql-server-to-snowflake-create-ssis-package.png" alt="Loading 10 million rows from SQL Server to Snowflake" width="630" height="519" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/001-sql-server-to-snowflake-create-ssis-package.png 630w, https://zappysys.com/blog/wp-content/uploads/2019/12/001-sql-server-to-snowflake-create-ssis-package-300x247.png 300w" sizes="(max-width: 630px) 100vw, 630px" /><p id="caption-attachment-8554" class="wp-caption-text">Loading 10 million rows from SQL Server to Snowflake</p></div>
<h3>Add &amp; configure ZappySys Export CSV Task</h3>
<p>First, drag and drop the ZappySys Export CSV Task from the SSIS toolbox and follow these instructions to configure it:</p>
<h4>Configure source</h4>
<ol>
<li>Proceed with configuring the data source from which you will be exporting data. We will use the Northwind database as an example:
<div id="attachment_8555" style="width: 647px" class="wp-caption alignnone"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-8555" class="wp-image-8555 size-full" style="-webkit-user-drag: none; display: inline-block; margin-bottom: -1ex;" src="https://zappysys.com/blog/wp-content/uploads/2019/12/002-sql-server-to-snowflake-configure-ole-db-source-connection.png" alt="Export CSV Task: configuring OLE DB Connection to export data from SQL Server to Snowflake" width="637" height="292" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/002-sql-server-to-snowflake-configure-ole-db-source-connection.png 637w, https://zappysys.com/blog/wp-content/uploads/2019/12/002-sql-server-to-snowflake-configure-ole-db-source-connection-300x138.png 300w" sizes="(max-width: 637px) 100vw, 637px" /><p id="caption-attachment-8555" class="wp-caption-text">Export CSV Task: configuring OLE DB Connection to export data from SQL Server to Snowflake</p></div></li>
<li>Then select the connection and enter the query or table name you want to export data from:
<div id="attachment_8557" style="width: 647px" class="wp-caption alignnone"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-8557" class="wp-image-8557 size-full" style="-webkit-user-drag: none; display: inline-block; margin-bottom: -1ex;" src="https://zappysys.com/blog/wp-content/uploads/2019/12/003-sql-server-to-snowflake-configure-sql-query.png" alt="Export CSV Task: configuring source SQL query for data loading from SQL Server to Snowflake" width="637" height="593" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/003-sql-server-to-snowflake-configure-sql-query.png 637w, https://zappysys.com/blog/wp-content/uploads/2019/12/003-sql-server-to-snowflake-configure-sql-query-300x279.png 300w" sizes="(max-width: 637px) 100vw, 637px" /><p id="caption-attachment-8557" class="wp-caption-text">Export CSV Task: configuring source SQL query for data loading from SQL Server to Snowflake</p></div>
<p>We will be using this query:<br />
<code>select top 10000000 * from CustomersForSnowflake</code></p></li>
<li>Then, in the <em>Split Options</em> tab, split the exported CSV into multiple files, e.g. into 50 MB chunks:
<div id="attachment_8558" style="width: 508px" class="wp-caption alignnone"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-8558" class="wp-image-8558 size-full" style="-webkit-user-drag: none; display: inline-block; margin-bottom: -1ex;" src="https://zappysys.com/blog/wp-content/uploads/2019/12/007-sql-server-to-snowflake-split-rows.png" alt="Using data split options in the Export CSV Task" width="498" height="309" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/007-sql-server-to-snowflake-split-rows.png 498w, https://zappysys.com/blog/wp-content/uploads/2019/12/007-sql-server-to-snowflake-split-rows-300x186.png 300w, https://zappysys.com/blog/wp-content/uploads/2019/12/007-sql-server-to-snowflake-split-rows-436x272.png 436w" sizes="(max-width: 498px) 100vw, 498px" /><p id="caption-attachment-8558" class="wp-caption-text">Using data split options in the Export CSV Task</p></div>
<div class="su-note"  style="border-color:#e5de9d;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;"><div class="su-note-inner su-u-clearfix su-u-trim" style="background-color:#FFF8B7;border-color:#ffffff;color:#333333;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;"><strong>NOTE:</strong> Snowflake recommends having files that are 10-100 MB in size when <em><span style="text-decoration: underline;">compressed</span></em>. Since the exported files will be gzipped, it is safe to configure the uncompressed split size to 100 MB and above.</div></div></li>
</ol>
<h4>Configure target</h4>
<p>Depending on which staging approach you chose to use, set the export target to a local path or S3 bucket:</p>
<h5>Using local Snowflake storage</h5>
<p>Just set the appropriate <em>Save Mode</em> and a file path:</p>
<div id="attachment_8693" style="width: 568px" class="wp-caption alignnone"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-8693" class="wp-image-8693 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/12/006a-sql-server-to-snowflake-configure-local-target-path-to-export-csv-files-to-1.png" alt="Saving files locally to upload them to Snowflake local stage later" width="558" height="534" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/006a-sql-server-to-snowflake-configure-local-target-path-to-export-csv-files-to-1.png 558w, https://zappysys.com/blog/wp-content/uploads/2019/12/006a-sql-server-to-snowflake-configure-local-target-path-to-export-csv-files-to-1-300x287.png 300w" sizes="(max-width: 558px) 100vw, 558px" /><p id="caption-attachment-8693" class="wp-caption-text">Saving files locally to upload them to Snowflake local stage later</p></div>
<h5>Using S3 storage</h5>
<p>For storing staging files in S3, follow these steps:</p>
<ol>
<li>Set <em>Save Mode</em> to <strong>Save to Connection</strong> and select <strong>&lt;New ZS-AWS-STORAGE Connection&gt;</strong>:
<div id="attachment_8559" style="width: 559px" class="wp-caption alignnone"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-8559" class="wp-image-8559 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/12/005-sql-server-to-snowflake-configure-s3-target-connection.png" alt="Export CSV Task: configuring Amazon S3 endpoint as the target for data export" width="549" height="355" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/005-sql-server-to-snowflake-configure-s3-target-connection.png 549w, https://zappysys.com/blog/wp-content/uploads/2019/12/005-sql-server-to-snowflake-configure-s3-target-connection-300x194.png 300w" sizes="(max-width: 549px) 100vw, 549px" /><p id="caption-attachment-8559" class="wp-caption-text">Export CSV Task: configuring Amazon S3 endpoint as the target for data export</p></div></li>
<li>Then select <strong>S3</strong> as the <em>Storage Service</em>, fill in your <em>Access Key and Secret Key</em>, and optionally select your region:
<div id="attachment_8561" style="width: 576px" class="wp-caption alignnone"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-8561" class="wp-image-8561 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/12/008-sql-server-to-snowflake-configure-s3-target-connection-entering-access-and-secret-keys.png" alt="Configuring Amazon Connection Manager to store staging data in S3" width="566" height="553" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/008-sql-server-to-snowflake-configure-s3-target-connection-entering-access-and-secret-keys.png 566w, https://zappysys.com/blog/wp-content/uploads/2019/12/008-sql-server-to-snowflake-configure-s3-target-connection-entering-access-and-secret-keys-300x293.png 300w" sizes="(max-width: 566px) 100vw, 566px" /><p id="caption-attachment-8561" class="wp-caption-text">Configuring Amazon Connection Manager to store staging data in S3</p></div></li>
<li>Then, in the Export CSV Task window, on the <strong>Target</strong> tab, configure it similarly:
<div id="attachment_8698" style="width: 637px" class="wp-caption alignnone"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-8698" class="size-full wp-image-8698" src="https://zappysys.com/blog/wp-content/uploads/2019/12/006-sql-server-to-snowflake-configure-s3-target-path-to-export-csv-files-to-s3-2.png" alt="Export CSV Task: configuring target to store staging data in S3" width="627" height="593" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/006-sql-server-to-snowflake-configure-s3-target-path-to-export-csv-files-to-s3-2.png 627w, https://zappysys.com/blog/wp-content/uploads/2019/12/006-sql-server-to-snowflake-configure-s3-target-path-to-export-csv-files-to-s3-2-300x284.png 300w" sizes="(max-width: 627px) 100vw, 627px" /><p id="caption-attachment-8698" class="wp-caption-text">Export CSV Task: configuring target to store staging data in S3</p></div>
<p>Another option you may consider is to use the <a href="https://zappysys.com/blog/ssis-amazon-s3-storage-task-examples-download-upload-move-delete-files-folders/">Amazon Storage Task</a> or <a href="https://zappysys.com/products/ssis-powerpack/ssis-azure-blob-storage-task/">Azure Storage Task</a> to upload files to S3 or Azure Blob Storage in a separate step after the Export CSV Task. To do this, perform the following steps:</p>
<ol>
<li>On the Target tab, set <em>Save Mode</em> to <strong>Save to Path (Local Disk)</strong> instead of a connection to S3 or Azure.</li>
<li>Drag &amp; drop the <a href="https://zappysys.com/blog/ssis-amazon-s3-storage-task-examples-download-upload-move-delete-files-folders/">Amazon Storage Task</a> or <a href="https://zappysys.com/products/ssis-powerpack/ssis-azure-blob-storage-task/">Azure Storage Task</a> and connect it to the previous Export CSV Task.</li>
<li>Configure the Storage Task to upload the local files to S3 or Azure Blob Storage.</li>
<li>Continue to the next section for more instructions.</li>
</ol>
</li>
</ol>
<h3>Add Execute Process Task to create a staging area</h3>
<p>We are ready to add an Execute Process Task to create a staging area in Snowflake. Again, depending on where you will store staging files &#8211; Snowflake&#8217;s local storage or Amazon S3 &#8211; use one of the approaches below:</p>
<p><strong>Create Snowflake local staging</strong></p>
<div id="attachment_8562" style="width: 788px" class="wp-caption alignnone"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-8562" class="wp-image-8562 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/12/009-sql-server-to-snowflake-configure-snow-sql-to-create-a-stage.png" alt="Creating a local stage in Snowflake using SnowSQL command-line tool and SSIS" width="778" height="314" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/009-sql-server-to-snowflake-configure-snow-sql-to-create-a-stage.png 778w, https://zappysys.com/blog/wp-content/uploads/2019/12/009-sql-server-to-snowflake-configure-snow-sql-to-create-a-stage-300x121.png 300w, https://zappysys.com/blog/wp-content/uploads/2019/12/009-sql-server-to-snowflake-configure-snow-sql-to-create-a-stage-768x310.png 768w" sizes="(max-width: 778px) 100vw, 778px" /><p id="caption-attachment-8562" class="wp-caption-text">Creating a local stage in Snowflake using SnowSQL command-line tool and SSIS</p></div>
<p>File path:</p>
<p><code>C:\Program Files\Snowflake SnowSQL\snowsql.exe</code></p>
<p>Arguments:</p>
<p><code>-q "CREATE OR REPLACE STAGE CustomersStaging" -d DEMO_DB -s Public</code></p>
<div class="su-note"  style="border-color:#e5de9d;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;"><div class="su-note-inner su-u-clearfix su-u-trim" style="background-color:#FFF8B7;border-color:#ffffff;color:#333333;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;"><strong>NOTE:</strong> Replace <strong>DEMO_DB</strong> with the database name you are using. Likewise, replace <strong>Public</strong> with your schema name.</div></div>
<p><strong>Create Amazon S3 staging</strong></p>
<p><img loading="lazy" decoding="async" class="wp-image-8563 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/12/011-sql-server-to-snowflake-configure-snow-sql-to-create-a-stage-in-s3.png" alt="Creating an Amazon S3 stage in Snowflake using SnowSQL command-line tool and SSIS" width="737" height="314" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/011-sql-server-to-snowflake-configure-snow-sql-to-create-a-stage-in-s3.png 737w, https://zappysys.com/blog/wp-content/uploads/2019/12/011-sql-server-to-snowflake-configure-snow-sql-to-create-a-stage-in-s3-300x128.png 300w" sizes="(max-width: 737px) 100vw, 737px" /></p>
<p class="wp-caption-text">Creating an Amazon S3 stage in Snowflake using SnowSQL command-line tool and SSIS</p>
<p>File path:</p>
<p><code>C:\Program Files\Snowflake SnowSQL\snowsql.exe</code></p>
<p>Arguments:</p>
<p><code>-q "CREATE OR REPLACE STAGE CustomersStaging url='s3://your-bucket-name/destinationFolder/' credentials=(aws_key_id='AKIAXXXXXXXXXXXXXXXX' aws_secret_key='6p1ayaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaLXz88')"</code></p>
<h3>Add Execute Process Task to upload files to the staging area (local staging approach only)</h3>
<p>We still need to upload the CSV files that were saved locally for the local Snowflake staging approach. So add another Execute Process Task and configure it similarly:</p>
<p><img loading="lazy" decoding="async" class="wp-image-8564 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/12/010-sql-server-to-snowflake-configure-snow-sql-to-put-CSV-files-into-stage.png" alt="Uploading local CSV files to Snowflake stage using SnowSQL command-line tool and SSIS" width="726" height="314" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/010-sql-server-to-snowflake-configure-snow-sql-to-put-CSV-files-into-stage.png 726w, https://zappysys.com/blog/wp-content/uploads/2019/12/010-sql-server-to-snowflake-configure-snow-sql-to-put-CSV-files-into-stage-300x130.png 300w" sizes="(max-width: 726px) 100vw, 726px" /></p>
<p class="wp-caption-text">Uploading local CSV files to Snowflake stage using SnowSQL command-line tool and SSIS</p>
<p>File path:</p>
<p><code>C:\Program Files\Snowflake SnowSQL\snowsql.exe</code></p>
<p>Arguments:</p>
<p><code>-q "PUT file://e:/temp/*.csv @DEMO_DB.PUBLIC.CustomersStaging"</code></p>
<div class="su-note"  style="border-color:#e5de9d;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;"><div class="su-note-inner su-u-clearfix su-u-trim" style="background-color:#FFF8B7;border-color:#ffffff;color:#333333;border-radius:3px;-moz-border-radius:3px;-webkit-border-radius:3px;"><strong>NOTE:</strong> <strong>e:/temp/</strong> is the directory we used in the Export CSV Task. Replace <strong>DEMO_DB</strong> with the database name you are using. Likewise, replace <strong>Public</strong> with your schema name.</div></div>
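<p>Optionally, you can verify that the files landed in the stage by running Snowflake&#8217;s LIST command through SnowSQL (same file path as above, different arguments):</p><pre class="crayon-plain-tag">-q "LIST @DEMO_DB.PUBLIC.CustomersStaging"</pre>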
<h3>Add Execute Process Task to copy files from the Snowflake/Amazon S3 stage to Snowflake table</h3>
<p>Finally, add the final Execute Process Task to issue a command to load files from the stage into a real Snowflake table:</p>
<div id="attachment_8565" style="width: 736px" class="wp-caption alignnone"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-8565" class="wp-image-8565 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/12/012-sql-server-to-snowflake-configure-snow-sql-to-copy-data-from-stage-to-table.png" alt="Copying data from a staging area to a Snowflake table using SSIS" width="726" height="313" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/012-sql-server-to-snowflake-configure-snow-sql-to-copy-data-from-stage-to-table.png 726w, https://zappysys.com/blog/wp-content/uploads/2019/12/012-sql-server-to-snowflake-configure-snow-sql-to-copy-data-from-stage-to-table-300x129.png 300w" sizes="(max-width: 726px) 100vw, 726px" /><p id="caption-attachment-8565" class="wp-caption-text">Copying data from a staging area to a Snowflake table using SSIS</p></div>
<p>File path:</p>
<p><code>C:\Program Files\Snowflake SnowSQL\snowsql.exe</code></p>
<p>Arguments:</p>
<p><code>-q "COPY INTO CUSTOMERS FROM @CustomersStaging file_format = (format_name = 'GZIPCSVFORMAT')" -d DEMO_DB -s Public</code></p>
<h3>Execute the package</h3>
<p>We are ready to execute the package, so just run it. This example shows the workflow when using Amazon S3 as the staging area:</p>
<div id="attachment_8569" style="width: 359px" class="wp-caption alignnone"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-8569" class="wp-image-8569 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/12/013-sql-server-to-snowflake-execute-the-package.png" alt="Executing an SSIS package to load data from SQL Server to Snowflake" width="349" height="389" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/013-sql-server-to-snowflake-execute-the-package.png 349w, https://zappysys.com/blog/wp-content/uploads/2019/12/013-sql-server-to-snowflake-execute-the-package-269x300.png 269w" sizes="(max-width: 349px) 100vw, 349px" /><p id="caption-attachment-8569" class="wp-caption-text">Executing an SSIS package to load data from SQL Server to Snowflake</p></div>
<h3>The Results</h3>
<p>Once the package executes, we can check the results:</p>
<div id="attachment_8570" style="width: 363px" class="wp-caption alignnone"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-8570" class="wp-image-8570 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/12/014-sql-server-to-snowflake-the-results-ssis-package.png" alt="Successful execution of the SSIS package when loading data from SQL Server to Snowflake" width="353" height="390" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/014-sql-server-to-snowflake-the-results-ssis-package.png 353w, https://zappysys.com/blog/wp-content/uploads/2019/12/014-sql-server-to-snowflake-the-results-ssis-package-272x300.png 272w" sizes="(max-width: 353px) 100vw, 353px" /><p id="caption-attachment-8570" class="wp-caption-text">Successful execution of the SSIS package when loading data from SQL Server to Snowflake</p></div>
<p>If we go to Snowflake and execute a SELECT query, we see all 10 million customers loaded:</p>
<div id="attachment_8568" style="width: 728px" class="wp-caption alignnone"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-8568" class="wp-image-8568 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/12/013-sql-server-to-snowflake-the-results.png" alt="Successful loading of 10 million rows from SQL Server to Snowflake using SSIS and ZappySys SSIS PowerPack" width="718" height="396" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/013-sql-server-to-snowflake-the-results.png 718w, https://zappysys.com/blog/wp-content/uploads/2019/12/013-sql-server-to-snowflake-the-results-300x165.png 300w" sizes="(max-width: 718px) 100vw, 718px" /><p id="caption-attachment-8568" class="wp-caption-text">Successful loading of 10 million rows from SQL Server to Snowflake using SSIS and ZappySys SSIS PowerPack</p></div>
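<p>As a quick sanity check, a simple row count against the <code>Customers</code> table we created earlier should return 10,000,000:</p><pre class="crayon-plain-tag">select count(*) from Customers;</pre>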
<p>All done in less than 3 mins:</p>
<p><img loading="lazy" decoding="async" class="wp-image-8702 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/12/017-sql-server-to-snowflake-loads-data-in-less-than-3-mins-e1580146280905.png" alt="Loading 10 million rows from SQL Server to Snowflake using SSIS and ZappySys SSIS PowerPack just in 3 minutes" width="717" height="423" srcset="https://zappysys.com/blog/wp-content/uploads/2019/12/017-sql-server-to-snowflake-loads-data-in-less-than-3-mins-e1580146280905.png 717w, https://zappysys.com/blog/wp-content/uploads/2019/12/017-sql-server-to-snowflake-loads-data-in-less-than-3-mins-e1580146280905-300x177.png 300w" sizes="(max-width: 717px) 100vw, 717px" /></p>
<h2>Conclusion</h2>
<p style="text-align: justify;">We achieved exactly what we aimed for: we loaded 10 million rows from SQL Server to Snowflake in less than 3 minutes! Two approaches were considered: in one, we staged CSV files in the local Snowflake stage; in the other, we staged files in an Amazon S3 bucket. To accomplish this we used the <a href="https://zappysys.com/products/ssis-powerpack/ssis-export-csv-file-task/" target="_blank" rel="noopener">ZappySys Export CSV Task</a>, which allowed us to export data from the SQL Server view to CSV format, split the output into smaller files, zip them, and store them locally or upload them to the Amazon S3 bucket. Finally, we used the standard SSIS &#8220;Execute Process Task&#8221; to issue Snowflake commands: creating the stage in Snowflake, uploading CSVs into the local stage, and loading data into the Snowflake table.</p>
<h2>Download a sample package</h2>
<p><a href="https://zappysys.com/blog/wp-content/uploads/2019/12/Load-10M-rows-from-SQL-Server-into-Snowflake-1.zip">Load 10M rows from SQL Server into Snowflake.dtsx.zip</a></p>
<h2>References</h2>
<p><a href="https://docs.snowflake.net/manuals/user-guide/data-load-bulk.html" target="_blank" rel="noopener">https://docs.snowflake.net/manuals/user-guide/data-load-bulk.html</a></p>
<p><a href="https://docs.snowflake.net/manuals/user-guide/snowsql.html" target="_blank" rel="noopener">https://docs.snowflake.net/manuals/user-guide/snowsql.html</a></p>
<p><a href="https://docs.snowflake.net/manuals/user-guide/data-load-s3-create-stage.html" target="_blank" rel="noopener">https://docs.snowflake.net/manuals/user-guide/data-load-s3-create-stage.html</a></p>
<p><a href="https://docs.snowflake.net/manuals/sql-reference/sql/create-file-format.html" target="_blank" rel="noopener">https://docs.snowflake.net/manuals/sql-reference/sql/create-file-format.html</a></p>
<p><a href="https://docs.snowflake.net/manuals/sql-reference/sql/put.html" target="_blank" rel="noopener">https://docs.snowflake.net/manuals/sql-reference/sql/put.html</a></p>
<p><a href="https://docs.snowflake.net/manuals/sql-reference/sql/copy-into-table.html" target="_blank" rel="noopener">https://docs.snowflake.net/manuals/sql-reference/sql/copy-into-table.html</a></p>
<p>The post <a href="https://zappysys.com/blog/load-10-million-rows-from-sql-server-to-snowflake/">Load 10M rows from SQL Server to Snowflake in 3 minutes</a> appeared first on <a href="https://zappysys.com/blog">ZappySys Blog</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Loading data from SQL Server to Amazon S3 in SSIS (Split Files, GZip)</title>
		<link>https://zappysys.com/blog/load-data-sql-server-to-amazon-s3/</link>
		
		<dc:creator><![CDATA[ZappySys]]></dc:creator>
		<pubDate>Wed, 27 Jul 2016 22:07:34 +0000</pubDate>
				<category><![CDATA[SSIS Amazon S3 CSV Dest]]></category>
		<category><![CDATA[SSIS Amazon Storage Task]]></category>
		<category><![CDATA[SSIS CSV Export Task]]></category>
		<category><![CDATA[Amazon S3]]></category>
		<category><![CDATA[Amazon S3 Task]]></category>
		<category><![CDATA[aws]]></category>
		<category><![CDATA[CSV]]></category>
		<category><![CDATA[export]]></category>
		<category><![CDATA[Export CSV Task]]></category>
		<category><![CDATA[sql server]]></category>
		<guid isPermaLink="false">http://zappysys.com/blog/?p=704</guid>

					<description><![CDATA[<p>Introduction In this blog post you will see how easy it is to load large amount of data from SQL Server to Amazon S3 Storage. For demo purpose we will use SQL Server as relational source but you can use same steps for any database engine such as Oracle, MySQL, DB2. In this post we [&#8230;]</p>
<p>The post <a href="https://zappysys.com/blog/load-data-sql-server-to-amazon-s3/">Loading data from SQL Server to Amazon S3 in SSIS (Split Files, GZip)</a> appeared first on <a href="https://zappysys.com/blog">ZappySys Blog</a>.</p>
]]></description>
										<content:encoded><![CDATA[<h2>Introduction</h2>
<p>In this blog post you will see how easy it is to load a large amount of data from <em>SQL Server to Amazon S3</em> Storage. For demo purposes we will use SQL Server as the relational source, but you can use the same steps for any database engine such as Oracle, MySQL, or DB2. In this post we will use <a href="//zappysys.com/products/ssis-powerpack/ssis-export-csv-file-task/" target="_blank" rel="noopener">Export CSV Task</a> and <a href="//zappysys.com/products/ssis-powerpack/ssis-amazon-s3-task/" target="_blank" rel="noopener">Amazon S3 Storage Task</a> to achieve the desired integration with Amazon S3 using a drag-and-drop approach. You can also export JSON or XML data to Amazon S3 using the same techniques (use <a href="//zappysys.com/products/ssis-powerpack/ssis-export-json-file-task/" target="_blank" rel="noopener">Export JSON Task</a> or <a href="//zappysys.com/products/ssis-powerpack/ssis-export-xml-file-task/" target="_blank" rel="noopener">Export XML Task</a>).</p>
<p>Our goal is to achieve the following:</p>
<ul>
<li>Extract a large amount of data from a SQL Server table or query and export it to CSV files</li>
<li>Generate CSV files in compressed format (*.gz) to speed up the upload and reduce data transfer costs to S3</li>
<li>Split CSV files by row count</li>
<li>Upload data to Amazon S3 in a highly parallel manner for maximum speed</li>
</ul>
<p>There are three different ways to export data to Amazon S3 using SSIS.</p>
<ol>
<li><strong>Method-1 (Fastest)</strong>: Use a two-step process (first export SQL Server data to local files using <a href="//zappysys.com/products/ssis-powerpack/ssis-export-csv-file-task/" target="_blank" rel="noopener">Export Task</a>, then upload the files to S3 using <a href="//zappysys.com/products/ssis-powerpack/ssis-amazon-s3-task/" target="_blank" rel="noopener">Amazon S3 Storage Task</a>)</li>
<li><strong>Method-2 (Slower)</strong>: Use <a href="//zappysys.com/products/ssis-powerpack/ssis-export-csv-file-task/" target="_blank" rel="noopener">Export Task</a> with an Amazon S3 connection as the target rather than saving to local files.</li>
<li><strong>Method-3 (Slower)</strong>: Use data flow components like <a href="https://zappysys.com/products/ssis-powerpack/ssis-amazon-s3-csv-file-destination/" target="_blank" rel="noopener">Amazon S3 Destination for CSV</a> (for JSON / XML use Method-1 or Method-2)</li>
</ol>
<p>Each method has its own advantages and disadvantages. If you need to upload / compress / split a large amount of data, we recommend Method-1 (two steps). If your dataset is not very large, you can use Method-2 or Method-3. For the last method only the CSV export option is available (we don&#8217;t have a JSON / XML destination for Amazon S3 yet &#8211; we may add one in the future).</p>
<p><strong>Screenshot of SSIS Package</strong></p>
<div id="attachment_707" style="width: 710px" class="wp-caption alignnone"><a href="//zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-server-data-upload-to-amazon-s3.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-707" class="wp-image-707" src="//zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-server-data-upload-to-amazon-s3.png" alt="Extract SQL Server Data to CSV files in SSIS (Bulk export) and Split / GZip Compress / upload files to Amazon S3 (AWS Cloud)" width="700" height="365" srcset="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-server-data-upload-to-amazon-s3.png 825w, https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-server-data-upload-to-amazon-s3-300x156.png 300w" sizes="(max-width: 700px) 100vw, 700px" /></a><p id="caption-attachment-707" class="wp-caption-text">Extract SQL Server Data to CSV files in SSIS (Bulk export) and Split / GZip Compress / upload files to Amazon S3 (AWS Cloud)</p></div>
<h2>Method-1 : Upload SQL data to Amazon S3 in Two steps</h2>
<p>In this section we will look at the first method (recommended) to upload SQL data to Amazon S3. This is the fastest approach if you have lots of data to upload. In this approach we first create CSV files from SQL Server data on the local disk using <a href="https://zappysys.com/products/ssis-powerpack/ssis-export-csv-file-task/" target="_blank" rel="noopener">SSIS Export CSV Task</a>. In the second step we upload all the files to Amazon S3 using <a href="https://zappysys.com/products/ssis-powerpack/ssis-amazon-s3-task/" target="_blank" rel="noopener">SSIS Amazon Storage Task</a>.</p>
<h3>Step-1: Configure Source Connection in Export CSV Task</h3>
<p>To extract data from SQL Server you can use the Export CSV Task. It has many options which make it possible to split a large amount of data into multiple files. You can specify a single table or multiple tables as your data source.</p>
<p>For multiple tables, separate the names with a vertical bar, e.g. dbo.Customers|dbo.Products|dbo.Orders. When you export this it will create 3 files (dbo.Customers.csv, dbo.Products.csv, dbo.Orders.csv).</p>
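<p>The vertical-bar convention can be illustrated with a tiny sketch (the actual export is done by the task; this only shows how one file name is derived per table):</p>

```python
# One output CSV per table in the vertical-bar separated source list.
tables = "dbo.Customers|dbo.Products|dbo.Orders"
output_files = [name + ".csv" for name in tables.split("|")]
print(output_files)
# ['dbo.Customers.csv', 'dbo.Products.csv', 'dbo.Orders.csv']
```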
<p><strong>Steps:</strong></p>
<ol>
<li>Drag ZS Export CSV Task from Toolbox</li>
<li>Double click task to configure</li>
<li>From connection drop down select New connection option (OLEDB or ADO.net)</li>
<li>Once the connection to the source database is configured, specify the SQL query to extract data as shown below
<div id="attachment_705" style="width: 528px" class="wp-caption alignnone"><a href="//zappysys.com/blog/wp-content/uploads/2016/07/export-sql-server-table-query-data-to-csv-fast.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-705" class="size-full wp-image-705" src="//zappysys.com/blog/wp-content/uploads/2016/07/export-sql-server-table-query-data-to-csv-fast.png" alt="Export SQL Server Table or Query as CSV file (Bulk export in SSIS)" width="518" height="494" srcset="https://zappysys.com/blog/wp-content/uploads/2016/07/export-sql-server-table-query-data-to-csv-fast.png 518w, https://zappysys.com/blog/wp-content/uploads/2016/07/export-sql-server-table-query-data-to-csv-fast-300x286.png 300w" sizes="(max-width: 518px) 100vw, 518px" /></a><p id="caption-attachment-705" class="wp-caption-text">Export SQL Server Table or Query as CSV file (Bulk export in SSIS)</p></div></li>
<li>Now go to the Target tab. Here you can specify the full path for the file, e.g. c:\ssis\temp\s3dump\cust.csv</li>
</ol>
<h3>Step-2: Compress CSV Files in SSIS ( GZIP format &#8211; *.gz )</h3>
<p>The above steps export the file in CSV format without splitting or compression. To compress the file once it is exported, go to the Target tab of the Export CSV Task and check the [<strong>Compress file to *.gz format</strong>] option.</p>
<div id="attachment_706" style="width: 579px" class="wp-caption alignnone"><a href="//zappysys.com/blog/wp-content/uploads/2016/07/compress-csv-files-in-ssis.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-706" class="size-full wp-image-706" src="//zappysys.com/blog/wp-content/uploads/2016/07/compress-csv-files-in-ssis.png" alt="Compress exported SQL Server data files to GZip ( *.gz) in SSIS Export CSV Task" width="569" height="462" srcset="https://zappysys.com/blog/wp-content/uploads/2016/07/compress-csv-files-in-ssis.png 569w, https://zappysys.com/blog/wp-content/uploads/2016/07/compress-csv-files-in-ssis-300x244.png 300w" sizes="(max-width: 569px) 100vw, 569px" /></a><p id="caption-attachment-706" class="wp-caption-text">Compress exported SQL Server data files to GZip ( *.gz) in SSIS Export CSV Task</p></div>
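<p>The compression option is equivalent to gzipping the exported file after the export. Here is a minimal Python sketch of the same transformation, using a temporary directory in place of the real export path (paths and contents are illustrative only):</p>

```python
import gzip
import os
import shutil
import tempfile

# Write a tiny sample CSV, then compress it to *.gz -- the same
# transformation the [Compress file to *.gz format] option applies.
workdir = tempfile.mkdtemp()
csv_path = os.path.join(workdir, "cust.csv")
gz_path = csv_path + ".gz"

with open(csv_path, "w", newline="") as f:
    f.write("CustomerID,Name\n1,Alice\n")

with open(csv_path, "rb") as src, gzip.open(gz_path, "wb") as dst:
    shutil.copyfileobj(src, dst)

# The compressed file round-trips to the original content
with gzip.open(gz_path, "rt") as f:
    print(f.read() == "CustomerID,Name\n1,Alice\n")  # True
```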
<h3>Step-3: Split CSV files by row count or data size in SSIS</h3>
<p>Now let&#8217;s look at how to split the exported CSV files into multiple files so we can upload many files in parallel. Go to the Split Options tab and check [<strong>Enable Split by Size/Rows</strong>].</p>
<div id="attachment_708" style="width: 435px" class="wp-caption alignnone"><a href="//zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-split-csv-files-sql-data.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-708" class="size-full wp-image-708" src="//zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-split-csv-files-sql-data.png" alt="Using SSIS Split Exported CSV files (Split by row count or size)" width="425" height="489" srcset="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-split-csv-files-sql-data.png 425w, https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-split-csv-files-sql-data-261x300.png 261w" sizes="(max-width: 425px) 100vw, 425px" /></a><p id="caption-attachment-708" class="wp-caption-text">Using SSIS Split Exported CSV files (Split by row count or size)</p></div>
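<p>Splitting by row count behaves like the following sketch: every chunk gets the header row repeated, and the last chunk holds the remainder. The function name and sizes here are illustrative, not part of the task itself:</p>

```python
import itertools

# Split data rows into chunks of at most rows_per_file rows,
# repeating the header in every chunk -- the effect of the
# [Enable Split by Size/Rows] option in row-count mode.
def split_rows(header, rows, rows_per_file):
    chunks = []
    it = iter(rows)
    while True:
        batch = list(itertools.islice(it, rows_per_file))
        if not batch:
            break
        chunks.append([header] + batch)
    return chunks

header = ["OrderID", "Amount"]
rows = [[i, i * 10] for i in range(1, 8)]  # 7 data rows
parts = split_rows(header, rows, 3)
print(len(parts))  # 3 chunks: 3 + 3 + 1 data rows
```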
<h3>Step-4: Upload CSV files to Amazon S3 &#8211; Using multi threaded option</h3>
<p>The final step is to use the <a href="//zappysys.com/products/ssis-powerpack/ssis-amazon-s3-task/" target="_blank" rel="noopener">Amazon S3 Storage Task</a> to upload the files to S3.</p>
<h3>Things to remember</h3>
<p>Sometimes, due to high network activity, you may get timeout errors during upload. In that case you can adjust a few settings <a href="https://zappysys.com/forums/topic/change-timeout-value-amazon-s3-operations/" target="_blank" rel="noopener">described here</a>. Also try reducing the total number of parallel threads on the S3 connection to see if that helps.</p>
<p><strong>Steps:</strong></p>
<ol>
<li>Drag the ZS Amazon Storage Task from the SSIS toolbox</li>
<li>Double click the Amazon Storage Task to configure it</li>
<li>Specify Action = UploadFilesToAmazon</li>
<li>Specify the source file path (or pattern), e.g. c:\SSIS\temp\s3dump\*.*</li>
<li>In the Target connection dropdown click [New]</li>
<li>When the connection UI opens, select Service Type = S3</li>
<li>Enter your Access Key, Secret Key and Region (leave all other parameters at their defaults if you are not sure)</li>
<li>Click Test and close the connection UI</li>
<li>In the target path on the S3 Storage Task, enter the bucket and folder path where you want to upload the local files. For example, if your bucket name is bw-east-1 and the folder is sqldata, enter:<br />
<strong>bw-east-1/sqldata/</strong></li>
<li>Click OK and run the package to test it end to end</li>
</ol>
<div id="attachment_709" style="width: 709px" class="wp-caption alignnone"><a href="//zappysys.com/blog/wp-content/uploads/2016/07/ssis-uploading-files-to-amazon-s3.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-709" class="wp-image-709" src="//zappysys.com/blog/wp-content/uploads/2016/07/ssis-uploading-files-to-amazon-s3.png" alt="Upload local files to Amazon S3 using SSIS AWS Storage Task" width="699" height="466" srcset="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-uploading-files-to-amazon-s3.png 953w, https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-uploading-files-to-amazon-s3-300x200.png 300w, https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-uploading-files-to-amazon-s3-272x182.png 272w" sizes="(max-width: 699px) 100vw, 699px" /></a><p id="caption-attachment-709" class="wp-caption-text">Upload local files to Amazon S3 using SSIS AWS Storage Task</p></div>
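<p>The multi-threaded upload the Amazon Storage Task performs can be pictured with a simple thread pool. In this sketch <code>upload_one</code> is a stand-in stub, not a real S3 call; a real implementation would issue the S3 transfer here, with retries to handle the timeout errors mentioned above:</p>

```python
import concurrent.futures

# Placeholder for the real S3 transfer performed by the task.
def upload_one(path):
    return "uploaded " + path

# File names mirror the split/compressed output of the Export CSV Task.
files = ["c:/SSIS/temp/s3dump/part%d.csv.gz" % i for i in range(1, 6)]

# Upload several files concurrently; fewer workers can ease timeouts.
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(upload_one, files))

print(len(results))  # 5
```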
<h2>Method-2 : Upload SQL data to Amazon S3 without local stage (One step)</h2>
<p>Now let&#8217;s change the previous approach a little to send SQL Server data directly to Amazon S3 without any landing area on the local disk. <a href="//zappysys.com/products/ssis-powerpack/ssis-export-csv-file-task/" target="_blank" rel="noopener">Export CSV Task</a>, <a href="//zappysys.com/products/ssis-powerpack/ssis-export-json-file-task/" target="_blank" rel="noopener">Export JSON Task</a> and <a href="//zappysys.com/products/ssis-powerpack/ssis-export-xml-file-task/" target="_blank" rel="noopener">Export XML Task</a> all support Amazon S3 / Azure Blob and Secure FTP (SFTP) connections as the target (only available in <strong>Pro Edition</strong>). We will use this feature in the following section.</p>
<p>This approach avoids the need for local disk space, and it may be useful for security reasons for some users. However, the drawback of this approach is that it won&#8217;t use parallel threads to upload a large amount of data like the previous method.</p>
<p>The following change is needed on the Export Task to upload SQL data directly to S3, FTP, or Azure storage.</p>
<div id="attachment_5252" style="width: 859px" class="wp-caption alignnone"><a href="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-data-to-s3-csv-compress-gzip.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-5252" class="size-full wp-image-5252" src="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-data-to-s3-csv-compress-gzip.png" alt="Export SQL data to multiple files to Amazon S3, Azure, Secure FTP (SFTP) in Stream Mode. Compress GZip, Overwrite, Split Options" width="849" height="627" srcset="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-data-to-s3-csv-compress-gzip.png 849w, https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-data-to-s3-csv-compress-gzip-300x222.png 300w, https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-data-to-s3-csv-compress-gzip-768x567.png 768w" sizes="(max-width: 849px) 100vw, 849px" /></a><p id="caption-attachment-5252" class="wp-caption-text">Export SQL data to multiple files to Amazon S3, Azure, Secure FTP (SFTP) in Stream Mode using SSIS. Configure Compress GZip, Overwrite, Split Options</p></div>
<h2>Method-3 : Using Amazon S3 destination &#8211; Generate Amazon S3 file from any source</h2>
<p>Now let&#8217;s look at a third approach to save data from any SSIS source to an Amazon S3 file. The advantage of this approach is that you are not limited to the few source options provided by the Export CSV Task. If you need complex data transformations in the Data Flow before sending data to S3, use this approach. We will use <a href="https://zappysys.com/products/ssis-powerpack/ssis-amazon-s3-csv-file-destination/" target="_blank" rel="noopener">Amazon S3 Destination for CSV</a> as below</p>
<ol>
<li>Drag an SSIS Data Flow Task from the toolbox</li>
<li>Create the necessary source connection (e.g. an OLEDB connection)</li>
<li>Create an Amazon S3 connection (right click in the Connection Managers panel at the bottom, click New Connection and select the <strong>ZS-AMAZON-STORAGE</strong> type)</li>
<li>Once the connection managers are created, go to the data flow designer and drag an OLEDB Source</li>
<li>Configure the OLEDB Source to read the desired data from the source system (e.g. SQL Server / Oracle)</li>
<li>Once the source is configured, drag <a href="https://zappysys.com/products/ssis-powerpack/ssis-amazon-s3-csv-file-destination/" target="_blank" rel="noopener">ZS Amazon S3 CSV File Destination</a> from the SSIS toolbox</li>
<li>Double click the S3 Destination and configure it as below
<ol>
<li>On the Connection Managers tab select the S3 connection (created in the earlier section).</li>
<li>On the Properties tab configure the options as in the screenshot below</li>
<li>On the Input Columns tab select the columns you would like to write to the target file. Upstream column names are used as-is for the target file, so make sure to name upstream columns correctly.</li>
<li>Click OK to save the UI</li>
</ol>
</li>
<li>Execute the package and check your S3 bucket to verify that the files were created.</li>
</ol>
<div id="attachment_5253" style="width: 729px" class="wp-caption alignnone"><a href="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-amazon-s3-csv-destination-split-compress-gzip-options.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-5253" class="size-full wp-image-5253" src="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-amazon-s3-csv-destination-split-compress-gzip-options.png" alt="Loading SQL Server data into S3 Bucket Files (Split, Compress Gzip Options) - SSIS Amazon S3 CSV File Destination" width="719" height="782" srcset="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-amazon-s3-csv-destination-split-compress-gzip-options.png 719w, https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-amazon-s3-csv-destination-split-compress-gzip-options-276x300.png 276w" sizes="(max-width: 719px) 100vw, 719px" /></a><p id="caption-attachment-5253" class="wp-caption-text">Loading SQL Server data into S3 Bucket Files (Split, Compress Gzip Options) &#8211; SSIS Amazon S3 CSV File Destination</p></div>
<h2>Conclusion</h2>
<p>In this post you have seen how easy it is to upload / archive your SQL Server data (or any other RDBMS data) to Amazon S3 Storage in a few clicks. <a href="//zappysys.com/products/ssis-powerpack/">Try SSIS PowerPack</a> for free and find out for yourself how easy it is to integrate SQL Server and Amazon S3 using SSIS.</p>
<p>The post <a href="https://zappysys.com/blog/load-data-sql-server-to-amazon-s3/">Loading data from SQL Server to Amazon S3 in SSIS (Split Files, GZip)</a> appeared first on <a href="https://zappysys.com/blog">ZappySys Blog</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
