<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Amazon S3 Archives | ZappySys Blog</title>
	<atom:link href="https://zappysys.com/blog/tag/amazon-s3/feed/" rel="self" type="application/rss+xml" />
	<link>https://zappysys.com/blog/tag/amazon-s3/</link>
	<description>SSIS / ODBC Drivers / API Connectors for JSON, XML, Azure, Amazon AWS, Salesforce, MongoDB and more</description>
	<lastBuildDate>Wed, 16 Oct 2019 20:54:51 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.4.4</generator>

<image>
	<url>https://zappysys.com/blog/wp-content/uploads/2023/01/cropped-zappysys-symbol-large-32x32.png</url>
	<title>Amazon S3 Archives | ZappySys Blog</title>
	<link>https://zappysys.com/blog/tag/amazon-s3/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Read Amazon S3 Storage Files in SSIS (CSV, JSON, XML)</title>
		<link>https://zappysys.com/blog/read-amazon-storage-s3-files-ssis-csv-json-xml/</link>
		
		<dc:creator><![CDATA[ZappySys]]></dc:creator>
		<pubDate>Sat, 16 Mar 2019 09:43:13 +0000</pubDate>
				<category><![CDATA[S3 (Simple Storage Service)]]></category>
		<category><![CDATA[SSIS Amazon S3 Connection]]></category>
		<category><![CDATA[SSIS Amazon S3 CSV Source]]></category>
		<category><![CDATA[SSIS Amazon S3 JSON Source]]></category>
		<category><![CDATA[SSIS Amazon S3 XML Source]]></category>
		<category><![CDATA[amazon]]></category>
		<category><![CDATA[Amazon S3]]></category>
		<category><![CDATA[aws]]></category>
		<category><![CDATA[CSV]]></category>
		<category><![CDATA[json]]></category>
		<category><![CDATA[ssis]]></category>
		<category><![CDATA[xml]]></category>
		<guid isPermaLink="false">https://zappysys.com/blog/?p=6544</guid>

					<description><![CDATA[<p>Introduction In our previous blog we saw how to load data into Amazon S3. Now in this blog, we will see how to read Amazon S3 Storage files in SSIS (CSV, JSON, XML format files). To illustrate, we will use ZappySys SSIS PowerPack, which includes several tasks to import/export data from multiple sources to multiple destinations [&#8230;]</p>
<p>The post <a href="https://zappysys.com/blog/read-amazon-storage-s3-files-ssis-csv-json-xml/">Read Amazon S3 Storage Files in SSIS (CSV, JSON, XML)</a> appeared first on <a href="https://zappysys.com/blog">ZappySys Blog</a>.</p>
]]></description>
										<content:encoded><![CDATA[<h2>Introduction</h2>
<p><a href="https://zappysys.com/blog/wp-content/uploads/2019/03/s3.png" target="_blank" rel="noopener"><img decoding="async" class="alignleft wp-image-6547 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/03/s3-e1553183075864.png" alt="Amazon S3 - AWS Storage" width="150" height="150" /></a>In our previous blog we saw <a href="https://zappysys.com/blog/load-data-sql-server-to-amazon-s3/" target="_blank" rel="noopener">how to load data into Amazon S3</a>. Now in this blog, we will see <strong>how to read Amazon S3 Storage files in SSIS (CSV, JSON, XML format files)</strong>. To illustrate, we will use <a href="https://zappysys.com/products/ssis-powerpack/" target="_blank" rel="noopener">ZappySys SSIS PowerPack</a>, which includes several tasks to import/export data from multiple sources to multiple destinations such as flat files, Azure, AWS, databases, Office files and more. It is a coding-free, drag-and-drop, high-performance suite of <em>Custom SSIS Components</em> and <em>SSIS Tasks</em>. If you would like to perform file operations on Amazon S3 files (e.g. Download, Upload, Create, Delete), then <a href="https://zappysys.com/blog/category/ssis/tasks/ssis-amazon-storage-task/" target="_blank" rel="noopener">check these articles</a>.</p>
<p>In a nutshell, this post focuses on how to read Amazon S3 Storage CSV, JSON and XML files using the respective SSIS source tasks.</p>
<p>&nbsp;</p>
<p><strong>Components Mentioned in this article</strong><br />
<div class="su-table su-table-alternate">
<table style="width: 407px;height: 187px">
<tbody>
<tr>
<td style="width: 35px"><img decoding="async" src="https://i2.wp.com/zappysys.com/onlinehelp/ssis-powerpack/scr/images/amazon-s3-csv-source/ssis-amazon-s3-csv-file-source.png?w=720&amp;ssl=1" alt="SSIS Amazon S3 CSV File Source" width="32" height="32" /></td>
<td style="width: 356px"><a href="https://zappysys.com/products/ssis-powerpack/ssis-amazon-s3-csv-file-source/" target="_blank" rel="noopener">Amazon S3 CSV File Source</a></td>
</tr>
<tr>
<td style="width: 35px"><img loading="lazy" decoding="async" src="https://i2.wp.com/zappysys.com/onlinehelp/ssis-powerpack/scr/images/amazon-s3-csv-destination/ssis-amazon-s3-csv-file-destination.png?w=720&amp;ssl=1" alt="SSIS Amazon S3 CSV File Destination" width="32" height="32" /></td>
<td style="width: 356px"><a href="https://zappysys.com/products/ssis-powerpack/ssis-amazon-s3-csv-file-destination/" target="_blank" rel="noopener">Amazon S3 CSV File Destination</a></td>
</tr>
<tr>
<td style="width: 35px"><img loading="lazy" decoding="async" src="https://i0.wp.com/zappysys.com/onlinehelp/ssis-powerpack/scr/images/amazon-s3-xml-source/ssis-amazon-s3-xml-file-source.png?w=720&amp;ssl=1" alt="SSIS Amazon S3 XML File Source" width="32" height="32" /></td>
<td style="width: 356px"><a href="https://zappysys.com/products/ssis-powerpack/ssis-amazon-s3-xml-file-source/" target="_blank" rel="noopener">Amazon S3 XML File Source</a></td>
</tr>
<tr>
<td style="width: 35px"><img loading="lazy" decoding="async" src="https://i2.wp.com/zappysys.com/onlinehelp/ssis-powerpack/scr/images/amazon-s3-json-source/ssis-amazon-s3-json-file-source.png?w=720&amp;ssl=1" alt="SSIS Amazon S3 JSON File Source" width="32" height="32" /></td>
<td style="width: 356px"><a href="https://zappysys.com/products/ssis-powerpack/ssis-amazon-s3-json-file-source/" target="_blank" rel="noopener">Amazon S3 JSON File Source</a></td>
</tr>
</tbody>
</table>
</div>
<h2>Prerequisite</h2>
<ol>
<li>First, you will need to have SSIS installed.</li>
<li>Secondly, make sure SQL Server Data Tools (SSDT) is installed.</li>
<li>Thirdly, you have obtained your Amazon S3 account access key and secret key.</li>
<li>Finally, do not forget to install the ZappySys <a href="https://zappysys.com/products/ssis-powerpack/" target="_blank" rel="noopener">SSIS PowerPack</a>.</li>
</ol>
<h2>What is Amazon S3 Storage</h2>
<p>Amazon Simple Storage Service (Amazon S3) is an object storage service that offers industry-leading scalability, data availability, security, and performance. This means customers of all sizes and industries can use it to store and protect any amount of data for a range of use cases, such as websites, mobile applications, backup and restore, archive, enterprise applications, IoT devices, and big data analytics. Amazon S3 provides easy-to-use management features so you can organize your data and configure finely-tuned access controls to meet your specific business, organizational, and compliance requirements. Amazon S3 is designed for 99.999999999% (11 9&#8217;s) of durability, and stores data for millions of applications for companies all around the world.</p>
<h2>Getting Started</h2>
<p>To start, we will show several examples. ZappySys includes an <a href="https://zappysys.com/products/ssis-powerpack/#cat_amazon_aws_cloud" target="_blank" rel="noopener">SSIS Amazon S3 Source for CSV/JSON/XML File</a> that helps you read CSV, JSON and XML files from Amazon S3 Storage. The related Amazon Storage Task also supports file operations such as Download, Upload, Delete, Rename, List, Get Property, Copy, Move, Create, Set Permission and many more. In this article we show how to read files stored in Amazon S3 Storage.</p>
<p>You can connect to your <a href="https://console.aws.amazon.com/?nc2=h_m_mc" target="_blank" rel="noopener">Amazon S3 Account</a> by entering your storage account credentials.</p>
<h2>Read Amazon S3 Storage Files in SSIS (CSV, JSON, XML)</h2>
<p>Let&#8217;s start with an example. In this SSIS Amazon S3 Source for CSV/JSON/XML File task example, we will read CSV/JSON/XML files from Amazon S3 Storage into a SQL Server database.</p>
<ol>
<li>First of all, drag and drop a Data Flow Task from the SSIS Toolbox and double-click it to edit.
<div id="attachment_7934" style="width: 470px" class="wp-caption aligncenter"><a href="https://zappysys.com/blog/wp-content/uploads/2019/09/ssis-drag-drop-data-flow-task.png" target="_blank" rel="noopener"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-7934" class="wp-image-7934 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/09/ssis-drag-drop-data-flow-task.png" alt="Drag and Drop SSIS Data Flow Task from SSIS Toolbox" width="460" height="155" srcset="https://zappysys.com/blog/wp-content/uploads/2019/09/ssis-drag-drop-data-flow-task.png 460w, https://zappysys.com/blog/wp-content/uploads/2019/09/ssis-drag-drop-data-flow-task-300x101.png 300w" sizes="(max-width: 460px) 100vw, 460px" /></a><p id="caption-attachment-7934" class="wp-caption-text">Drag and Drop SSIS Data Flow Task from SSIS Toolbox</p></div></li>
<li>Drag and drop the relevant Amazon S3 Source for CSV/JSON/XML File task from the SSIS Toolbox.
<div id="attachment_6590" style="width: 710px" class="wp-caption aligncenter"><a href="https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-amazon-s3-source-add-tasks.png" target="_blank" rel="noopener"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-6590" class="wp-image-6590 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-amazon-s3-source-add-tasks.png" alt="Add Amazon S3 Source Tasks" width="700" height="244" srcset="https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-amazon-s3-source-add-tasks.png 700w, https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-amazon-s3-source-add-tasks-300x105.png 300w" sizes="(max-width: 700px) 100vw, 700px" /></a><p id="caption-attachment-6590" class="wp-caption-text">Add Amazon S3 Source Tasks</p></div></li>
<li>Create a connection for Amazon S3 Storage Account.
<div id="attachment_6551" style="width: 765px" class="wp-caption aligncenter"><a href="https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-amazon-s3-source-select-connection-manager.png" target="_blank" rel="noopener"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-6551" class="wp-image-6551 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-amazon-s3-source-select-connection-manager.png" alt="Create Amazon S3 Storage Connection" width="755" height="528" srcset="https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-amazon-s3-source-select-connection-manager.png 755w, https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-amazon-s3-source-select-connection-manager-300x210.png 300w" sizes="(max-width: 755px) 100vw, 755px" /></a><p id="caption-attachment-6551" class="wp-caption-text">Create Amazon S3 Storage Connection</p></div></li>
<li>Select the single file to read from Amazon S3 Storage in the relevant CSV/JSON/XML File source task.
<div id="attachment_6539" style="width: 944px" class="wp-caption aligncenter"><a href="https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-Azure-Blob-Source-select-File.png" target="_blank" rel="noopener"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-6539" class="wp-image-6539 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-Azure-Blob-Source-select-File.png" alt="Select File From Azure Blob Storage" width="934" height="582" srcset="https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-Azure-Blob-Source-select-File.png 934w, https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-Azure-Blob-Source-select-File-300x187.png 300w, https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-Azure-Blob-Source-select-File-768x479.png 768w, https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-Azure-Blob-Source-select-File-436x272.png 436w" sizes="(max-width: 934px) 100vw, 934px" /></a><p id="caption-attachment-6539" class="wp-caption-text">Select File From Amazon S3 Storage</p></div></li>
<li>We can also read multiple files stored in Amazon S3 Storage using a wildcard pattern, e.g. dbo.tblNames*.csv / dbo.tblNames*.json / dbo.tblNames*.xml, in the relevant source task.
<div id="attachment_6540" style="width: 557px" class="wp-caption aligncenter"><a href="https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-Azure-Blob-Source-set-multiple-Filepath.png" target="_blank" rel="noopener"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-6540" class="wp-image-6540 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-Azure-Blob-Source-set-multiple-Filepath.png" alt="Use wildcard pattern .* to read multiple files data" width="547" height="178" srcset="https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-Azure-Blob-Source-set-multiple-Filepath.png 547w, https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-Azure-Blob-Source-set-multiple-Filepath-300x98.png 300w" sizes="(max-width: 547px) 100vw, 547px" /></a><p id="caption-attachment-6540" class="wp-caption-text">Use wildcard pattern .* to read multiple files data</p></div></li>
<li>We can also read zip and gzip compressed files, without extracting them first, in the specific Amazon S3 Source for CSV/JSON/XML File task.
<div id="attachment_6541" style="width: 698px" class="wp-caption aligncenter"><a href="https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-azure-blob-storage-source-read-zip-gzip-compressed-files.png" target="_blank" rel="noopener"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-6541" class="wp-image-6541 size-full" src="https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-azure-blob-storage-source-read-zip-gzip-compressed-files.png" alt="Reading zip and gzip compressed files (stream mode)" width="688" height="273" srcset="https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-azure-blob-storage-source-read-zip-gzip-compressed-files.png 688w, https://zappysys.com/blog/wp-content/uploads/2019/03/ssis-azure-blob-storage-source-read-zip-gzip-compressed-files-300x119.png 300w" sizes="(max-width: 688px) 100vw, 688px" /></a><p id="caption-attachment-6541" class="wp-caption-text">Reading zip and gzip compressed files (stream mode)</p></div></li>
<li>Finally, we are ready to load the file(s) data into SQL Server.</li>
</ol>
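The wildcard matching (step 5) and compressed-file reading (step 6) above can be sketched outside SSIS as well. Below is a minimal Python illustration, not the PowerPack implementation: it filters a hypothetical list of S3 object keys with a dbo.tblNames* style pattern, then parses a gzip-compressed CSV in stream mode without extracting it to disk first. The key names and sample rows are invented for the example.

```python
import csv
import fnmatch
import gzip
import io

# Hypothetical object keys, as a bucket listing might return them.
keys = ["dbo.tblNames1.csv.gz", "dbo.tblNames2.csv.gz", "dbo.tblOther.json"]

# Step 5: wildcard pattern matching (dbo.tblNames*.csv style).
matched = fnmatch.filter(keys, "dbo.tblNames*.csv.gz")

# Step 6: read a gzip-compressed CSV without extracting it first.
# In a real pipeline these bytes would come from the S3 object body stream;
# here we fabricate them so the example is self-contained.
raw = gzip.compress(b"id,name\r\n1,Bob\r\n2,Sam\r\n")

def read_gzip_csv(body: bytes):
    """Decompress and parse CSV rows in a streaming fashion."""
    with gzip.open(io.BytesIO(body), mode="rt", newline="") as fh:
        yield from csv.DictReader(fh)

rows = list(read_gzip_csv(raw))
print(matched)           # the keys that match the wildcard
print(rows[0]["name"])   # first parsed record
```

The same idea generalizes to zip archives via the `zipfile` module; the point is that decompression happens on the stream, so no temporary extracted copy is needed.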
<h2>Load Amazon S3 files data into SQL Server</h2>
<div class="content_block" id="custom_post_widget-5617"><p>ZappySys SSIS PowerPack makes it easy to load data from various sources such as REST, SOAP, JSON, XML or CSV into SQL Server, PostgreSQL, Amazon Redshift, or other targets. The <strong>Upsert Destination</strong> component allows you to automatically insert new records and update existing ones based on key columns. Below are the detailed steps to configure it.</p>
<h3>Step 1: Add Upsert Destination to Data Flow</h3>
<ol>
<li>Drag and drop the <strong>Upsert Destination</strong> component from the SSIS Toolbox.</li>
<li>Connect your source component (e.g., JSON / REST / Other Source) to the Upsert Destination.</li>
</ol>
<div class="wp-caption aligncenter">
<a href="https://zappysys.com/blog/wp-content/uploads/2017/08/ssis-data-flow-drag-drop-upsert-destination.png">
<img loading="lazy" decoding="async" class="size-full" alt="" src="https://zappysys.com/blog/wp-content/uploads/2017/08/ssis-data-flow-drag-drop-upsert-destination.png" /></a>
<p class="wp-caption-text">SSIS - Data Flow - Drag and Drop Upsert Destination Component</p>
</div>
<h3>Step 2: Configure Target Connection</h3>
<ol>
<li>Double-click the <strong>Upsert Destination</strong> component to open the configuration window.</li>
<li>Under <strong>Connection</strong>, select an existing target connection or click <strong>NEW</strong> to create a new connection.
<ul>
<li>Example: SQL Server, PostgreSQL, or Amazon Redshift.</li>
</ul>
</li>
</ol>
<h3>Step 3: Select or Create Target Table</h3>
<ol>
<li>In the <strong>Target Table</strong> dropdown, select the table where you want to load data.</li>
<li>Optionally, click <strong>NEW</strong> to create a new table based on the source columns.</li>
</ol>
<div class="wp-caption aligncenter">
<a href="https://zappysys.com/blog/wp-content/uploads/2020/09/upsert-destination-configuration.png">
<img loading="lazy" decoding="async" class="size-full" alt="" src="https://zappysys.com/blog/wp-content/uploads/2020/09/upsert-destination-configuration.png" /></a>
<p class="wp-caption-text">Configure SSIS Upsert Destination Connection - Loading data (REST / SOAP / JSON / XML /CSV) into SQL Server or other target using SSIS</p>
</div>
<h3>Step 4: Map Columns</h3>
<ol>
<li>Go to the <strong>Mappings</strong> tab.</li>
<li>Click <strong>Auto Map</strong> to map source columns to target columns by name.</li>
<li>Ensure you <strong>check the Primary key column(s)</strong> that will determine whether a record is inserted or updated.</li>
<li>You can manually adjust the mappings if necessary.</li>
</ol>
 <div class="wp-caption aligncenter">
<a href="https://zappysys.com/blog/wp-content/uploads/2020/09/upsert-destination-key.png">
<img loading="lazy" decoding="async" class="size-full" alt="" src="https://zappysys.com/blog/wp-content/uploads/2020/09/upsert-destination-key.png" /></a>
<p class="wp-caption-text">SSIS Upsert Destination - Columns Mappings</p>
</div>
<h3>Step 5: Save Settings</h3>
<ul>
<li>Click <strong>OK</strong> to save the Upsert Destination configuration.</li>
</ul>
<h3>Step 6: Optional: Add Logging or Analysis</h3>
<ul>
<li>You may add extra destination components to log the number of inserted vs. updated records for monitoring or auditing purposes.</li>
</ul>
<h3>Step 7: Execute the Package</h3>
<ul>
<li>Run your SSIS package and verify that the data is correctly inserted and updated in the target table.</li>
</ul>
<div class="wp-caption aligncenter">
<a href="https://zappysys.com/blog/wp-content/uploads/2018/12/ssis-upsert-destination-execute.png">
<img loading="lazy" decoding="async" class="size-full" alt="" src="https://zappysys.com/blog/wp-content/uploads/2018/12/ssis-upsert-destination-execute.png" /></a>
<p class="wp-caption-text">SSIS Upsert Destination Execution</p>
</div></div>
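The Upsert Destination behavior described above (insert when the key is new, update when it already exists) boils down to a single upsert statement per batch. The sketch below illustrates the idea with SQLite's <code>ON CONFLICT</code> clause and an invented <code>customers</code> table; it is not the component's actual implementation, and on SQL Server the equivalent would typically be a <code>MERGE</code> statement.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")

def upsert(rows):
    # Insert new records; update existing ones based on the key column (id),
    # mirroring the "check the Primary key column(s)" mapping step above.
    conn.executemany(
        """INSERT INTO customers (id, name) VALUES (?, ?)
           ON CONFLICT(id) DO UPDATE SET name = excluded.name""",
        rows,
    )

upsert([(1, "Alice"), (2, "Bob")])    # both rows inserted
upsert([(2, "Bobby"), (3, "Carol")])  # id 2 updated, id 3 inserted

result = conn.execute("SELECT id, name FROM customers ORDER BY id").fetchall()
print(result)  # [(1, 'Alice'), (2, 'Bobby'), (3, 'Carol')]
```

Counting how many statements took the insert branch versus the update branch is what the optional logging outputs in Step 6 expose.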
<h2><span id="Conclusion">Conclusion</span></h2>
<p>In this blog, we learned how to read Amazon S3 Storage files in SSIS. We used the <a href="https://zappysys.com/products/ssis-powerpack/ssis-amazon-s3-csv-file-source/" target="_blank" rel="noopener">Amazon S3 Source for CSV file</a>, <a href="https://zappysys.com/products/ssis-powerpack/ssis-amazon-s3-json-file-source/" target="_blank" rel="noopener">Amazon S3 Source for JSON file</a> and <a href="https://zappysys.com/products/ssis-powerpack/ssis-amazon-s3-xml-file-source/" target="_blank" rel="noopener">Amazon S3 Source for XML file</a> to read file(s) from Amazon S3 Storage and load the data into SQL Server. You can <a href="https://zappysys.com/products/ssis-powerpack/">download SSIS PowerPack here</a> to try many other scenarios not discussed in this blog, along with 70+ other components.</p>
<h2><span id="References">References</span></h2>
<p>Finally, you can use the following links for more information:</p>
<ul>
<li><a href="https://zappysys.com/products/ssis-powerpack/#cat_amazon_aws_cloud" target="_blank" rel="noopener">SSIS Amazon S3 Source for CSV/JSON/XML File</a></li>
<li><a href="http://aws.amazon.com/s3/faqs/" target="_blank" rel="noopener">Introduction to Amazon S3 Storage Service</a></li>
<li><a href="https://www.youtube.com/watch?v=zdWPTvtW0E0" target="_blank" rel="noopener">How to Get Your Amazon AWS Access Key &amp; Secret Key</a></li>
</ul>
<p>The post <a href="https://zappysys.com/blog/read-amazon-storage-s3-files-ssis-csv-json-xml/">Read Amazon S3 Storage Files in SSIS (CSV, JSON, XML)</a> appeared first on <a href="https://zappysys.com/blog">ZappySys Blog</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>SSIS &#8211; Copy Amazon S3 files from AWS to Azure</title>
		<link>https://zappysys.com/blog/ssis-copy-move-amazon-s3-files-from-aws-to-azure/</link>
		
		<dc:creator><![CDATA[ZappySys]]></dc:creator>
		<pubDate>Wed, 19 Apr 2017 21:40:45 +0000</pubDate>
				<category><![CDATA[SSIS Amazon Storage Task]]></category>
		<category><![CDATA[SSIS Azure Blob Storage Task]]></category>
		<category><![CDATA[Amazon S3]]></category>
		<category><![CDATA[aws]]></category>
		<category><![CDATA[azure]]></category>
		<category><![CDATA[azure blob storage]]></category>
		<category><![CDATA[GCP]]></category>
		<category><![CDATA[Google Cloud]]></category>
		<category><![CDATA[ssis]]></category>
		<category><![CDATA[ssis amazon storage task]]></category>
		<category><![CDATA[ssis azure storage task]]></category>
		<category><![CDATA[SSIS PowerPack]]></category>
		<guid isPermaLink="false">http://zappysys.com/blog/?p=1097</guid>

					<description><![CDATA[<p>Introduction Azure and AWS are both popular cloud platforms. In this blog post we will learn how to copy or move Amazon S3 files to Azure Blob Storage without any coding or scripting (AWS to Azure file copy / migration scenario). To achieve this objective we will use the following drag-and-drop SSIS tasks (i.e. [&#8230;]</p>
<p>The post <a href="https://zappysys.com/blog/ssis-copy-move-amazon-s3-files-from-aws-to-azure/">SSIS &#8211; Copy Amazon S3 files from AWS to Azure</a> appeared first on <a href="https://zappysys.com/blog">ZappySys Blog</a>.</p>
]]></description>
										<content:encoded><![CDATA[<h2>Introduction</h2>
<p>Azure and AWS are both popular cloud platforms. In this blog post we will learn <em>how to copy or move Amazon S3 files to Azure Blob Storage</em> without any coding or scripting (<strong>AWS to Azure file copy / migration scenario</strong>). To achieve this objective we will use the following drag-and-drop SSIS tasks (SSIS is Microsoft SQL Server Integration Services &#8211; an <a href="https://docs.microsoft.com/en-us/sql/integration-services/sql-server-integration-services" target="_blank">ETL platform for SQL Server</a>). The following components are highly optimized for parallel copy / multi-threading with secure connections (client-side/server-side encryption).</p>
<table>
<tbody>
<tr>
<td width="36"><img decoding="async" src="//zappysys.com/onlinehelp/ssis-powerpack/scr/images/azure-storage-task/ssis-azure-cloud-storage-task.png" alt="Custom SSIS Tasks - Azure Blob Storage Task" width="32" /></td>
<td><a title="SSIS Azure Blob Storage Task (Manage Azure Blob Storage)" href="//zappysys.com/products/ssis-powerpack/ssis-azure-blob-storage-task/" target="_blank">SSIS Azure Blob Storage Task</a></td>
</tr>
</tbody>
</table>
<table>
<tbody>
<tr>
<td width="36"><img decoding="async" src="//zappysys.com/images/ssis-powerpack/SSIS-Amazon-S3-Cloud-Task.png" alt="Custom SSIS Components - Amazon S3 Task (AWS S3)" width="32" /></td>
<td><a title="SSIS Amazon Storage Task (Manage AWS S3)" href="//zappysys.com/products/ssis-powerpack/ssis-amazon-s3-task/" target="_blank">SSIS Amazon Storage Task (AWS S3)</a></td>
</tr>
</tbody>
</table>
<h2>Concept: Fast Server-Side Copy in Azure (Copy Files into Azure Blob Storage)</h2>
<p>Azure provides a unique feature called server-side file copy. Using this feature you can load or copy files into Azure Blob Storage without landing the data on your local machine. As of this writing (March 2017), this type of feature is still missing in other cloud platforms such as <strong>Amazon AWS</strong> and <strong>Google Cloud Platform &#8211; GCP</strong>.</p>
<p>Using the server-side copy feature in Azure, you can achieve the following scenarios in SSIS without coding:</p>
<ul>
<li>Move / Copy files from Amazon S3 to Azure Blob Storage</li>
<li>Move / Copy files from Google Cloud Platform (GCP) to Azure Blob Storage</li>
<li>Copy files from any public URL to Azure Blob Storage (assuming the URL doesn&#8217;t require credentials, or it is an intranet URL)</li>
</ul>
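All three scenarios above ultimately rely on Azure's Copy Blob REST operation: a single PUT against the destination blob whose <code>x-ms-copy-source</code> header names the (pre-signed or public) source URL, so Azure pulls the bytes itself and nothing flows through your machine. A minimal sketch of building such a request with the Python standard library follows; it constructs the request but does not send it, the URLs are placeholders, and a real call would also need authorization (e.g. a SAS token) on the destination:

```python
from urllib.request import Request

def build_copy_request(dest_blob_url: str, source_url: str) -> Request:
    """Build (but do not send) an Azure server-side Copy Blob request.

    Azure fetches the bytes from source_url itself, so the data never
    touches the machine issuing this request.
    """
    return Request(
        dest_blob_url,
        method="PUT",
        headers={
            "x-ms-version": "2020-04-08",    # assumed service version
            "x-ms-copy-source": source_url,  # pre-signed S3/GCP or public URL
        },
    )

req = build_copy_request(
    "https://myaccount.blob.core.windows.net/mycontainer/file1.zip",  # placeholder
    "https://mybucket.s3.amazonaws.com/file1.zip?SIGNATURE-GOES-HERE",
)
print(req.get_method(), req.get_header("X-ms-copy-source"))
```

The SSIS Azure Blob Storage Task's "Copy from external file" action (used later in this article) issues this kind of request for you.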
<p>In the first two scenarios you need a source file URL with the authentication information embedded in the URL (this is called a pre-signed URL, e.g. http://mycloud.com/myfile.zip?SIGNATURE-GOES-HERE). All major cloud providers support pre-signed URLs, so you can securely share files with others without sharing your actual credentials. You can configure a pre-signed URL to expire within a certain time frame (check your cloud API documentation). If your file doesn&#8217;t need credentials to be accessed, then you don&#8217;t have to worry about pre-signed URLs; your source can be a plain file URL (e.g. https://mysite.com/downloads/fil1.zip).</p>
<p><strong>Pre-Signed URL for Google Cloud (GCP)</strong><br />
Here is more information on <a title="Create Presigned URL for Google Cloud Platform file (GCP file)" href="https://cloud.google.com/storage/docs/access-control/create-signed-urls-gsutil" target="_blank">how to get a pre-signed URL for Google Cloud Platform (GCP)</a> &#8211; use the gsutil command line.<br />
<strong>Pre-Signed URL for Amazon AWS (S3 file)</strong><br />
To create a pre-signed URL for AWS S3 files you can use the <a href="//zappysys.com/products/ssis-powerpack/ssis-amazon-s3-task/" target="_blank">SSIS Amazon Storage Task</a> with its Get Pre-Signed URL action (described in the section below). This action supports creating pre-signed URLs for multiple files using a wildcard (e.g. /mybkt/*.zip), or you can get a single pre-signed URL. If you use a pattern search then you will get a DataTable back, which can be looped through using the ForEach Loop task (Loop ADO Recordset option). If you do not use a wildcard then only one URL is returned, in string format.</p>
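For context on what a Get Pre-Signed URL action produces under the hood, here is a hedged, self-contained sketch of AWS Signature Version 4 query-string presigning using only the Python standard library. The credentials, bucket, and key are dummies, and this simplified version signs only the <code>host</code> header with an unsigned payload; in practice you would let an AWS SDK or the Amazon Storage Task do this rather than hand-rolling it.

```python
import hashlib
import hmac
from urllib.parse import quote

def presign_s3_url(bucket, key, access_key, secret_key,
                   region="us-east-1", expires=3600,
                   amz_date="20170419T000000Z"):
    """Build an AWS SigV4 pre-signed GET URL for an S3 object (sketch)."""
    host = f"{bucket}.s3.amazonaws.com"
    date = amz_date[:8]
    scope = f"{date}/{region}/s3/aws4_request"
    params = {
        "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
        "X-Amz-Credential": f"{access_key}/{scope}",
        "X-Amz-Date": amz_date,
        "X-Amz-Expires": str(expires),
        "X-Amz-SignedHeaders": "host",
    }
    # Canonical query string: sorted, fully percent-encoded key=value pairs.
    query = "&".join(
        f"{quote(k, safe='')}={quote(v, safe='')}" for k, v in sorted(params.items())
    )
    # Canonical request: method, URI, query, headers, signed headers, payload.
    canonical = "\n".join([
        "GET", "/" + quote(key), query,
        f"host:{host}\n", "host", "UNSIGNED-PAYLOAD",
    ])
    to_sign = "\n".join([
        "AWS4-HMAC-SHA256", amz_date, scope,
        hashlib.sha256(canonical.encode()).hexdigest(),
    ])
    # Derive the signing key via the HMAC chain over date/region/service.
    k = f"AWS4{secret_key}".encode()
    for part in (date, region, "s3", "aws4_request"):
        k = hmac.new(k, part.encode(), hashlib.sha256).digest()
    sig = hmac.new(k, to_sign.encode(), hashlib.sha256).hexdigest()
    return f"https://{host}/{quote(key)}?{query}&X-Amz-Signature={sig}"

url = presign_s3_url("mybkt", "file1.zip", "AKIDEXAMPLE", "SECRETEXAMPLE")
print(url)
```

Anyone holding the resulting URL can fetch the object until the expiry elapses, which is exactly why it is safe to hand such URLs to the Azure server-side copy described earlier.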
<h2>Prerequisites</h2>
<p>Before we look at the next section, make sure the following prerequisites are met:</p>
<ol>
<li>You have obtained an Amazon S3 access key and secret key to access the desired files</li>
<li>You have obtained an Azure storage account name and account key to access the desired blob container. If you don&#8217;t have access to Azure, then you can download the Azure Storage Emulator for testing purposes. <a href="https://zappysys.com/forums/topic/azure-blob-storage-how-to-download-and-test-azure-storage-emulator/" target="_blank">Check this</a>.</li>
<li>You have basic knowledge of SSIS. If you don&#8217;t, then search for an SSIS tutorial &#8211; there are many blogs / tutorials to get started 🙂</li>
</ol>
<h2>Step-By-Step &#8211; Create SSIS Package &#8211; Copy Amazon S3 Files to Azure Blob Storage</h2>
<p>Now let&#8217;s look at how to copy Amazon S3 files to Azure in a few clicks. This approach doesn&#8217;t bring any data to your local system, so it is a pure server-to-server copy; that is why it is very fast and secure.</p>
<ol>
<li>First, download and <a href="//zappysys.com/products/ssis-powerpack/" target="_blank">install SSIS PowerPack</a></li>
<li>Once you install SSIS PowerPack, create a new sample SSIS project and open the package</li>
<li>From the SSIS Toolbox, drag the <a href="//zappysys.com/products/ssis-powerpack/ssis-amazon-s3-task/" target="_blank">ZS Amazon Storage Task</a> and drop it on the Control Flow surface. Rename it to <strong>Get S3 File Urls</strong></li>
<li>Double-click the S3 task to edit it. From the Action dropdown, select the Get Amazon S3 Files Pre-Signed Url option</li>
<li>Click New next to the Connection dropdown to create an Amazon Storage Connection. Enter your credentials and bucket region, then click Test. If you are <a href="https://zappysys.com/blog/check-amazon-s3-bucket-location-region/" target="_blank">not sure about your bucket region then check this article</a>.
<div id="attachment_2196" style="width: 647px" class="wp-caption alignnone"><a href="//zappysys.com/wp-content/uploads/2017/04/ssis-aws-to-azure-copy-create-pre-signed-url-multiple-files.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-2196" class="size-full wp-image-2196" src="//zappysys.com/wp-content/uploads/2017/04/ssis-aws-to-azure-copy-create-pre-signed-url-multiple-files.png" alt="SSIS Amazon Storage Task - Create Pre-Signed URLs for multiple files stored in S3" width="637" height="824" /></a><p id="caption-attachment-2196" class="wp-caption-text">SSIS Amazon Storage Task &#8211; Create Pre-Signed URLs for multiple files stored in S3</p></div></li>
<li>Now click on the Advanced tab and check the Use region specific endpoint option
<div id="attachment_2197" style="width: 414px" class="wp-caption alignnone"><a href="//zappysys.com/wp-content/uploads/2017/04/ssis-amazon-s3-connection-region-specific-endpoint.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-2197" class="size-full wp-image-2197" src="//zappysys.com/wp-content/uploads/2017/04/ssis-amazon-s3-connection-region-specific-endpoint.png" alt="SSIS Amazon S3 Connection - Region specific endpoint option" width="404" height="200" /></a><p id="caption-attachment-2197" class="wp-caption-text">SSIS Amazon S3 Connection &#8211; Region specific endpoint option</p></div></li>
<li>Once Test Connection is green, click OK to save the connection.</li>
<li>Once you are back in the Amazon Storage Task UI, click Browse next to the S3 file path. Here you can select one file or enter a pattern to create URLs for multiple files.</li>
<li>To save the pre-signed URL(s), select a variable. If the variable is not there, then create a new one. If you enter a pattern in the source path, the variable must be of the Object datatype; for multiple URLs it will return an ADO Recordset (which you can use with the ForEach Loop task). If you did not enter a pattern in the source path, the variable can be of the String datatype.
<div id="attachment_2198" style="width: 845px" class="wp-caption alignnone"><a href="//zappysys.com/wp-content/uploads/2017/04/ssis-get-presigned-url-for-multiple-amazon-s3-files-for-loop.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-2198" class="size-full wp-image-2198" src="//zappysys.com/wp-content/uploads/2017/04/ssis-get-presigned-url-for-multiple-amazon-s3-files-for-loop.png" alt="SSIS Amazon Storage Task - Save Pre-Signed URL to Variable (Save Multiple URL as Recordset )" width="835" height="681" /></a><p id="caption-attachment-2198" class="wp-caption-text">SSIS Amazon Storage Task &#8211; Save Pre-Signed URL to Variable (Save Multiple URL as Recordset )</p></div></li>
<li>Now drag and drop a ForEach Loop Container on the surface and configure it as below
<div id="attachment_2199" style="width: 660px" class="wp-caption alignnone"><a href="//zappysys.com/wp-content/uploads/2017/04/ssis-foreach-loop-amazon-s3-files-presigned-url.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-2199" class="size-full wp-image-2199" src="//zappysys.com/wp-content/uploads/2017/04/ssis-foreach-loop-amazon-s3-files-presigned-url.png" alt="SSIS ForEach Loop Container Task - Loop through Amazon S3 file URL" width="650" height="550" /></a><p id="caption-attachment-2199" class="wp-caption-text">SSIS ForEach Loop Container Task &#8211; Loop through Amazon S3 file URL</p></div>
<div id="attachment_2200" style="width: 450px" class="wp-caption alignnone"><a href="//zappysys.com/wp-content/uploads/2017/04/ssis-foreach-loop-variable-mapping-each-iteration.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-2200" class="size-full wp-image-2200" src="//zappysys.com/wp-content/uploads/2017/04/ssis-foreach-loop-variable-mapping-each-iteration.png" alt="SSIS ForEach Loop Task - Variable Mappings" width="440" height="365" /></a><p id="caption-attachment-2200" class="wp-caption-text">SSIS ForEach Loop Task &#8211; Variable Mappings</p></div></li>
<li>Once the ForEach Loop is configured, you can drag the <a href="//zappysys.com/products/ssis-powerpack/ssis-azure-blob-storage-task/" target="_blank">ZS Azure Storage Task</a> inside the ForEach Loop Container</li>
<li>Double-click the Azure Storage Task and select the [<strong>Copy from external file</strong>] option from the Action dropdown.</li>
<li>In the source Path/URL, enter the name of the variable that holds the current URL for the ForEach Loop iteration. The easiest way is to click the blue variable icon and select &lt;&lt;Insert Variable&gt;&gt;. Your placeholder may look like the one below.<br />
<pre class="crayon-plain-tag">{{User::varCurrentS3Url}}</pre>
</li>
<li>For the target path, click New Connection to create and configure a new <strong>Azure Storage Connection</strong> like below. Enter your Azure storage credentials and click Test. If you don&#8217;t have real Azure credentials, you can use the <a href="https://zappysys.com/forums/topic/azure-blob-storage-how-to-download-and-test-azure-storage-emulator/" target="_blank">Azure Storage Emulator (check this)</a>. Once the connection is configured, click Test Connection and click OK to save the connection.
<div id="attachment_2201" style="width: 898px" class="wp-caption alignnone"><a href="//zappysys.com/wp-content/uploads/2017/04/ssis-configure-azure-storage-task-copy-aws-s3-file.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-2201" class="size-full wp-image-2201" src="//zappysys.com/wp-content/uploads/2017/04/ssis-configure-azure-storage-task-copy-aws-s3-file.png" alt="Azure Storage Task - Configure Blob Storage connection- Serverside Copy S3 files - AWS to Azure" width="888" height="829" /></a><p id="caption-attachment-2201" class="wp-caption-text">Azure Storage Task &#8211; Configure Blob Storage connection- Serverside Copy S3 files &#8211; AWS to Azure</p></div></li>
<li>Once everything is configured, save the package and execute it. This will copy the S3 files to Azure Blob Storage.</li>
<li>Here is the execution log<br />
<pre class="crayon-plain-tag">SSIS package &quot;C:\SSIS\Amazon-To-Azure-Copy.dtsx&quot; starting.
Information: 0x0 at Get Signed URL for S3 Files, Get Signed URL for S3 Files: You are running TRIAL version. It will expire in 29 day(s)
Information: 0x0 at Get Signed URL for S3 Files, Get Signed URL for S3 Files: Reading [PRESIGNED-URL] property for zs-eu-west-2-london-bkt2/*.*
Information: 0x0 at Copy S3 File to Azure, Copy S3 File to Azure: Copy started: Source=https://zs-eu-west-2-london-bkt2.s3.eu-west-2.amazonaws.com/cloudfile1.csv, Target=test1/
Information: 0x0 at Copy S3 File to Azure, Copy S3 File to Azure: Copying https://zs-eu-west-2-london-bkt2.s3.eu-west-2.amazonaws.com/cloudfile1.csv ...
Information: 0x0 at Copy S3 File to Azure, Copy S3 File to Azure: Copy started: Source=https://zs-eu-west-2-london-bkt2.s3.eu-west-2.amazonaws.com/cloudfile3.txt, Target=test1/
Information: 0x0 at Copy S3 File to Azure, Copy S3 File to Azure: Copying https://zs-eu-west-2-london-bkt2.s3.eu-west-2.amazonaws.com/cloudfile3.txt ...
SSIS package &quot;C:\SSIS\Amazon-To-Azure-Copy.dtsx&quot; finished: Success.</pre>
<div id="attachment_1100" style="width: 712px" class="wp-caption alignnone"><a href="//zappysys.com/blog/wp-content/uploads/2017/04/ssis-package-copy-multiple-s3-files-amazon-aws-to-azure.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-1100" class="size-full wp-image-1100" src="//zappysys.com/blog/wp-content/uploads/2017/04/ssis-package-copy-multiple-s3-files-amazon-aws-to-azure.png" alt="SSIS Package Execution - Copy Multiple S3 Files from Amazon to Azure" width="702" height="380" srcset="https://zappysys.com/blog/wp-content/uploads/2017/04/ssis-package-copy-multiple-s3-files-amazon-aws-to-azure.png 702w, https://zappysys.com/blog/wp-content/uploads/2017/04/ssis-package-copy-multiple-s3-files-amazon-aws-to-azure-300x162.png 300w" sizes="(max-width: 702px) 100vw, 702px" /></a><p id="caption-attachment-1100" class="wp-caption-text">SSIS Package Execution &#8211; Copy Multiple S3 Files from<br />Amazon to Azure</p></div></li>
</ol>
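<p>For reference, a pre-signed S3 URL like the ones generated in the steps above is just a normal HTTPS URL carrying AWS Signature Version 4 query parameters. The sketch below builds one with only the Python standard library; the bucket, key, and credentials shown are placeholders, and in real projects you would normally let an AWS SDK (or the Amazon Storage Task itself) do this.</p>
<pre class="crayon-plain-tag">```python
import datetime
import hashlib
import hmac
from urllib.parse import quote

def presign_s3_get(bucket, key, access_key, secret_key, region,
                   expires=3600, now=None):
    """Sketch: build an AWS SigV4 query-string pre-signed GET URL for an S3 object."""
    now = now or datetime.datetime.utcnow()
    amz_date = now.strftime("%Y%m%dT%H%M%SZ")
    datestamp = now.strftime("%Y%m%d")
    host = f"{bucket}.s3.{region}.amazonaws.com"
    scope = f"{datestamp}/{region}/s3/aws4_request"

    # Query parameters SigV4 requires, sorted by name for the canonical form.
    params = {
        "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
        "X-Amz-Credential": f"{access_key}/{scope}",
        "X-Amz-Date": amz_date,
        "X-Amz-Expires": str(expires),
        "X-Amz-SignedHeaders": "host",
    }
    canonical_qs = "&".join(
        f"{quote(k, safe='')}={quote(v, safe='')}" for k, v in sorted(params.items())
    )
    # Canonical request: method, URI, query, headers (+blank line), signed headers, payload.
    canonical_request = "\n".join([
        "GET", "/" + quote(key), canonical_qs,
        f"host:{host}\n", "host", "UNSIGNED-PAYLOAD",
    ])
    string_to_sign = "\n".join([
        "AWS4-HMAC-SHA256", amz_date, scope,
        hashlib.sha256(canonical_request.encode()).hexdigest(),
    ])
    # Derive the signing key by chaining HMAC-SHA256 over the scope parts.
    sig_key = f"AWS4{secret_key}".encode()
    for part in (datestamp, region, "s3", "aws4_request"):
        sig_key = hmac.new(sig_key, part.encode(), hashlib.sha256).digest()
    signature = hmac.new(sig_key, string_to_sign.encode(), hashlib.sha256).hexdigest()
    return f"https://{host}/{quote(key)}?{canonical_qs}&X-Amz-Signature={signature}"
```</pre>
<p>Anyone holding such a URL can download the object until the expiry time passes &#8211; which is exactly why it can be handed to the Azure Storage Task as a plain source URL.</p>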
<h2>How to move Amazon S3 files to Azure</h2>
<p>In the above example you saw how to copy files from Amazon S3 to Azure Blob Storage. But what if you want to move the files&#8230; that is, delete each file from the source once it has been copied? That is simple: add one more Amazon S3 Task at the end, so that if all previous steps succeed you can issue a delete-files command using the ZS Amazon Storage Task.</p>
<div id="attachment_2204" style="width: 602px" class="wp-caption alignnone"><a href="//zappysys.com/wp-content/uploads/2017/04/ssis-delete-amazon-s3-files.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-2204" class="size-full wp-image-2204" src="//zappysys.com/wp-content/uploads/2017/04/ssis-delete-amazon-s3-files.png" alt="SSIS Delete Amazon S3 Files after Successful S3 to Azure Copy operation (This will mimic Move)" width="592" height="550" /></a><p id="caption-attachment-2204" class="wp-caption-text">SSIS Delete Amazon S3 Files after Successful S3 to Azure Copy operation (This will mimic Move)</p></div>
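<p>The copy-then-delete sequence above is the standard way to mimic a move against storage APIs that have no native move operation. Here is a minimal sketch of the same pattern using local files (in the package, the copy is the S3-to-Azure step and the delete is the final Amazon S3 Task; <code>shutil</code>/<code>os</code> merely stand in for the cloud calls):</p>
<pre class="crayon-plain-tag">```python
import os
import shutil

def move_by_copy_then_delete(src, dst):
    """Mimic a 'move' as copy-then-delete: the source is removed
    only after the copy has verifiably succeeded."""
    shutil.copyfile(src, dst)            # step 1: copy to the target
    if not os.path.exists(dst):          # guard: never delete before the copy is verified
        raise IOError("copy failed; source left intact")
    os.remove(src)                       # step 2: delete the source only on success
    return dst
```</pre>
<p>Ordering matters: the delete only runs when everything before it succeeded, which is why the delete task is placed last with a Success precedence constraint.</p>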
<h2>Download Sample SSIS Package</h2>
<p><a href="//zappysys.com/blog/wp-content/uploads/2017/04/SSIS-Amazon-S3-To-Azure-Copy-Move.zip">Click here to download the sample SSIS package for SQL Server 2012, 2014, and 2016</a></p>
<h2>Conclusion</h2>
<p>If you want to bring data from Amazon S3 to Azure in the fastest way, the technique described in this article can cut your data transfer time several-fold. <a href="//zappysys.com/products/ssis-powerpack/">SSIS PowerPack</a> comes with 45+ components and tasks that give you a drag-and-drop interface for your cloud connectivity projects. It also comes with many connectors to help with JSON, XML, and REST API integration. Try <a href="//zappysys.com/products/ssis-powerpack/">SSIS PowerPack</a> for FREE without any limitation and find out what else you can do with it.</p>
<p>The post <a href="https://zappysys.com/blog/ssis-copy-move-amazon-s3-files-from-aws-to-azure/">SSIS &#8211; Copy Amazon S3 files from AWS to Azure</a> appeared first on <a href="https://zappysys.com/blog">ZappySys Blog</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Loading data from SQL Server to Amazon S3 in SSIS (Split Files, GZip)</title>
		<link>https://zappysys.com/blog/load-data-sql-server-to-amazon-s3/</link>
		
		<dc:creator><![CDATA[ZappySys]]></dc:creator>
		<pubDate>Wed, 27 Jul 2016 22:07:34 +0000</pubDate>
				<category><![CDATA[SSIS Amazon S3 CSV Dest]]></category>
		<category><![CDATA[SSIS Amazon Storage Task]]></category>
		<category><![CDATA[SSIS CSV Export Task]]></category>
		<category><![CDATA[Amazon S3]]></category>
		<category><![CDATA[Amazon S3 Task]]></category>
		<category><![CDATA[aws]]></category>
		<category><![CDATA[CSV]]></category>
		<category><![CDATA[export]]></category>
		<category><![CDATA[Export CSV Task]]></category>
		<category><![CDATA[sql server]]></category>
		<guid isPermaLink="false">http://zappysys.com/blog/?p=704</guid>

					<description><![CDATA[<p>Introduction In this blog post you will see how easy it is to load large amount of data from SQL Server to Amazon S3 Storage. For demo purpose we will use SQL Server as relational source but you can use same steps for any database engine such as Oracle, MySQL, DB2. In this post we [&#8230;]</p>
<p>The post <a href="https://zappysys.com/blog/load-data-sql-server-to-amazon-s3/">Loading data from SQL Server to Amazon S3 in SSIS (Split Files, GZip)</a> appeared first on <a href="https://zappysys.com/blog">ZappySys Blog</a>.</p>
]]></description>
										<content:encoded><![CDATA[<h2>Introduction</h2>
<p>In this blog post you will see how easy it is to load a large amount of data from <em>SQL Server to Amazon S3</em> Storage. For demo purposes we will use SQL Server as the relational source, but you can use the same steps for any database engine such as Oracle, MySQL, or DB2. In this post we will use the <a href="//zappysys.com/products/ssis-powerpack/ssis-export-csv-file-task/" target="_blank" rel="noopener">Export CSV Task</a> and <a href="//zappysys.com/products/ssis-powerpack/ssis-amazon-s3-task/" target="_blank" rel="noopener">Amazon S3 Storage Task</a> to achieve the desired integration with Amazon S3 using a drag-and-drop approach. You can also export JSON or XML data to Amazon S3 using the same techniques (use the <a href="//zappysys.com/products/ssis-powerpack/ssis-export-json-file-task/" target="_blank" rel="noopener">Export JSON Task</a> or <a href="//zappysys.com/products/ssis-powerpack/ssis-export-xml-file-task/" target="_blank" rel="noopener">Export XML Task</a>).</p>
<p>Our goal is to achieve following things</p>
<ul>
<li>Extract a large amount of data from a SQL Server table or query and export it to CSV files</li>
<li>Generate the CSV files in compressed format (*.gz) to speed up the upload and reduce data transfer costs to S3</li>
<li>Split the CSV files by row count</li>
<li>Upload data to Amazon S3 in a highly parallel manner for maximum speed</li>
</ul>
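<p>The goals above (CSV output, gzip compression, split by row count) can be pictured with a short sketch. This illustrates what the Export CSV Task automates &#8211; it is not ZappySys code, and the file-name pattern and split size are arbitrary:</p>
<pre class="crayon-plain-tag">```python
import csv
import gzip

def export_split_gzip(rows, header, prefix, rows_per_file):
    """Write rows to gzip-compressed CSV files, starting a new file
    every `rows_per_file` rows (like the task's split-by-rows option)."""
    paths = []
    for i in range(0, len(rows), rows_per_file):
        path = f"{prefix}.part{i // rows_per_file + 1}.csv.gz"
        with gzip.open(path, "wt", newline="") as f:   # text-mode gzip stream
            writer = csv.writer(f)
            writer.writerow(header)                    # each part gets its own header
            writer.writerows(rows[i:i + rows_per_file])
        paths.append(path)
    return paths
```</pre>
<p>Splitting matters because many small files can be uploaded in parallel, while one huge file is limited to a single stream.</p>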
<p>There are three different ways you can achieve data export to Amazon S3 using SSIS.</p>
<ol>
<li><strong>Method-1 (Fastest)</strong>: Use two step process (First export SQL Server data to local files using <a href="//zappysys.com/products/ssis-powerpack/ssis-export-csv-file-task/" target="_blank" rel="noopener">Export Task</a> and then upload files to S3 using  <a href="//zappysys.com/products/ssis-powerpack/ssis-amazon-s3-task/" target="_blank" rel="noopener">Amazon S3 Storage Task </a> )</li>
<li><strong>Method-2 (Slower)</strong>: Use <a href="//zappysys.com/products/ssis-powerpack/ssis-export-csv-file-task/" target="_blank" rel="noopener">Export Task</a> with an Amazon S3 Connection as the target rather than saving to local files.</li>
<li><strong>Method-3 (Slower)</strong>: Use Data flow components like <a href="https://zappysys.com/products/ssis-powerpack/ssis-amazon-s3-csv-file-destination/" target="_blank" rel="noopener">Amazon S3 Destination for CSV</a>  (for JSON / XML  use Method1 or Method2)</li>
</ol>
<p>Each method has its own advantages and disadvantages. If you need to upload, compress, or split a large amount of data, we recommend Method #1 (two steps). If your dataset is not very large, you can use Method #2 or Method #3. With the last method you can only use the CSV export option (we don&#8217;t have a JSON/XML Destination for Amazon S3 yet &#8211; we may add one in the future).</p>
<p><strong>Screenshot of SSIS Package</strong></p>
<div id="attachment_707" style="width: 710px" class="wp-caption alignnone"><a href="//zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-server-data-upload-to-amazon-s3.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-707" class="wp-image-707" src="//zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-server-data-upload-to-amazon-s3.png" alt="Extract SQL Server Data to CSV files in SSIS (Bulk export) and Split / GZip Compress / upload files to Amazon S3 (AWS Cloud)" width="700" height="365" srcset="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-server-data-upload-to-amazon-s3.png 825w, https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-server-data-upload-to-amazon-s3-300x156.png 300w" sizes="(max-width: 700px) 100vw, 700px" /></a><p id="caption-attachment-707" class="wp-caption-text">Extract SQL Server Data to CSV files in SSIS (Bulk export) and Split / GZip Compress / upload files to Amazon S3 (AWS Cloud)</p></div>
<h2>Method-1 : Upload SQL data to Amazon S3 in Two steps</h2>
<p>In this section we will look at the first (recommended) method to upload SQL data to Amazon S3. This is the fastest approach if you have lots of data to upload. In this approach we first create CSV files from SQL Server data on the local disk using the <a href="https://zappysys.com/products/ssis-powerpack/ssis-export-csv-file-task/" target="_blank" rel="noopener">SSIS Export CSV Task</a>. Then, in a second step, we upload all the files to Amazon S3 using the <a href="https://zappysys.com/products/ssis-powerpack/ssis-amazon-s3-task/" target="_blank" rel="noopener">SSIS Amazon Storage Task</a>.</p>
<h3>Step-1: Configure Source Connection in Export CSV Task</h3>
<p>To extract data from SQL Server you can use the Export CSV Task. It has many options that make it possible to split a large amount of data into multiple files. You can specify a single table or multiple tables as your data source.</p>
<p>For multiple tables, use a vertical bar, e.g. dbo.Customers|dbo.Products|dbo.Orders. When you export this, it will create 3 files (dbo.Customers.csv, dbo.Products.csv, dbo.Orders.csv)</p>
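<p>The vertical-bar convention maps each table name to one output file. A tiny sketch (not the task's actual code) shows the file names you should expect:</p>
<pre class="crayon-plain-tag">```python
def output_files(table_list):
    """Map a pipe-delimited table list to the CSV files the export produces."""
    return [f"{t.strip()}.csv" for t in table_list.split("|") if t.strip()]
```</pre>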
<p><strong>Steps:</strong></p>
<ol>
<li>Drag ZS Export CSV Task from Toolbox</li>
<li>Double click task to configure</li>
<li>From the connection dropdown, select the New Connection option (OLEDB or ADO.net)</li>
<li>Once the connection to the source database is configured, specify the SQL query to extract data, as below
<div id="attachment_705" style="width: 528px" class="wp-caption alignnone"><a href="//zappysys.com/blog/wp-content/uploads/2016/07/export-sql-server-table-query-data-to-csv-fast.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-705" class="size-full wp-image-705" src="//zappysys.com/blog/wp-content/uploads/2016/07/export-sql-server-table-query-data-to-csv-fast.png" alt="Export SQL Server Table or Query as CSV file (Bulk export in SSIS)" width="518" height="494" srcset="https://zappysys.com/blog/wp-content/uploads/2016/07/export-sql-server-table-query-data-to-csv-fast.png 518w, https://zappysys.com/blog/wp-content/uploads/2016/07/export-sql-server-table-query-data-to-csv-fast-300x286.png 300w" sizes="(max-width: 518px) 100vw, 518px" /></a><p id="caption-attachment-705" class="wp-caption-text">Export SQL Server Table or Query as CSV file (Bulk export in SSIS)</p></div></li>
<li>Now go to the Target tab. Here you can specify the full path for the file, e.g. c:\ssis\temp\s3dump\cust.csv</li>
</ol>
<h3>Step-2: Compress CSV Files in SSIS ( GZIP format &#8211; *.gz )</h3>
<p>The above steps export the file in CSV format without splitting or compression. To compress the file once exported, go to the Target tab of the Export CSV Task and check the [<strong>Compress file to *.gz format</strong>] option.</p>
<div id="attachment_706" style="width: 579px" class="wp-caption alignnone"><a href="//zappysys.com/blog/wp-content/uploads/2016/07/compress-csv-files-in-ssis.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-706" class="size-full wp-image-706" src="//zappysys.com/blog/wp-content/uploads/2016/07/compress-csv-files-in-ssis.png" alt="Compress exported SQL Server data files to GZip ( *.gz) in SSIS Export CSV Task" width="569" height="462" srcset="https://zappysys.com/blog/wp-content/uploads/2016/07/compress-csv-files-in-ssis.png 569w, https://zappysys.com/blog/wp-content/uploads/2016/07/compress-csv-files-in-ssis-300x244.png 300w" sizes="(max-width: 569px) 100vw, 569px" /></a><p id="caption-attachment-706" class="wp-caption-text">Compress exported SQL Server data files to GZip ( *.gz) in SSIS Export CSV Task</p></div>
<h3>Step-3: Split CSV files by row count or data size in SSIS</h3>
<p>Now let&#8217;s look at how to split the exported CSV files into multiple files so we can upload many files in parallel. Go to Split Options and check [<strong>Enable Split by Size/Rows</strong>]
<div id="attachment_708" style="width: 435px" class="wp-caption alignnone"><a href="//zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-split-csv-files-sql-data.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-708" class="size-full wp-image-708" src="//zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-split-csv-files-sql-data.png" alt="Using SSIS Split Exported CSV files (Split by row count or size)" width="425" height="489" srcset="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-split-csv-files-sql-data.png 425w, https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-split-csv-files-sql-data-261x300.png 261w" sizes="(max-width: 425px) 100vw, 425px" /></a><p id="caption-attachment-708" class="wp-caption-text">Using SSIS Split Exported CSV files (Split by row count or size)</p></div>
<h3>Step-4: Upload CSV files to Amazon S3 &#8211; Using the multi-threaded option</h3>
<p>The final step is to use the <a href="//zappysys.com/products/ssis-powerpack/ssis-amazon-s3-task/" target="_blank" rel="noopener">Amazon S3 Task</a> to upload the files to S3.</p>
<h3>Things to remember</h3>
<p>Sometimes, due to high network activity, you may get timeout errors during upload. In that case you can adjust a few settings <a href="https://zappysys.com/forums/topic/change-timeout-value-amazon-s3-operations/" target="_blank" rel="noopener">described here</a>. Also try reducing the total parallel threads on the S3 connection to see if that helps.</p>
<p><strong>Steps:</strong></p>
<ol>
<li>Drag ZS Amazon Storage Task from SSIS toolbox</li>
<li>Double click Amazon Storage Task to configure it</li>
<li>Specify Action = UploadFilesToAmazon</li>
<li>Specify Source file path (or pattern) e.g. c:\SSIS\temp\s3dump\*.*</li>
<li>Now in the Target connection dropdown click [New]</li>
<li>When the connection UI opens, select Service Type = S3</li>
<li>Enter your Access Key, Secret Key and Region (leave all other parameters at their defaults if you are not sure)</li>
<li>Click Test and close connection UI</li>
<li>In the Target path of the S3 Storage Task, enter the bucket and folder path where you want to upload the local files. For example, if your bucket name is bw-east-1 and the folder is sqldata, enter it as below<br />
<strong>bw-east-1/sqldata/</strong></li>
<li>Click OK and run the package to test the full package</li>
</ol>
<div id="attachment_709" style="width: 709px" class="wp-caption alignnone"><a href="//zappysys.com/blog/wp-content/uploads/2016/07/ssis-uploading-files-to-amazon-s3.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-709" class="wp-image-709" src="//zappysys.com/blog/wp-content/uploads/2016/07/ssis-uploading-files-to-amazon-s3.png" alt="Upload local files to Amazon S3 using SSIS AWS Storage Task" width="699" height="466" srcset="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-uploading-files-to-amazon-s3.png 953w, https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-uploading-files-to-amazon-s3-300x200.png 300w, https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-uploading-files-to-amazon-s3-272x182.png 272w" sizes="(max-width: 699px) 100vw, 699px" /></a><p id="caption-attachment-709" class="wp-caption-text">Upload local files to Amazon S3 using SSIS AWS Storage Task</p></div>
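<p>The &#8220;multi-threaded&#8221; upload can be pictured as a thread pool draining the list of files matched by the source pattern. In the sketch below, <code>upload_one</code> is a hypothetical placeholder for whatever actually sends a file (a real package calls the S3 API here); the sketch only shows the parallel fan-out:</p>
<pre class="crayon-plain-tag">```python
import glob
from concurrent.futures import ThreadPoolExecutor

def upload_all(pattern, upload_one, max_threads=8):
    """Upload every file matching `pattern` (e.g. c:\\SSIS\\temp\\s3dump\\*.*)
    using up to `max_threads` parallel workers; results come back in input order."""
    files = sorted(glob.glob(pattern))
    with ThreadPoolExecutor(max_workers=max_threads) as pool:
        return list(pool.map(upload_one, files))
```</pre>
<p>This is why splitting the export into many files in Step-3 pays off: each worker gets its own file, so total throughput scales with the thread count until the network saturates.</p>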
<h2>Method-2 : Upload SQL data to Amazon S3 without local stage (One step)</h2>
<p>Now let&#8217;s change the previous approach a little bit to send SQL Server data directly to Amazon S3 without any landing area on the local disk.  The <a href="//zappysys.com/products/ssis-powerpack/ssis-export-csv-file-task/" target="_blank" rel="noopener">Export CSV Task</a>, <a href="//zappysys.com/products/ssis-powerpack/ssis-export-json-file-task/" target="_blank" rel="noopener">Export JSON Task</a> and <a href="//zappysys.com/products/ssis-powerpack/ssis-export-xml-file-task/" target="_blank" rel="noopener">Export XML Task</a> all support Amazon S3 / Azure Blob and Secure FTP (SFTP) connections as the target (only available in the <strong>Pro Edition</strong>). We will use this feature in the following section.</p>
<p>This approach avoids the need for a local disk, which may be useful for security reasons for some users. However, the drawback of this approach is that it won&#8217;t use parallel threads to upload a large amount of data like the previous method does.</p>
<p>The following change is needed on the Export task to upload SQL data directly to S3, FTP, or Azure storage.</p>
<div id="attachment_5252" style="width: 859px" class="wp-caption alignnone"><a href="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-data-to-s3-csv-compress-gzip.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-5252" class="size-full wp-image-5252" src="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-data-to-s3-csv-compress-gzip.png" alt="Export SQL data to multiple files to Amazon S3, Azure, Secure FTP (SFTP) in Stream Mode. Compress GZip, Overwrite, Split Options" width="849" height="627" srcset="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-data-to-s3-csv-compress-gzip.png 849w, https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-data-to-s3-csv-compress-gzip-300x222.png 300w, https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-export-sql-data-to-s3-csv-compress-gzip-768x567.png 768w" sizes="(max-width: 849px) 100vw, 849px" /></a><p id="caption-attachment-5252" class="wp-caption-text">Export SQL data to multiple files to Amazon S3, Azure, Secure FTP (SFTP) in Stream Mode using SSIS. Configure Compress GZip, Overwrite, Split Options</p></div>
<h2>Method-3 : Using Amazon S3 destination &#8211; Generate Amazon S3 file from any source</h2>
<p>Now let&#8217;s look at a third approach: saving data from any SSIS source to an Amazon S3 file. The advantage of this approach is that you are not limited to the few source options provided by the Export CSV Task. If you need complex data transformations in the Data Flow before sending data to S3, use this approach.  We will use the <a href="https://zappysys.com/products/ssis-powerpack/ssis-amazon-s3-csv-file-destination/" target="_blank" rel="noopener">Amazon S3 Destination for CSV</a> as below</p>
<ol>
<li>Drag an SSIS Data Flow Task from the toolbox</li>
<li>Create the necessary source connection (e.g. an OLEDB connection)</li>
<li>Create an Amazon S3 Connection (right-click in the Connection Managers panel at the bottom, click New Connection, and select the <strong>ZS-AMAZON-STORAGE</strong> type)</li>
<li>Once the connection managers are created, go to the data flow designer and drag an OLEDB Source</li>
<li>Configure the OLEDB Source to read the desired data from the source system (e.g. SQL Server / Oracle)</li>
<li>Once the source is configured, drag the <a href="https://zappysys.com/products/ssis-powerpack/ssis-amazon-s3-csv-file-destination/" target="_blank" rel="noopener">ZS Amazon S3 CSV File Destination</a> from the SSIS toolbox</li>
<li>Double-click the S3 Destination and configure it as below
<ol>
<li>On the Connection Managers tab, select the S3 connection (created in the earlier section).</li>
<li>On the Properties tab, configure as in the screenshot below</li>
<li>On the Input Columns tab, select the columns you want written to the target file. Upstream column names are used as-is for the target file, so make sure to name your upstream columns correctly.</li>
<li>Click OK to save the UI</li>
</ol>
</li>
<li>Execute the package and check your S3 bucket to verify the files were created.</li>
</ol>
<div id="attachment_5253" style="width: 729px" class="wp-caption alignnone"><a href="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-amazon-s3-csv-destination-split-compress-gzip-options.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-5253" class="size-full wp-image-5253" src="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-amazon-s3-csv-destination-split-compress-gzip-options.png" alt="Loading SQL Server data into S3 Bucket Files (Split, Compress Gzip Options) - SSIS Amazon S3 CSV File Destination" width="719" height="782" srcset="https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-amazon-s3-csv-destination-split-compress-gzip-options.png 719w, https://zappysys.com/blog/wp-content/uploads/2016/07/ssis-amazon-s3-csv-destination-split-compress-gzip-options-276x300.png 276w" sizes="(max-width: 719px) 100vw, 719px" /></a><p id="caption-attachment-5253" class="wp-caption-text">Loading SQL Server data into S3 Bucket Files (Split, Compress Gzip Options) &#8211; SSIS Amazon S3 CSV File Destination</p></div>
<h2>Conclusion</h2>
<p>In this post you have seen how easy it is to upload / archive your SQL Server data (or any other RDBMS data) to Amazon S3 Storage in a few clicks. <a href="//zappysys.com/products/ssis-powerpack/">Try SSIS PowerPack</a> for free and find out for yourself how easy it is to integrate SQL Server and Amazon S3 using SSIS.</p>
<p>The post <a href="https://zappysys.com/blog/load-data-sql-server-to-amazon-s3/">Loading data from SQL Server to Amazon S3 in SSIS (Split Files, GZip)</a> appeared first on <a href="https://zappysys.com/blog">ZappySys Blog</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Amazon Redshift data load in Informatica PowerCenter</title>
		<link>https://zappysys.com/blog/amazon-redshift-data-load-in-informatica-powercenter/</link>
		
		<dc:creator><![CDATA[ZappySys]]></dc:creator>
		<pubDate>Wed, 17 Feb 2016 23:42:41 +0000</pubDate>
				<category><![CDATA[Redshift]]></category>
		<category><![CDATA[Amazon Redshift]]></category>
		<category><![CDATA[Amazon S3]]></category>
		<category><![CDATA[aws]]></category>
		<category><![CDATA[Informatica]]></category>
		<category><![CDATA[PowerCenter]]></category>
		<category><![CDATA[redshift]]></category>
		<category><![CDATA[ZappyShell]]></category>
		<guid isPermaLink="false">http://zappysys.com/blog/?p=360</guid>

					<description><![CDATA[<p>Introduction In our previous post you learned how to load data into Redshift using SSIS. Now in this post you will learn how to load data into Redshift using Informatica PowerCenter. For PowerCenter we will use ZappyShell Command line for Redshift Data Load. This small powerful command line utility can handle load of several millions [&#8230;]</p>
<p>The post <a href="https://zappysys.com/blog/amazon-redshift-data-load-in-informatica-powercenter/">Amazon Redshift data load in Informatica PowerCenter</a> appeared first on <a href="https://zappysys.com/blog">ZappySys Blog</a>.</p>
]]></description>
										<content:encoded><![CDATA[<h2>Introduction</h2>
<p>In our <a href="https://zappysys.com/posts/sql-server-to-redshift-data-load-using-ssis/#Command_line_approach_for_SQL_Server_to_Redshift_data_load" target="_blank">previous post</a> you learned how to load data into Redshift using SSIS. In this post you will learn how to load data into Redshift using <strong>Informatica PowerCenter</strong>. For PowerCenter we will use the <a href="//zappysys.com/products/zappyshell/amazon-redshift-command-line-tools/" target="_blank">ZappyShell Command line for Redshift Data Load</a>. This small but powerful command-line utility can handle loads of several million or even billions of records in a few minutes. It uses a parallel execution engine that takes care of the most tedious steps of loading data into Amazon Redshift.</p>
<h2>Using the Informatica Command task to load data into Redshift</h2>
<p>Informatica PowerCenter has a simple task for executing command lines. You can use this task to run the ZappyShell command line. Perform the following steps to load data.</p>
<ol>
<li>Download and install <a href="//zappysys.com/products/zappyshell/amazon-redshift-command-line-tools/" target="_blank">ZappyShell for Amazon Redshift from here</a></li>
<li>Now you are ready to load data into Amazon Redshift</li>
<li>Drag a new Command task into your Informatica workflow</li>
<li>Enter the command below to load data from SQL Server to Redshift. If you have ODBC connectivity, use an ODBC DSN to read your data.</li>
<li>You can also use the script-file approach, where each command-line parameter is neatly wrapped on its own line.<br />
Here is a sample command to load data from SQL Server to Redshift:<br />
<pre class="crayon-plain-tag">c:\zappyshell\aws.exe import Db 
	--source-driver ADONET_MSSQL 
	--source-query "select ROW_NUMBER()Over(order by a.CustomerID) Id, a.*,b.*,c.OrderID,c.OrderDate,c.Freight  from customers a,products b,orders c" 
	--source-archivemethod None 
	--source-stage-archivemethod Delete 
	--target-stage-archivemethod Delete 
	--source-stagepath "c:\redshift\stage" 
	--target-table "customerdata" 
	--target-stagepath "bw-rstest/cmdstage" 
	--target-truncate 
	--logfile "c:\redshift\log.txt" 
	--maxrows-perfile 100000 
	--region us-east-1 
	--accesskey "AKIA*****************" 
	--secretkey "lPi+XQ************************"  
	--source-connstr "Data Source=localhost;Initial Catalog=Northwind;Integrated Security=SSPI;"  
	--target-connstr "Host=mytestcluster-1.csu********.us-east-1.redshift.amazonaws.com;Port=5439;Database=dev;UserName=masteruser;Password=*********;EnableSsl=true;Timeout=30;CommandTimeout=3600;"</pre>
</li>
<li>For complete help on the import command, check the help file: //zappysys.com/onlinehelp/zappyshell/scr/aws/aws-redshift-import.htm</li>
</ol>
<div id="attachment_405" style="width: 746px" class="wp-caption alignnone"><a href="//zappysys.com/blog/wp-content/uploads/2016/02/amazon-redshift-import-command-line-tools-copy-s3-aws.png"><img loading="lazy" decoding="async" aria-describedby="caption-attachment-405" class="size-full wp-image-405" src="//zappysys.com/blog/wp-content/uploads/2016/02/amazon-redshift-import-command-line-tools-copy-s3-aws.png" alt="Informatica PowerCenter Redshift Data Load - Amazon Redshift Import Command line tools (COPY, S3, AWS)" width="736" height="571" srcset="https://zappysys.com/blog/wp-content/uploads/2016/02/amazon-redshift-import-command-line-tools-copy-s3-aws.png 736w, https://zappysys.com/blog/wp-content/uploads/2016/02/amazon-redshift-import-command-line-tools-copy-s3-aws-300x233.png 300w" sizes="(max-width: 736px) 100vw, 736px" /></a><p id="caption-attachment-405" class="wp-caption-text">Informatica PowerCenter Redshift Data Load &#8211; Amazon Redshift Import Command line tools (COPY, S3, AWS)</p></div>
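<p>Long commands like the sample above are easier to maintain when assembled programmatically, one flag per line, and then written to a script file or passed to a process launcher. The sketch below is a hypothetical helper (not part of ZappyShell); only the flag names are taken from the sample command in this post:</p>
<pre class="crayon-plain-tag">```python
def build_import_command(options):
    """Assemble the aws.exe import command as an argument list, one flag
    per entry, suitable for a script file or a subprocess call. Boolean
    True values emit a bare flag (e.g. --target-truncate takes no value)."""
    args = [r"c:\zappyshell\aws.exe", "import", "Db"]
    for flag, value in options.items():
        args.append(f"--{flag}")
        if value is not True:
            args.append(str(value))
    return args
```</pre>
<p>Keeping each parameter as a separate list entry also avoids quoting problems when connection strings contain spaces or semicolons.</p>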
<h2>ZappyShell Command Line for Redshift</h2>
<p>Below are a few features of the command line for Amazon Redshift:</p>
<ul>
<li>Import data to an AWS Redshift database from files or relational sources (e.g. MySQL, Oracle, SQL Server)</li>
<li>Import huge amounts of data (millions of rows) in a few minutes with parallel load techniques</li>
<li>Load local flat files to Redshift with a single-line command (with an option to compress data files to *.gz to speed up transfer)</li>
<li>Support for client-side encryption using an AES-256 key</li>
<li>Load data from any data source (ODBC, ADO.net, or OLEDB) using a SQL query</li>
<li>Import compressed data files (*.gz) to Redshift</li>
<li>Archive files, error reporting, file splitting, and many other features</li>
</ul>
<p><a href="https://zappysys.com/blog/amazon-redshift-data-load-in-informatica-powercenter/"><img decoding="async" src="https://zappysys.com/blog/wp-content/plugins/wp-youtube-lyte/lyteCache.php?origThumbUrl=%2F%2Fi.ytimg.com%2Fvi%2FOz5Rfztzo0U%2Fhqdefault.jpg" alt="YouTube Video"></a></p>
<p>The post <a href="https://zappysys.com/blog/amazon-redshift-data-load-in-informatica-powercenter/">Amazon Redshift data load in Informatica PowerCenter</a> appeared first on <a href="https://zappysys.com/blog">ZappySys Blog</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
