SQL Azure Import/Export Service has hit Production

Many folks have been asking when the service will go into production, and the answer is: now! A new version of the service has been deployed. The production release of the service comes with:

Improved Performance

The service has implemented a new connection pooling and parallelization strategy to deliver significantly improved performance for all types of databases. While actual results may vary, the average import or export should now be approximately three times faster!
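The announcement doesn't describe the parallelization internals, but the general idea can be sketched as exporting tables concurrently rather than strictly one at a time. The snippet below is a minimal, hypothetical illustration in Python; the service itself is not implemented this way, and `export_table` stands in for the real per-table work.

```python
from concurrent.futures import ThreadPoolExecutor

def export_table(table):
    # Placeholder for the per-table work: in a real exporter this would
    # read rows over a pooled connection and write them to blob storage.
    return (table, "exported")

def export_database(tables, workers=4):
    # Export tables concurrently instead of sequentially; with I/O-bound
    # work this is where most of the wall-clock speedup comes from.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(export_table, tables))

results = export_database(["Orders", "Customers", "Products"])
```

With I/O-bound work like this, the worker count trades concurrency against load on the source database, which is why a capped pool (rather than one thread per table) is the usual design choice.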


Improved Resiliency

Several connectivity issues, both transient and permanent, have been identified and addressed to provide a more reliable experience.


New Feature: Selective Export

Customers who only want to export certain tables, whether for performance reasons or because the data doesn't change often, can provide a list of tables to export. The resulting BACPAC will contain the full schema definition, plus table data only for the specified tables. A selectively exported BACPAC can be imported just like a fully exported one. For now, selective export requests must be submitted through the sample EXE; customers calling the service's REST endpoints directly can, as always, bypass the EXE.
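To illustrate, a selective export request body might be assembled along these lines. All element names here are hypothetical, chosen only for illustration; the real payload format is defined by the service's REST contract.

```python
import xml.etree.ElementTree as ET

def build_selective_export_request(database, tables):
    # Hypothetical request payload: element names are illustrative only,
    # not the service's actual REST contract.
    root = ET.Element("ExportInput")
    ET.SubElement(root, "DatabaseName").text = database
    table_list = ET.SubElement(root, "Tables")
    for name in tables:
        # Each requested table is listed by name; tables not listed here
        # would have schema-only representation in the resulting BACPAC.
        ET.SubElement(table_list, "Table").text = name
    return ET.tostring(root, encoding="unicode")

payload = build_selective_export_request(
    "AdventureWorks", ["dbo.Orders", "dbo.Customers"])
```

The point of the sketch is the shape of the request: the full database is named once, and only the tables whose data should be exported appear in the list.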


New Feature: Progress Reporting

Progress for a request is now reported as a percentage, providing better feedback on the state of the request.
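As an illustration, a client polling the service might extract that percentage from a status response along these lines. The status XML and its element names below are hypothetical, not the service's actual wire format.

```python
import xml.etree.ElementTree as ET

# Hypothetical status response; the real service defines its own format.
SAMPLE_STATUS = """\
<StatusInfo>
  <RequestId>3f2504e0-4f89-11d3-9a0c-0305e82c3301</RequestId>
  <Status>Running, 45%</Status>
</StatusInfo>
"""

def parse_progress(status_xml):
    # Pull the percentage out of a status string such as "Running, 45%".
    status = ET.fromstring(status_xml).findtext("Status", default="")
    if "%" in status:
        return int(status.split(",")[-1].strip().rstrip("%"))
    # No percentage present (e.g. "Completed" or "Failed").
    return None

parse_progress(SAMPLE_STATUS)  # -> 45
```

A polling loop would call this on each status response and surface the number to the user, rather than showing only a coarse state name.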


Production Support

The service is no longer provided as a CTP; it is a fully supported production service. Support questions should now be posted to the main SQL Azure section rather than the labs section.


New sample EXE

The new EXE provides a reference implementation of the selective export feature. Older versions of the EXE will continue to work without disruption. As always, the sample EXE and its sources are available at the DAC examples CodePlex site.


The basics of using the service are covered in two previously published videos: one showing how to export and one covering how to import. A new video then covers the new features now available as part of the production release:


We hope you find that this release provides a vastly improved experience, and we look forward to your feedback!



Responses to SQL Azure Import/Export Service has hit Production

  1. [...] This service is provided free of charge to customers using SQL Azure and Windows Azure Storage. For more information and video tutorials visit the DAC blog. [...]


  6. arunsun says:

    Just gave this a try and it worked like a charm. One piece of feedback: checking the status is just too painful currently. If I click the status button, I expect to see the tasks and their respective statuses, not to have to enter my username and password all over again.

    Great video, though. Will keep you posted on how the service works for me.
    Azure rocks!!

  7. Great article. I was having a lot of trouble copying databases between my Azure servers, and this just solved all my problems. Many thanks.

  8. I am using this to the max! However, I have hit an issue: no matter what I do, I can't get the return status for a single backup using -R (-RequestID). That was fine, as for now I am parsing the full output to pick the status out. However, now I'm in trouble: "The maximum message size quota for incoming messages has been exceeded. To increase the quota, use the MaxReceivedMessageSize property on the appropriate binding element." This is on the North American datacentre (Chicago); Asia and EMEA are working fine. Any idea when either will be fixed? Ideally, making the -R parameter work would be best, as without it I have to keep retrieving the whole backup history and parsing that.

     Cool tool, it just needs that last little bit to make it ready for the big time. I'm currently backing up 20+ dbs a night and checking that they all went through OK (via SSIS). Cheers. James

  9. Jasbir Singh says:

    Many thanks. I was having trouble entering the proper format for the URL.
