SqlImportSpeedTest

Some SQL Server professionals are under the impression that PowerShell is slow. This module was created to demonstrate PowerShell's performance capabilities.

This module has imported over 250,000 rows per second with a ten-column customer CSV dataset and 636,000 rows per second with a two-column CSV dataset. This performance is on par with, and often exceeds, bcp.exe, a command-line utility known for its fast import speeds.

See Get-Help Test-SqlImportSpeed -Detailed for more information.

The included results.jpg shows a sample run. Results vary, but this run was about average. The highest rate I ever hit was 277,145 rows/sec (16,628,671 rows/min) with the non-memory-optimized Customers dataset, though that was a one-off; most runs average around 230,000 rows/sec.
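For context, these speeds are typically achieved by streaming rows through the .NET SqlBulkCopy class rather than inserting row by row. The module's actual internals aren't shown here; the following is a minimal hedged sketch of that approach, where the server name, table name, and CSV path are placeholder assumptions:

```powershell
# Sketch only: bulk load a CSV with SqlBulkCopy (assumed server/table/path).
$connString = "Server=sqlserver2014a;Database=tempdb;Integrated Security=true"
$bulkCopy = New-Object Data.SqlClient.SqlBulkCopy(
    $connString, [Data.SqlClient.SqlBulkCopyOptions]::TableLock)
$bulkCopy.DestinationTableName = "dbo.speedtest"
$bulkCopy.BatchSize = 50000      # rows sent to the server per batch
$bulkCopy.BulkCopyTimeout = 0    # no timeout for large imports

# Build a DataTable from the CSV. For very large files you would stream
# an IDataReader into WriteToServer instead of materializing every row.
$datatable = New-Object Data.DataTable
$csv = Import-Csv -Path "$home\Documents\customers.csv"
foreach ($prop in $csv[0].PSObject.Properties) {
    [void]$datatable.Columns.Add($prop.Name)
}
foreach ($row in $csv) {
    [void]$datatable.Rows.Add($row.PSObject.Properties.Value)
}

$bulkCopy.WriteToServer($datatable)
$bulkCopy.Close()
```

The TableLock option and a large batch size are the usual levers for bulk-load throughput; minimally logged inserts with a table lock are what let PowerShell keep pace with bcp.exe.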

Datasets

This module supports four datasets. Support for custom datasets is planned.

  1. Default: CSV: 143 MB. Table: 222 MB. Million-row, 10-column customer data from the Chinook sample database, with data generated by RedGate Data Generator.
  2. Geonames: CSV: 122 MB. Table: 175 MB. Million-row, 19-column longitude and latitude data from geonames.org. Includes a varchar(max) column, which slows down the import.
  3. SuperSmall: CSV: 13.5 MB. Table: 27 MB. Million-row, 2-column (an int and a varchar(10)). This one can import over 25 million rows a minute.
  4. VeryLarge: CSV: 3.6 GB. Table: 5566 MB. 25-million-row, 10-column customer data from the Chinook sample database, with data generated by RedGate Data Generator.

The VeryLarge dataset requires 6 GB of free disk space on the SQL Server and 4 GB on the client performing the test. A one-time download of a 1.5 GB file is required.

Examples

Test-SqlImportSpeed -SqlServer sqlserver2014a

Imports a million row dataset filled with longitude and latitude data. Once it's downloaded and extracted, you can find it in Documents\longlats.csv

Test-SqlImportSpeed -SqlServer sqlserver2014a -Dataset Customers

Just another million row dataset, but this one contains a classic customer table. Once it's downloaded and extracted, you can find it in Documents\customers.csv

$cred = Get-Credential
Test-SqlImportSpeed -SqlServer sqlserver2014a -SqlCredential $cred -MinRunspaces 5 -MaxRunspaces 10 -BatchSize 50000

Logs in using SQL authentication, sets MinRunspaces to 5 and MaxRunspaces to 10, and sets the batch size to 50,000 rows.
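The MinRunspaces/MaxRunspaces parameters suggest the module parallelizes the import across a PowerShell runspace pool. As a hedged illustration of that pattern (not the module's actual code; $csvChunks and the worker body are placeholder assumptions):

```powershell
# Sketch only: run import work in parallel across a runspace pool.
# $csvChunks is assumed to be a collection of pre-split row batches.
$pool = [RunspaceFactory]::CreateRunspacePool(5, 10)  # min 5, max 10 runspaces
$pool.Open()

$jobs = foreach ($chunk in $csvChunks) {
    $ps = [PowerShell]::Create()
    $ps.RunspacePool = $pool
    # Each worker would bulk-copy its chunk to the server.
    [void]$ps.AddScript({ param($rows) <# SqlBulkCopy $rows here #> }).AddArgument($chunk)
    @{ PowerShell = $ps; Handle = $ps.BeginInvoke() }
}

foreach ($job in $jobs) {
    $job.PowerShell.EndInvoke($job.Handle)  # wait for the worker to finish
    $job.PowerShell.Dispose()
}
$pool.Close()
```

Runspaces are much lighter than background jobs (no extra powershell.exe processes), which is how a pool of 5-10 workers can saturate the server's bulk-load path.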

$cred = Get-Credential
Test-SqlImportSpeed -SqlServer sqlserver2014a -Dataset supersmall

Imports a small, two-column (int, varchar(10)) dataset.