How do you export a large SQL Server table into a local CSV file without reading the entire table into memory and without external libraries? It is not hard using a combination of DbCommand.ExecuteReader and StreamWriter.
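A minimal sketch of that streaming approach follows. The connection string, table name, and CSV quoting rules here are my assumptions for illustration, not the original code:

```csharp
using System;
using System.Data.SqlClient;
using System.IO;
using System.Text;

static class CsvExporter
{
    // Streams rows one at a time; only a single row is held in memory.
    public static void ExportToCsv(string connectionString, string tableName, string csvPath)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = connection.CreateCommand())
        {
            command.CommandText = $"SELECT * FROM [{tableName}]";
            connection.Open();

            using (var reader = command.ExecuteReader())
            using (var writer = new StreamWriter(csvPath, append: false, Encoding.UTF8))
            {
                // Header row taken from the result-set schema.
                var fields = new string[reader.FieldCount];
                for (int i = 0; i < reader.FieldCount; i++)
                    fields[i] = Quote(reader.GetName(i));
                writer.WriteLine(string.Join(",", fields));

                // Data rows, written directly as the reader advances.
                while (reader.Read())
                {
                    for (int i = 0; i < reader.FieldCount; i++)
                        fields[i] = reader.IsDBNull(i)
                            ? ""
                            : Quote(Convert.ToString(reader.GetValue(i)));
                    writer.WriteLine(string.Join(",", fields));
                }
            }
        }
    }

    // Naive CSV quoting: wrap every value and double any embedded quotes.
    private static string Quote(string value) =>
        "\"" + value.Replace("\"", "\"\"") + "\"";
}
```

Because ExecuteReader returns a forward-only stream and StreamWriter flushes to disk incrementally, memory use stays flat regardless of the table size.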
For a reliable export from Azure SQL, I've also implemented ReliableSqlDbConnection and ReliableSqlDbCommand, which use a Polly retry policy to handle transient exceptions:
public SqlDbRetryPolicy()
{
    _sqlRetryPolicy = Policy
        .Handle<TimeoutException>()
        .Or<SqlException>(AnyRetryableError)
        .WaitAndRetry(RETRY_COUNT, ExponentialBackoff, (exception, sleepDuration) =>
        {
            // Capture some info for logging/telemetry.
            Console.WriteLine($"Execute: Retrying in {sleepDuration} due to {exception}.");
        });
}
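The AnyRetryableError predicate and ExponentialBackoff sleep-duration provider referenced above are not shown in the post; a possible sketch is below. The specific error numbers are my assumption, drawn from the error codes Microsoft documents as transient for Azure SQL:

```csharp
using System;
using System.Data.SqlClient;
using System.Linq;

// Hypothetical helpers matching the signatures the retry policy expects.
static class RetryHelpers
{
    // A subset of SQL error numbers commonly treated as transient on Azure SQL.
    private static readonly int[] TransientErrorNumbers =
        { 4060, 10928, 10929, 40197, 40501, 40613, 49918, 49919, 49920 };

    // Predicate for .Or<SqlException>(...): retry only on known transient errors.
    public static bool AnyRetryableError(SqlException ex) =>
        ex.Errors.Cast<SqlError>().Any(e => TransientErrorNumbers.Contains(e.Number));

    // Sleep-duration provider: waits 2, 4, 8, ... seconds between attempts.
    public static TimeSpan ExponentialBackoff(int attempt) =>
        TimeSpan.FromSeconds(Math.Pow(2, attempt));
}
```

Exponential backoff matters here: Azure SQL throttling errors typically clear within seconds, and spacing out retries avoids hammering a server that is already under load.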
After a successful export, the CSV file is compressed in its original directory using GZipStream.
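The compression step can be sketched like this (the helper name and path handling are placeholders of mine):

```csharp
using System.IO;
using System.IO.Compression;

static class CsvCompressor
{
    // Writes <csvPath>.gz next to the source file, streaming the bytes
    // through GZipStream so the CSV is never loaded whole into memory.
    public static string CompressToGzip(string csvPath)
    {
        string gzPath = csvPath + ".gz";
        using (var source = File.OpenRead(csvPath))
        using (var target = File.Create(gzPath))
        using (var gzip = new GZipStream(target, CompressionMode.Compress))
        {
            source.CopyTo(gzip);
        }
        return gzPath;
    }
}
```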
- Visual Studio 2017 15.9 or greater
- Sql DbCommand.ExecuteReader
- Transient-fault-handling with Polly
- Compress and decompress streams with GZipStream
Enjoy!