Notes on SQL

Random articles from a puzzled DBA

Introduction

I needed a process that could copy the contents of a specific list of tables from one database to another. PowerShell seemed to be the best fit for my purposes. As with any tool that you don't use regularly, more than half of the battle is finding the correct commands and methodology.

Background

A lightweight process is required to copy data from several tables in a database on one instance to a database on another. At this time, the names of the tables are unknown and, being early in the project, it is always possible that tables could be added to the requirement. Therefore the process needs to cater for an unspecified number of tables.

I found several examples of processes to achieve this, but most of them appeared to be overly complex. Some of these were from articles that were quite old, so I hoped that newer, more suitable commands and libraries had since been developed.

What was apparent is that there are quite a few people trying to solve the same task, so I'm posting my basic solution here on the chance that it might be of help to somebody. It also means there's less chance that I'll lose the code should I require it again in the future.

Detail

I'm not going to describe the days of scouring the Internet with a variety of search engines, looking for something that I understood, that was suitable, and that was as simple as possible. You wouldn't want to follow the meanderings of my mind.

The requirements for this task were fairly simple:

  1. Parameters to specify the source and destination servers and databases.
  2. Read the list of required tables from a table within the source database.
  3. Copy the contents of the tables specified in the previous step to the destination database.
  4. Log the actions taken by the process.

Required Modules

One of the delights of PowerShell is the large number of modules out there – there's a very good chance that somebody has already provided a cmdlet that does the type of thing you're after. That can also be a problem with PowerShell, in that you have to find them first.

The first module to look in is 'SqlServer'. As this is a task against SQL Server, it seemed a logical starting point.
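Both of the modules used here (SqlServer and dbatools) are published on the PowerShell Gallery, so if they're not already present they can be installed with the standard Install-Module cmdlet – a sketch, assuming PowerShell 5+ with PowerShellGet available:

```powershell
# Install both modules from the PowerShell Gallery, for the current user only
# (avoids needing an elevated session)
Install-Module -Name SqlServer -Scope CurrentUser
Install-Module -Name dbatools -Scope CurrentUser
```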

Read-SqlTableData reads data from a table. The output from this can be stored in an array for later processing.
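As a rough sketch (the instance, database and table names here are placeholders), reading a table into a variable and then working with its rows looks like this:

```powershell
# Read all rows of dbo.TableList into an array-like collection of row objects
$TableNames = Read-SqlTableData -ServerInstance ".\SQL2019" `
    -DatabaseName "SourceDB" -SchemaName "dbo" -TableName "TableList"

# Each row exposes the table's columns as properties
ForEach ($Row in $TableNames) { Write-Host $Row.TableName }
```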

The second module is dbatools, which contains the cmdlet Copy-DbaDbTableData. The description for this cmdlet contains the magic phrase "streaming a copy of the data in the most speedy and least resource-intensive way" – precisely what I was after.
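A minimal single-table call looks like the sketch below (the instance names are placeholders). The optional -Truncate switch empties the destination table before copying, which suits a "refresh" pattern where the copy is re-run on a schedule:

```powershell
# Copy one table's contents between two instances, truncating the target first
Copy-DbaDbTableData -SqlInstance "SrcServer" -Destination "DestServer" `
    -Database "SourceDB" -DestinationDatabase "TargetDB" `
    -Table "dbo.TestTable" -DestinationTable "dbo.TestTable" -Truncate
```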

PowerShell Code

Below is the listing of the PowerShell code. It's quite basic, but it may be of use to others.

Listing 1: PowerShell Script

#Read-SqlTableData is in the 'SqlServer' module.
#Copy-DbaDbTableData is in the 'dbatools' module.

#Mandatory parameters for source server, database, destination server, database and log file location

Param (
    [parameter(Mandatory = $true)]
    [string] $SrcServer,
    [parameter(Mandatory = $true)]
    [string] $SrcDatabase,
    [parameter(Mandatory = $true)]
    [string] $DestServer,
    [parameter(Mandatory = $true)]
    [string] $DestDatabase,
    [parameter(Mandatory = $false)]
    [string] $LogLocation = "C:\Temp\"
)

#Where to store the log of this process
$ScriptLog = $LogLocation + $SrcDatabase + "_" + $DestDatabase + ".log"

Try
{
    #Log the execution start time
    Get-Date >> $ScriptLog
}
Catch [System.Exception]
{
    $ex = $_.Exception
    Write-Host $ex.Message
}

Try
{
    #Read the data from TableList and store it in the $TableNames array
    $TableNames = Read-SqlTableData -ServerInstance $SrcServer -DatabaseName $SrcDatabase -SchemaName "dbo" -TableName "TableList"
}
Catch [System.Exception]
{
    $ex = $_.Exception
    Write-Host $ex.Message
}

ForEach ($TableName in $TableNames)
{
    Try
    {
        #For each entry in the $TableNames array, output the table name, copy the table data
        #from source to target, and write the output of this process to $ScriptLog
        Write-Host $TableName.TableName
        Copy-DbaDbTableData -SqlInstance $SrcServer -Destination $DestServer -Database $SrcDatabase -DestinationDatabase $DestDatabase -Table $TableName.TableName -DestinationTable $TableName.TableName >> $ScriptLog
    }
    Catch [System.Exception]
    {
        $ex = $_.Exception
        Write-Host $ex.Message
    }
}
 

Example Project

Databases

First of all, the source database needs to be created. Within it are three tables. The first table holds the list of table names that are to have their contents copied across. The other two tables contain the data that will be copied.

Listing 2: Create Source Database and Tables

USE [master]
GO

CREATE DATABASE [SourceDB];
GO

USE [SourceDB]
GO

CREATE TABLE [dbo].[TableList](
	[ID] [int] IDENTITY(1,1) NOT NULL,
	[TableName] [nvarchar](100) NOT NULL
) ON [PRIMARY]
GO

CREATE TABLE [dbo].[TestTable](
	[ID] [int] IDENTITY(1,1) NOT NULL,
	[TestText] [nvarchar](50) NULL,
PRIMARY KEY CLUSTERED 
(
	[ID] ASC
) ON [PRIMARY]
) ON [PRIMARY]
GO

CREATE TABLE [dbo].[TestTable2](
	[ID] [int] IDENTITY(1,1) NOT NULL,
	[TestText2] [nvarchar](20) NULL,
	[TestValue] [decimal](8, 2) NULL
) ON [PRIMARY]
GO


Now create the target database and tables. The target database can be on another server, as the parameters for the PowerShell script cater for server names.

Listing 3: Create Target Database and Tables

USE [master]
GO

CREATE DATABASE TargetDB;
GO

USE [TargetDB]
GO

CREATE TABLE [dbo].[TestTable](
	[ID] [INT] IDENTITY(1,1) NOT NULL,
	[TestText] [NVARCHAR](50) NULL,
PRIMARY KEY CLUSTERED 
(
	[ID] ASC
) ON [PRIMARY]
) ON [PRIMARY]
GO


USE [TargetDB]
GO

CREATE TABLE [dbo].[TestTable2](
	[ID] [int] IDENTITY(1,1) NOT NULL,
	[TestText2] [nvarchar](20) NULL,
	[TestValue] [decimal](8, 2) NULL
) ON [PRIMARY]
GO

Test Data

To create test data I took the easy route and used Redgate's 'SQL Data Generator' to provide a decent quantity of test data. Within SourceDB, I created 1,000 rows of data in each of 'TestTable' and 'TestTable2'.

Within SourceDB, the table ‘TableList’ needs a row for each table that will be copied.

Listing 4: Inserting the Control Data

USE SourceDB;
GO

INSERT INTO dbo.TableList
(
    TableName
)
VALUES
(N'TestTable'),
(N'TestTable2');

Executing the PowerShell Script

I've saved the PowerShell script as 'C:\Temp\Copy-SQLTable.ps1' and execute it from within Windows PowerShell ISE. While running, it displays the names of the tables it is copying, as stored within the 'TableList' table of 'SourceDB'.
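For reference, invoking the saved script from a PowerShell prompt looks something like this (the instance names are placeholders – substitute your own source and destination instances):

```powershell
# Run the copy script, passing the mandatory source/destination parameters
C:\Temp\Copy-SQLTable.ps1 -SrcServer ".\SQL2019" -SrcDatabase "SourceDB" `
    -DestServer "." -DestDatabase "TargetDB" -LogLocation "C:\Temp\"
```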

Image 1: Executing the PowerShell Script


Checking the Results

Being lazy, I used Redgate's Data Compare to check the SourceDB and TargetDB tables – showing that the data within the tables is identical across the two databases.

Image 2: Checking the Data Has Been Copied


A log is also produced, recording the details of the actions taken.

Listing 5: Log File Showing the PowerShell Script Output

28 January 2021 19:26:49

SourceInstance      : ..\SQL2019
SourceDatabase      : SourceDB
SourceSchema        : dbo
SourceTable         : TestTable
DestinationInstance : ..
DestinationDatabase : TARGETDB
DestinationSchema   : dbo
DestinationTable    : TestTable
RowsCopied          : 1000
Elapsed             : 38.99 ms

SourceInstance      : ..\SQL2019
SourceDatabase      : SourceDB
SourceSchema        : dbo
SourceTable         : TestTable2
DestinationInstance : ..
DestinationDatabase : TARGETDB
DestinationSchema   : dbo
DestinationTable    : TestTable2
RowsCopied          : 1000
Elapsed             : 6.59 ms

Summary

This is a basic example, which I would expect to be considerably improved upon by those more experienced in PowerShell. However, it serves a purpose and demonstrates some useful features, for my own enlightenment.

References

dbatools

Redgate products (most of which have free trial periods)

One thought on “PowerShell to Copy the Contents of Selected Tables to Another Database”

  1. Steve Mack says:

    I like this solution and have implemented it with success. In order to eliminate inbound traffic to our internal network from our DMZ IIS applications, we enforce a strict outbound only firewall policy, and I use this solution to “push” data from only those few, necessary internal tables to identical tables in a DMZ-resident SQL database, where our IIS applications can retrieve data as needed. I have a SQL Agent job that runs the script twice a day, thereby “refreshing” (truncate and Copy-DbaDbTableData) our DMZ-resident database tables twice daily. Very simple yet comprehensive solution for our needs.
