SQL Server User Group Discussion on Custom Performance Monitoring

David Pless (PFE) will be speaking at the OPASS SQL Server Users Group in Orlando on Thursday, January 16th.

Details:
OPASS January 2014 Meeting w/ David Pless
In person @ 4649 W. 1st St., Sanford, Florida

Note: The Orlando SQL user group (OPASS) meets the third or fourth Thursday of every month at Logan’s Roadhouse in Sanford, FL.

Featured Presentation:
Unleashing SQL Server Performance Monitoring

I look forward to seeing everyone!

http://orlando.sqlpass.org/


SQLIntersection – April 13th – 16th – Orlando, Florida

<UPDATE – XBOX ONE> By David Pless – dpless@microsoft.com

All,

I am really excited about the SQLIntersection conference coming up April 13th through 16th right here in Orlando, Florida. Please block off your calendars right away! The conference is powered by the folks from SQLSkills, so you know it is going to be a fantastic event. I am speaking as well, and we are working to get the sessions lined up right now.

You know there will be great content on SQL Server 2014, great networking opportunities, and an all-around good time.

Please register soon, especially if you want a chance at winning an XBOX One.

<CORRECTION> “*every* attendee that signs up early, by January 17th, gets an XBOX One, Surface RT tablet, or gift card.” (Thank you Kimberly) ;)

Please see the SQLIntersection site below for details on registration and more!

http://www.sqlintersection.com/.

Also, please drop me a line with any questions.

David Pless

Table Data Differences

By: Lisa Jackson-Neblett

This article describes how to identify rows between two tables that are not the same. SQL Server ships an executable for this as soon as SQL Server replication is installed. It is called tablediff.exe, and it can be found in the following directory: C:\Program Files\Microsoft SQL Server\110\COM (the folder number varies with the SQL Server version; in this example the SQL Server 2012 binaries are installed on the C:\ drive, so yours may be stored in a different location). Additionally, both tables must contain a primary key, identity, rowguid, or unique key column, and the tables must be structurally identical.

Below is a high-level example of how you can use tablediff.exe (as always, this should be tested in a development environment first). More information can be found in SQL Server Books Online.

The following is used for this example.

Server Name: DEVSERVER
Instance Name: INST1
Database Name: DEVDB

Log into a development database and try the following:

USE DEVDB
GO
CREATE TABLE [dbo].[table1](
[id] [int] NOT NULL primary key,
[first_name] [varchar](10) NULL,
[last_name] [varchar](10) NULL
) ON [PRIMARY]
GO

INSERT INTO table1 VALUES(1,'Jane','Doe')
INSERT INTO table1 VALUES(2,'Mary','Doe')
INSERT INTO table1 VALUES(3,'Julie','Doe')
INSERT INTO table1 VALUES(4,'Sarah','Doe')
INSERT INTO table1 VALUES(5,'Sally','Doe')
INSERT INTO table1 VALUES(6,'Alexis','Doe')

CREATE TABLE [dbo].[table2](
[id] [int] NOT NULL primary key,
[first_name] [varchar] (10) NULL,
[last_name] [varchar] (10) NULL
) ON [PRIMARY]
GO

INSERT INTO table2 VALUES(1,'Jane','Doe')
INSERT INTO table2 VALUES(2,'Mary','Doe')
INSERT INTO table2 VALUES(3,'Julie','Doe')
INSERT INTO table2 VALUES(7,'John','Jones')
INSERT INTO table2 VALUES(8,'Mark','Jones')
INSERT INTO table2 VALUES(6,'Steve','Jones')

Now open a Command Prompt window and type the following.

“C:\Program Files\Microsoft SQL Server\110\COM\tablediff.exe” -sourceserver [DEVSERVER\INST1] -sourcedatabase [DEVDB] -sourcetable [table1] -sourceschema [dbo] -sourcelocked [TABLOCK] -destinationserver [DEVSERVER\INST1] -destinationdatabase [DEVDB] -destinationtable [table2] -destinationschema [dbo] -destinationlocked [TABLOCK] -f c:\temp\TableDifference.sql

Output from the command above is shown below.

C:\>"C:\Program Files\Microsoft SQL Server\110\COM\tablediff.exe" -sourceserver [DEVSERVER\INST1] -sourcedatabase [DEVDB] -sourcetable [table1] -sourceschema [dbo] -sourcelocked [TABLOCK] -destinationserver [DEVSERVER\INST1] -destinationdatabase [DEVDB] -destinationtable [table2] -destinationschema [dbo] -destinationlocked [TABLOCK] -f c:\temp\TableDifference.sql

Microsoft (R) SQL Server Replication Diff Tool
Copyright (c) 2008 Microsoft Corporation

User-specified agent parameter values:
-sourceserver DEVSERVER\INST1
-sourcedatabase DEVDB
-sourcetable table1
-sourceschema dbo
-sourcelocked TABLOCK
-destinationserver DEVSERVER\INST1
-destinationdatabase DEVDB
-destinationtable table2
-destinationschema dbo
-destinationlocked TABLOCK
-f c:\temp\TableDifference.sql

Table [DEVDB].[dbo].[table1] on DEVSERVER\INST1 and Table [DEVDB].[dbo].[table2] on DEVSERVER\INST1 have 5 differences.

Fix SQL written to c:\temp\TableDifference.sql.
Err     id      Col
Src. Only       4
Src. Only       5
Mismatch        6       first_name last_name
Dest. Only      7
Dest. Only      8
The requested operation took 0.5300053 seconds.

C:\>

Using PowerShell to show how many VLFs exist for a Server

By Pankaj Satyaketu (Pankaj.Satyaketu@microsoft.com)

If by now you have not heard about the potential issues that can arise from having too many VLFs, take a look at Kimberly Tripp’s post, Transaction Log VLFs – too many or too few?. Essentially, having too many VLFs can affect the performance of any activity that reads from the transaction log, such as replication, AlwaysOn Availability Groups, mirroring, and database recovery.

If you ever need to determine how many VLFs are within a database, you can run DBCC LOGINFO from within the context of that database; the total number of rows returned gives you your VLF count. What if you wanted to return this info at the server level, or better yet for a bunch of servers? Here comes PowerShell to the rescue!
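For a single database, that check is a one-liner (AdventureWorks here is just a placeholder database name):

USE AdventureWorks
GO
DBCC LOGINFO -- returns one row per VLF, so the row count is the VLF count
GO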

To use the script below, you do need to be running it from a machine that has at least the SQL Server 2008 client tools installed. The current version only works with Windows Authentication; however, I will update it to support SQL Authentication in the future.

#requires -version 2
<#
.Synopsis
   Returns the VLF count for databases on the servers passed into it.
.Description 
   Connects to each server and runs DBCC LOGINFO, returning the row count.
.EXAMPLE
   Get-VLFCount -SQLInstance ComputerName
.EXAMPLE
   Get-Content D:\ServerList.txt | Get-VLFCount

   If you are using Windows Authentication and have a list of servers, you can use this.
.EXAMPLE
   Get-VLFCount -SQLInstance ComputerName1, ComputerName2

#>
function Get-VLFCount
{
    [CmdletBinding()]
    Param
    (
        # SQLInstance is the name(s) of the SQL Instances to scan        
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$True,
                   ValueFromPipelineByPropertyName=$true,
                   Position=0)]
        $SQLInstance
    )  
    Begin
    {
        #Load the assembly needed to connect to the SQL instance. The Begin block is the appropriate place for this since it only needs to be loaded once.
        #PowerShell v1 way of loading Assembly        
        [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
        [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.ConnectionInfo') | out-null
        #PowerShell v2 way of loading Assembly but you have to specify Path to .dll
        #Add-type -Path 'C:\Program Files\Microsoft SQL Server\110\SDK\Assemblies\Microsoft.SqlServer.Smo.dll'    
    }
    Process
    {
        ForEach ($Instance in $SQLInstance)
        {       
            $SrvConn = new-object Microsoft.SqlServer.Management.Common.ServerConnection
            $SrvConn.ServerInstance = $Instance
            #Use Integrated Authentication
            $SrvConn.LoginSecure = $true
            $SrvConn.ConnectTimeout = 5
            Write-Debug "Attempting to check $Instance for VLF Counts"
            $srv = New-Object Microsoft.SqlServer.Management.SMO.Server($SrvConn)
            $dbs = $srv.Databases
            try
            {
                ForEach ($db in $dbs) 
                {
                    Write-Debug "Getting VLFInfo for $db"
                    if ($db.IsAccessible)            
                    {

                        $VLFs = $db.ExecuteWithResults("DBCC LOGINFO")
                        $NumVLFs = $VLFs.Tables[0].Rows.Count
                        $VLFInfo = $db | Select-Object @{Name='InstanceName'; Expression={$Instance}}, @{Name='DBName'; Expression={$_.Name}} `
                        , @{Name='VLFCount'; Expression={$NumVLFs}}

                    }
                    else
                    {
                        $VLFInfo = New-Object PSObject
                        $VLFInfo | Add-Member -MemberType NoteProperty -Name InstanceName -Value $Instance
                        $VLFInfo | Add-Member -MemberType NoteProperty -Name DBName -Value "$db is Inaccessible"
                        $VLFInfo | Add-Member -MemberType NoteProperty -Name VLFCount -Value 0
                    }
                    Write-Output $VLFInfo
                }

            }
            catch
            {
                $ex = $_.Exception
                Write-Debug $ex.Message
                Write-Error "Could not pull SQL DB Info on $Instance"
            }

        }
    }
} #Get-VLFCount 

Now that we have our function created, we can easily invoke it for a single server like so:

Get-VLFCount -SQLInstance .

If you have a list of servers in a text file, you can also use the following to import those servers and, using Windows Authentication, scan them for their VLF counts:

Get-Content D:\ServerList.txt | Get-VLFCount

To add a little Excel magic, we can instead store the output from Get-VLFCount in a variable and then use the code below to push our results into an Excel worksheet:

$VLFs = Get-VLFCount -SQLInstance .

#Now that we have the VLF counts, let's use some Excel magic
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $true
$workbook = $excel.Workbooks.Add()
$workbook.WorkSheets.Item(1).Name = "VLFCount"
$sheet = $workbook.WorkSheets.Item("VLFCount")
$x = 2
$sheet.Cells.Item(1,1) = "Instance Name"
$sheet.Cells.Item(1,2) = "Database Name"
$sheet.Cells.Item(1,3) = "VLFCount"
ForEach ($Database in $VLFs)
{
    $sheet.Cells.Item($x,1) = $Database.InstanceName
    $sheet.Cells.Item($x,2) = $Database.DBName
    $sheet.Cells.Item($x,3) = $Database.VLFCount
    $x++
}
$range = $sheet.UsedRange
$range.EntireColumn.AutoFit()

Putting it all together, our completed script would look like below.

#requires -version 2
<#
.Synopsis
   Returns the VLF count for databases on the servers passed into it.
.Description 
   Connects to each server and runs DBCC LOGINFO, returning the row count.
.EXAMPLE
   Get-VLFCount -SQLInstance ComputerName
.EXAMPLE
   Get-Content D:\ServerList.txt | Get-VLFCount

   If you are using Windows Authentication and have a list of servers, you can use this.
.EXAMPLE
   Get-VLFCount -SQLInstance ComputerName1, ComputerName2

#>
function Get-VLFCount
{
    [CmdletBinding()]
    Param
    (
        # SQLInstance is the name(s) of the SQL Instances to scan        
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$True,
                   ValueFromPipelineByPropertyName=$true,
                   Position=0)]
        $SQLInstance
    )  
    Begin
    {
        #Load the assembly needed to connect to the SQL instance. The Begin block is the appropriate place for this since it only needs to be loaded once.
        #PowerShell v1 way of loading Assembly        
        [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
        [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.ConnectionInfo') | out-null
        #PowerShell v2 way of loading Assembly but you have to specify Path to .dll
        #Add-type -Path 'C:\Program Files\Microsoft SQL Server\110\SDK\Assemblies\Microsoft.SqlServer.Smo.dll'    
    }
    Process
    {
        ForEach ($Instance in $SQLInstance)
        {       
            $SrvConn = new-object Microsoft.SqlServer.Management.Common.ServerConnection
            $SrvConn.ServerInstance = $Instance
            #Use Integrated Authentication
            $SrvConn.LoginSecure = $true
            $SrvConn.ConnectTimeout = 5
            Write-Debug "Attempting to check $Instance for VLF Counts"
            $srv = New-Object Microsoft.SqlServer.Management.SMO.Server($SrvConn)
            $dbs = $srv.Databases
            try
            {
                ForEach ($db in $dbs) 
                {
                    Write-Debug "Getting VLFInfo for $db"
                    if ($db.IsAccessible)            
                    {

                        $VLFs = $db.ExecuteWithResults("DBCC LOGINFO")
                        $NumVLFs = $VLFs.Tables[0].Rows.Count
                        $VLFInfo = $db | Select-Object @{Name='InstanceName'; Expression={$Instance}}, @{Name='DBName'; Expression={$_.Name}} `
                        , @{Name='VLFCount'; Expression={$NumVLFs}}

                    }
                    else
                    {
                        $VLFInfo = New-Object PSObject
                        $VLFInfo | Add-Member -MemberType NoteProperty -Name InstanceName -Value $Instance
                        $VLFInfo | Add-Member -MemberType NoteProperty -Name DBName -Value "$db is Inaccessible"
                        $VLFInfo | Add-Member -MemberType NoteProperty -Name VLFCount -Value 0
                    }
                    Write-Output $VLFInfo
                }
            }
            catch
            {
                $ex = $_.Exception
                Write-Debug $ex.Message
                Write-Error "Could not pull SQL DB Info on $Instance"
            }

        }
    }
} #Get-VLFCount

#If we had our instances in a file, we could use:
#$VLFs = Get-Content D:\ServerList.txt | Get-VLFCount
#Change the line below to your SQL Server instance, or pipe a list of servers to the function
$VLFs = Get-VLFCount -SQLInstance .

#Now that we have the VLF counts, let's use some Excel magic
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $true
$workbook = $excel.Workbooks.Add()
$workbook.WorkSheets.Item(1).Name = "VLFCount"
$sheet = $workbook.WorkSheets.Item("VLFCount")
$x = 2
$sheet.Cells.Item(1,1) = "Instance Name"
$sheet.Cells.Item(1,2) = "Database Name"
$sheet.Cells.Item(1,3) = "VLFCount"
ForEach ($Database in $VLFs)
{
    $sheet.Cells.Item($x,1) = $Database.InstanceName
    $sheet.Cells.Item($x,2) = $Database.DBName
    $sheet.Cells.Item($x,3) = $Database.VLFCount
    $x++
}
$range = $sheet.UsedRange
$range.EntireColumn.AutoFit()

For simplicity I am storing the results in Excel, but you can obviously dump this data into a table in SQL Server using the Out-DataTable and Write-DataTable functions. Without going into too much detail, below is how you would leverage them:

$VLFs = Get-VLFCount -SQLInstance . | Out-DataTable
Write-DataTable -ServerInstance 'SomeServer' -Database 'SQLDBA' -TableName 'VLFCount' -Data $VLFs -QueryTimeout 5
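If you pre-create the destination table, something like the following lines up with the function's output; the SQLDBA database name, table name, and column types here are assumptions chosen to match the example above:

USE SQLDBA
GO
CREATE TABLE dbo.VLFCount
(
    InstanceName NVARCHAR(128), -- instance the count came from
    DBName       NVARCHAR(256), -- database name, or the "... is Inaccessible" message
    VLFCount     INT            -- number of rows returned by DBCC LOGINFO
)
GO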

I hope you’ve enjoyed this post and feel free to drop a comment on what you’d like to see.

SQLSaturday – Cocoa Beach, FL – July 27th 2013

All,

I will be speaking at SQLSaturday in Cocoa Beach, Florida on July 27th. I will be covering ‘Leveraging PAL (Performance Analysis of Logs) for SQL Server’. We will also cover Performance Monitor changes between SQL Server 2008 and SQL Server 2012.

Click on this link to view the SQL Saturday speaker schedule:
http://www.sqlsaturday.com/231/schedule.aspx

Use this link to register for SQL Saturday:
http://www.sqlsaturday.com/231/eventhome.aspx

Note: There are only 2 more days left for our discounted rate at the Cocoa Beach Hilton.

The Cocoa Beach Hilton is giving our group a special rate until midnight on 6/28/13. Please book your room before June 28th, 2013; rates will go up after this date. http://www.hilton.com/en/hi/groups/personalized/C/CCBCHHF-SQLP-20130726/index.jhtml?WT.mc_id=POG

David Pless
dpless@microsoft.com


Now Available – Microsoft® SQL Server® 2014 CTP1

More information is coming out on SQL Server 2014, covering AlwaysOn improvements, columnstore indexing, Hekaton (in-memory OLTP), and much more. We will continue updating as more information is released.

Microsoft® SQL Server® 2014 CTP1 Product Guide

Download the datasheets, white papers and technical presentations that will help you evaluate Microsoft SQL Server 2014 CTP1.
http://www.microsoft.com/en-us/download/details.aspx?id=39269

SQL Server Blog: SQL Server 2014 – A Closer Look
http://blogs.technet.com/b/dataplatforminsider/archive/2013/06/05/sql-server-2014-a-closer-look.aspx

Download Microsoft SQL Server 2014 Community Technology Preview 1 (CTP1)

“Microsoft SQL Server 2014 brings to market new in-memory capabilities built into the core database, including in-memory OLTP, which complements our existing in-memory data warehousing and BI capabilities for the most comprehensive in-memory database solution in the market.  SQL Server 2014 also provides new cloud capabilities to simplify cloud adoption for your SQL databases and help you unlock new hybrid scenarios.”

http://technet.microsoft.com/en-us/evalcenter/dn205290.aspx



SQL Server 2012 System Views Map

The Microsoft SQL Server 2012 System Views Map shows the key system views included in SQL Server 2012, and the relationships between them.

http://www.microsoft.com/en-us/download/details.aspx?id=39083

Database Mirroring Monitoring and Alerting

By Pankaj Satyaketu (Pankaj.Satyaketu@microsoft.com)

So you’ve set up database mirroring and all is working well. What happens when something goes wrong? Wouldn’t you like to know if your mirroring session failed over, or if synchronization is running slower than you expected? In this article I will go over the basics of monitoring and alerting on your mirroring session using the built-in SQL alerts. This article assumes that you have already set up database mirroring and are familiar with the Database Mirroring Monitor.

For a guide on setting up database mirroring see http://msdn.microsoft.com/en-us/library/ms188712.aspx.

Review of Database Mirroring Monitor

[Pic1: Database Mirroring Monitor with the key counters annotated in red]

To simplify things, I’ve added in red some additional comments explaining what the items in the Database Mirroring Monitor mean. Later on, we will be setting up alerts for the Send Queue, Redo Queue, Oldest Unsent Transaction, and Mirror Commit Overhead.

Steps

Prerequisites

You’ve already configured database mirroring.

  1. Configure the database mirroring monitor job
  2. Collect mirroring data
  3. Analyze mirroring data to determine what thresholds work in your environment
  4. Add thresholds via Database Mirroring Monitor
  5. Create alerts in SQL Agent to notify if thresholds are exceeded and on mirroring state changes

Configure the database mirroring monitor job

By default, when you set up mirroring via SQL Server Management Studio (SSMS), a job called "Database Mirroring Monitor Job" is created. If you used T-SQL scripts to create your mirroring session, you will need to run the sp_dbmmonitoraddmonitoring stored procedure to create the "Database Mirroring Monitor Job". The monitor job runs every minute by default and logs its data to an internal msdb table; this can be changed to any value between 1 and 120 minutes via the sp_dbmmonitorchangemonitoring stored procedure, and you should pick a value based on the granularity of data you want visibility into. One item to be aware of: the Database Mirroring Monitor itself refreshes every 30 seconds (non-configurable) and also logs its data to the internal msdb table, so even after lengthening the job's update cycle you may still see more granular data than you expect if someone has the Database Mirroring Monitor window open. The data from the monitor job is viewable via the sp_dbmmonitorresults stored procedure, but more on that later. The results are stored there for 7 days by default, which can also be changed via the sp_dbmmonitorchangealert stored procedure.
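A minimal T-SQL sketch of that setup (the 5-minute update period is only an example):

-- Create the monitor job if mirroring was scripted with T-SQL (default update period: 1 minute)
EXEC msdb.dbo.sp_dbmmonitoraddmonitoring
GO
-- Change the update period to 5 minutes (parameter_id 1 = update period; valid values are 1-120)
EXEC msdb.dbo.sp_dbmmonitorchangemonitoring @parameter_id = 1, @value = 5
GO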

Collect mirroring data

Once you’ve set up your database mirroring monitor job and configured an appropriate update and retention cycle, you can let it run to collect data. If possible, leave the defaults of 1 minute for the updates and 7 days for the retention; this will give you a relatively good baseline of performance across various business hours and days of the week. After your initial collection of data, you can adjust to fit your environment.

Analyze mirroring data to determine what thresholds work in your environment

Now that you have a good sampling of data and have allowed mirroring to run through the various usage scenarios in your environment, we can determine what the thresholds for our alerts should be. To see the data that has been collected over the past week, use the sp_dbmmonitorresults stored procedure. To better analyze the data, it is easier to dump the results into a table. Note that this needs to be run on both the principal and the mirror, as some counters are only pertinent on the mirror and others only on the principal.

See below for sample query:

 /********************************************************************************
*  Run On: Principal AND Mirror                                                  *
*  Description:                                                                  *
*  Create the analysis table and load data via sp_dbmmonitorresults.             *
*  NOTE: FIND/REPLACE AdventureWorks WITH YOUR MIRRORED DATABASE NAME AND        *
*  SQLDBA WITH A STANDARD USER DATABASE NAME                                     *
*********************************************************************************/
USE sqldba

GO

IF EXISTS (SELECT *
FROM   sys.objects
WHERE  object_id = Object_id(N'[dbo].[MirroringAnalysis]')
AND TYPE IN ( N'U' ))
DROP TABLE [dbo].[MirroringAnalysis]

GO

CREATE TABLE sqldba..mirroringanalysis
(
database_name        SYSNAME,-- Name of database
ROLE                 TINYINT,-- 1 = Principal, 2 = Mirror
mirroring_state      TINYINT,
-- 0 = Suspended, 1 = Disconnected, 2 = Synchronizing, 3 = Pending Failover, 4 = Synchronized
witness_status       TINYINT,-- 1 = Connected, 2 = Disconnected
log_generation_rate  INT NULL,
-- Amount of log generated since preceding update of the mirroring status of this database in kb/sec
-- Same as Perfmon Counter Log Bytes Flushed/Sec and Current rate of new transactions in Mirroring Monitor
unsent_log           INT,
-- Size of the unsent log in the send queue on the principal in KB (Send Queue)
-- Same as Log Send Queue in Perfmon and Unsent Log in Mirroring Monitor
send_rate            INT NULL,
-- Send rate of log from the principal to the mirror in kb/sec 
-- Same as Log Bytes Sent/Sec in perfmon and Current send rate in Mirroring Monitor
unrestored_log       INT,
-- Size of the redo queue on the mirror in KB (Redo Queue)
-- Same as Redo Queue KB in perfmon and Unrestored log in Mirroring Monitor.
recovery_rate        INT NULL,-- Redo rate on the mirror in kb/sec
-- Same as Redo Bytes/Sec in Perfmon and Current Restore rate in Mirroring Monitor
transaction_delay    INT NULL,-- Total delay for all transactions in ms
-- Same as perfmon counter Transaction Delay.
transactions_per_sec INT NULL,
-- Number of transactions that are occurring per second on the principal server instance in trans / sec
-- Same as perfmon counter Transaction/Sec
average_delay        INT,
-- Average delay on the principal server instance for each transaction because of database mirroring.
-- In high-performance mode this value is generally 0. In ms; same as Mirror Commit Overhead in Mirroring Monitor
time_recorded        DATETIME,
-- Time at which the row was recorded by the database mirroring monitor. This is the system clock time of the principal in GMT 
time_behind          DATETIME,
-- Approximate system-clock time of the principal to which the mirror database is currently caught up.
-- This value is meaningful only on the principal server instance in GMT
local_time           DATETIME
-- System clock time on the local server instance when this row was updated
)

GO

INSERT INTO sqldba..mirroringanalysis
EXEC msdb..Sp_dbmmonitorresults
'AdventureWorks',
8,
0

GO
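Per SQL Server Books Online, the second parameter of sp_dbmmonitorresults (@rows_to_return = 8) asks for the largest available window, the last 1,000,000 rows, and the third parameter (@update_table = 0) tells the procedure not to refresh the status table before returning results.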

/********************************************************************************
*  Run On: Principal                              *
*  Description:                                *
*  Below section gets the range for the Send Queue i.e unsent_log.        *
*********************************************************************************/
DECLARE @Maxunsent_log AS INT,
@Minunsent_log AS INT,
@NumDataPoints INT,
@Database_Name AS VARCHAR(50) = 'AdventureWorks'

SELECT @Maxunsent_log = MAX(unsent_log)
FROM   sqldba..mirroringanalysis

SELECT @Minunsent_log = MIN(unsent_log)
FROM   sqldba..mirroringanalysis

SELECT @NumDataPoints = COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unsent_log IS NOT NULL

SELECT DISTINCT @NumDataPoints                              AS '#DataPoints',
@Minunsent_log                              'Min unsent_log',
@Maxunsent_log                              'Max unsent_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unsent_log < @Maxunsent_log * 0.1
AND database_name = @Database_Name) AS
'#0 - 9% of Max unsent_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unsent_log >= @Maxunsent_log * 0.1
AND unsent_log < @Maxunsent_log * 0.2
AND database_name = @Database_Name) AS
'#10 - 19% of Max unsent_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unsent_log >= @Maxunsent_log * 0.2
AND unsent_log < @Maxunsent_log * 0.3
AND database_name = @Database_Name) AS
'#20 - 29% of Max unsent_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unsent_log >= @Maxunsent_log * 0.3
AND unsent_log < @Maxunsent_log * 0.4
AND database_name = @Database_Name) AS
'#30 - 39% of Max unsent_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unsent_log >= @Maxunsent_log * 0.4
AND unsent_log < @Maxunsent_log * 0.5
AND database_name = @Database_Name) AS
'#40 - 49% of Max unsent_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unsent_log >= @Maxunsent_log * 0.5
AND unsent_log < @Maxunsent_log * 0.6
AND database_name = @Database_Name) AS
'#50 - 59% of Max unsent_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unsent_log >= @Maxunsent_log * 0.6
AND unsent_log < @Maxunsent_log * 0.7
AND database_name = @Database_Name) AS
'#60 - 69% of Max unsent_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unsent_log >= @Maxunsent_log * 0.7
AND unsent_log < @Maxunsent_log * 0.8
AND database_name = @Database_Name) AS
'#70 - 79% of Max unsent_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unsent_log >= @Maxunsent_log * 0.8
AND unsent_log < @Maxunsent_log * 0.9
AND database_name = @Database_Name) AS
'#80 - 89% of Max unsent_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unsent_log >= @Maxunsent_log * 0.9
AND database_name = @Database_Name) AS
'#90 - 100% of Max unsent_log'
FROM   sqldba..mirroringanalysis

GO

/********************************************************************************
*  This is the end of getting Send Queue Analysis                *
*********************************************************************************/
/********************************************************************************
*  Run On: Principal                              *
*  Description:                                *
*  Below section gets the range for the Mirroring Overhead i.e average_delay.  *
*  Only important in synch mirroring                      *
*********************************************************************************/
DECLARE @Maxaverage_delay AS INT,
@Minaverage_delay AS INT,
@NumDataPoints    AS INT,
@Database_Name    AS VARCHAR(50) = 'AdventureWorks'

SELECT @Maxaverage_delay = MAX(average_delay)
FROM   sqldba..mirroringanalysis

SELECT @Minaverage_delay = MIN(average_delay)
FROM   sqldba..mirroringanalysis

SELECT @NumDataPoints = COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  average_delay IS NOT NULL

SELECT DISTINCT @NumDataPoints                              AS '#DataPoints',
@Minaverage_delay
'Min average_delay(ms)',
@Maxaverage_delay
'Max average_delay (ms)',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  average_delay < @Maxaverage_delay * 0.1
AND database_name = @Database_Name) AS
'#0 - 9% of Max average_delay',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  average_delay >= @Maxaverage_delay * 0.1
AND average_delay < @Maxaverage_delay * 0.2
AND database_name = @Database_Name) AS
'#10 - 19% of Max average_delay',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  average_delay >= @Maxaverage_delay * 0.2
AND average_delay < @Maxaverage_delay * 0.3
AND database_name = @Database_Name) AS
'#20 - 29% of Max average_delay',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  average_delay >= @Maxaverage_delay * 0.3
AND average_delay < @Maxaverage_delay * 0.4
AND database_name = @Database_Name) AS
'#30 - 39% of Max average_delay',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  average_delay >= @Maxaverage_delay * 0.4
AND average_delay < @Maxaverage_delay * 0.5
AND database_name = @Database_Name) AS
'#40 - 49% of Max average_delay',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  average_delay >= @Maxaverage_delay * 0.5
AND average_delay < @Maxaverage_delay * 0.6
AND database_name = @Database_Name) AS
'#50 - 59% of Max average_delay',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  average_delay >= @Maxaverage_delay * 0.6
AND average_delay < @Maxaverage_delay * 0.7
AND database_name = @Database_Name) AS
'#60 - 69% of Max average_delay',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  average_delay >= @Maxaverage_delay * 0.7
AND average_delay < @Maxaverage_delay * 0.8
AND database_name = @Database_Name) AS
'#70 - 79% of Max average_delay',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  average_delay >= @Maxaverage_delay * 0.8
AND average_delay < @Maxaverage_delay * 0.9
AND database_name = @Database_Name) AS
'#80 - 89% of Max average_delay',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  average_delay >= @Maxaverage_delay * 0.9
AND database_name = @Database_Name) AS
'#90 - 100% of Max average_delay'
FROM   sqldba..mirroringanalysis

GO

/********************************************************************************
*  This is the end of Mirroring Overhead i.e average_delay            *
*********************************************************************************/
/********************************************************************************
*  Run On: Principal                              *
*  Description:                                *
*  Below section gets the range for Oldest Unsent Transaction.          *
*********************************************************************************/
DECLARE @MaxOldestUnsentTran AS INT,
@MinOldestUnsentTran AS INT,
@NumDatapoints       INT,
@Database_Name       AS VARCHAR(50) = 'AdventureWorks'

SELECT time_recorded,
time_behind,
Datediff(SECOND, time_behind, time_recorded) AS 'OldestUnsentTran'
INTO   #oldestunsenttran
FROM   sqldba..mirroringanalysis
WHERE  database_name = @Database_name

SELECT @MaxOldestUnsentTran = MAX(oldestunsenttran)
FROM   #oldestunsenttran

SELECT @MinOldestUnsentTran = MIN(oldestunsenttran)
FROM   #oldestunsenttran

SELECT @NumDataPoints = COUNT(*)
FROM   #oldestunsenttran
WHERE  oldestunsenttran IS NOT NULL

SELECT DISTINCT @NumDataPoints                                             AS
'#DataPoints',
@MinOldestUnsentTran
'Min OldestUnsentTran(sec)',
@MaxOldestUnsentTran
'Max OldestUnsentTran (sec)',
(SELECT COUNT(*)
FROM   #oldestunsenttran
WHERE  oldestunsenttran < @MaxOldestUnsentTran * 0.1)     AS
'#0 - 9% of Max OldestUnsentTran',
(SELECT COUNT(*)
FROM   #oldestunsenttran
WHERE  oldestunsenttran >= @MaxOldestUnsentTran * 0.1
AND oldestunsenttran < @MaxOldestUnsentTran * 0.2) AS
'#10 - 19% of Max OldestUnsentTran',
(SELECT COUNT(*)
FROM   #oldestunsenttran
WHERE  oldestunsenttran >= @MaxOldestUnsentTran * 0.2
AND oldestunsenttran < @MaxOldestUnsentTran * 0.3) AS
'#20 - 29% of Max OldestUnsentTran',
(SELECT COUNT(*)
FROM   #oldestunsenttran
WHERE  oldestunsenttran >= @MaxOldestUnsentTran * 0.3
AND oldestunsenttran < @MaxOldestUnsentTran * 0.4) AS
'#30 - 39% of Max OldestUnsentTran',
(SELECT COUNT(*)
FROM   #oldestunsenttran
WHERE  oldestunsenttran >= @MaxOldestUnsentTran * 0.4
AND oldestunsenttran < @MaxOldestUnsentTran * 0.5) AS
'#40 - 49% of Max OldestUnsentTran',
(SELECT COUNT(*)
FROM   #oldestunsenttran
WHERE  oldestunsenttran >= @MaxOldestUnsentTran * 0.5
AND oldestunsenttran < @MaxOldestUnsentTran * 0.6) AS
'#50 - 59% of Max OldestUnsentTran',
(SELECT COUNT(*)
FROM   #oldestunsenttran
WHERE  oldestunsenttran >= @MaxOldestUnsentTran * 0.6
AND oldestunsenttran < @MaxOldestUnsentTran * 0.7) AS
'#60 - 69% of Max OldestUnsentTran',
(SELECT COUNT(*)
FROM   #oldestunsenttran
WHERE  oldestunsenttran >= @MaxOldestUnsentTran * 0.7
AND oldestunsenttran < @MaxOldestUnsentTran * 0.8) AS
'#70 - 79% of Max OldestUnsentTran',
(SELECT COUNT(*)
FROM   #oldestunsenttran
WHERE  oldestunsenttran >= @MaxOldestUnsentTran * 0.8
AND oldestunsenttran < @MaxOldestUnsentTran * 0.9) AS
'#80 - 89% of Max OldestUnsentTran',
(SELECT COUNT(*)
FROM   #oldestunsenttran
WHERE  oldestunsenttran >= @MaxOldestUnsentTran * 0.9)    AS
'#90 - 100% of Max OldestUnsentTran'
FROM   #oldestunsenttran

DROP TABLE #oldestunsenttran

GO

/********************************************************************************
*  This is the end of Oldest Unsent Transaction                *
*********************************************************************************/
/********************************************************************************
*  Run On: Mirror                                *
*  Description:                                *
*  Below section gets the range for the Redo Log i.e unrestored_log.      *
*********************************************************************************/
DECLARE @Maxunrestored_log AS INT,
@Minunrestored_log AS INT,
@NumDataPoints     AS INT,
@Database_Name     AS VARCHAR(50) = 'AdventureWorks'

SELECT @Maxunrestored_log = MAX(unrestored_log)
FROM   sqldba..mirroringanalysis

SELECT @Minunrestored_log = MIN(unrestored_log)
FROM   sqldba..mirroringanalysis

SELECT @NumDataPoints = COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unrestored_log IS NOT NULL

SELECT DISTINCT @NumDataPoints                              AS '#DataPoints',
@Minunrestored_log
'Min unrestored_log(kb)',
@Maxunrestored_log
'Max unrestored_log (kb)',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unrestored_log < @Maxunrestored_log * 0.1
AND database_name = @Database_Name) AS
'#0 - 9% of Max unrestored_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unrestored_log >= @Maxunrestored_log * 0.1
AND unrestored_log < @Maxunrestored_log * 0.2
AND database_name = @Database_Name) AS
'#10 - 19% of Max unrestored_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unrestored_log >= @Maxunrestored_log * 0.2
AND unrestored_log < @Maxunrestored_log * 0.3
AND database_name = @Database_Name) AS
'#20 - 29% of Max unrestored_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unrestored_log >= @Maxunrestored_log * 0.3
AND unrestored_log < @Maxunrestored_log * 0.4
AND database_name = @Database_Name) AS
'#30 - 39% of Max unrestored_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unrestored_log >= @Maxunrestored_log * 0.4
AND unrestored_log < @Maxunrestored_log * 0.5
AND database_name = @Database_Name) AS
'#40 - 49% of Max unrestored_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unrestored_log >= @Maxunrestored_log * 0.5
AND unrestored_log < @Maxunrestored_log * 0.6
AND database_name = @Database_Name) AS
'#50 - 59% of Max unrestored_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unrestored_log >= @Maxunrestored_log * 0.6
AND unrestored_log < @Maxunrestored_log * 0.7
AND database_name = @Database_Name) AS
'#60 - 69% of Max unrestored_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unrestored_log >= @Maxunrestored_log * 0.7
AND unrestored_log < @Maxunrestored_log * 0.8
AND database_name = @Database_Name) AS
'#70 - 79% of Max unrestored_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unrestored_log >= @Maxunrestored_log * 0.8
AND unrestored_log < @Maxunrestored_log * 0.9
AND database_name = @Database_Name) AS
'#80 - 89% of Max unrestored_log',
(SELECT COUNT(*)
FROM   sqldba..mirroringanalysis
WHERE  unrestored_log >= @Maxunrestored_log * 0.9
AND database_name = @Database_Name) AS
'#90 - 100% of Max unrestored_log'
FROM   sqldba..mirroringanalysis
/********************************************************************************
*  This is the end of Redo Log i.e unrestored_log.                *
*********************************************************************************/

Add thresholds via Database Mirroring Monitor

Using the data that we’ve collected, we can now determine what thresholds to set for the various alerts. Follow the steps below to set the thresholds:

  • Launch the Database Mirroring Monitor and go to the Warnings tab.
  • Click Set Thresholds.
  • Ensure you are connected to both of the partners and set alerts for each warning you are interested in monitoring. Make sure you also check the “Enabled” checkbox.

Below is a screenshot as an example, where I’ve set alerts of 512 MB for the send and redo queues and 15 minutes for my oldest unsent transaction. Back in step 1 we had initially set the mirroring monitor job to check every minute; if you hit a threshold and do not want your logs to be populated with this entry every minute, you can increase the interval via sp_dbmmonitorchangemonitoring.

[Pic2: Database Mirroring Monitor warning thresholds]
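The same thresholds can also be set with T-SQL via sp_dbmmonitorchangealert; here is a sketch matching the values above, assuming the Books Online alert IDs (1 = oldest unsent transaction, 2 = unsent log, 3 = unrestored log, 4 = mirror commit overhead):

-- Run against msdb on each partner; 524288 KB = 512 MB
EXEC msdb.dbo.sp_dbmmonitorchangealert 'AdventureWorks', 1, 15, 1     -- oldest unsent transaction: 15 minutes
EXEC msdb.dbo.sp_dbmmonitorchangealert 'AdventureWorks', 2, 524288, 1 -- unsent log (send queue): 512 MB
EXEC msdb.dbo.sp_dbmmonitorchangealert 'AdventureWorks', 3, 524288, 1 -- unrestored log (redo queue): 512 MB
GO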

Create alerts in SQL Agent to notify if thresholds are exceeded and on mirroring state changes


Now that we have alerts being raised when the thresholds are exceeded, we can set up an alert to be sent via email to a user or, preferably, a distribution group. Rather than re-inventing the wheel, the following article does a really good job of walking through setting up alerts via SSMS and, at the bottom, provides a very useful script to set up the alerts. Since we’ve already set up the thresholds, you can skip that step. The script obviously needs to be modified, and you will need to select which alerts you want to set up. For example, if you are using asynchronous mirroring, it does not make sense to monitor the performance alert “Mirror Commit Overhead”. Similarly, if you don’t care to receive alerts when mirroring is synchronizing, you can skip that alert.

I would highly recommend reading the following article in its entirety. Once you know what alerts fit your environment, you can copy the script towards the end of the article and modify it accordingly.
http://msdn.microsoft.com/en-us/library/cc966392.aspx
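For flavor, here is a minimal sketch of one such SQL Agent alert, using error 1480 (database mirroring role change); the ‘DBA Team’ operator name is hypothetical:

-- Alert when a mirrored database changes roles (error 1480 is written to the event log on failover)
EXEC msdb.dbo.sp_add_alert
    @name = N'DBM: Mirroring role change',
    @message_id = 1480,
    @severity = 0,
    @include_event_description_in = 1   -- 1 = include the error text in the email
EXEC msdb.dbo.sp_add_notification
    @alert_name = N'DBM: Mirroring role change',
    @operator_name = N'DBA Team',       -- hypothetical operator; create via sp_add_operator first
    @notification_method = 1            -- 1 = email
GO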

Additional References
Monitoring Mirroring Status
http://msdn.microsoft.com/en-us/library/ms365781.aspx