http://www.serverwatch.com/tutorials/article.php/2175621/How-to-Optimize-Queries-Theory-an-Practice.htm
http://www.sql-server-performance.com/articles/per/query_tuning_p1.aspx
Every day brings something new. This blog is an attempt to record things learnt daily while solving issues in our environment, from DBAs, colleagues, and the great collection of articles in the field. Please share your comments, alternative solutions, and suggestions.
Wednesday, November 24, 2010
Friday, November 19, 2010
Codename "Denali"
AlwaysOn high availability feature
Columnstore index
File storage: create a file system inside the database and search through file text like a regular file
Better compression
Parallel Data Warehouse (PDW) solution for > 30 TB
Faster queries and better query times in PowerPivot
Monday, September 27, 2010
Deadlock Monitoring
DBCC TRACEON(1222, -1). Once you run that in SSMS on the server, deadlock information is written to the SQL Server error log whenever a deadlock is encountered, for all sessions on the server. You can then read through those entries to find the issue.
Don't forget to run DBCC TRACEOFF(1222, -1) afterwards, since it's an intensive process.
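The steps above can be sketched as one short script (the sp_readerrorlog search string is an assumption; the exact wording of deadlock entries varies by build):

```sql
-- Enable deadlock reporting for all sessions (-1 = global).
DBCC TRACEON(1222, -1);

-- ...wait for a deadlock to occur, then search the current error log.
-- sp_readerrorlog arguments: log number (0 = current), log type (1 = SQL error log), search string.
EXEC sp_readerrorlog 0, 1, N'deadlock';

-- Turn the flag back off once you have the information you need.
DBCC TRACEOFF(1222, -1);
```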
Tuesday, August 17, 2010
Pages
Page size: 8 KB; extent: 64 KB. So when an insert to a table exceeds the free space on an 8 KB page, a page split occurs and SQL Server shifts half of the data on the previous page to a new page. Sometimes, if the extent is full, it moves the data onto a page in a different extent.
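One common way to reduce page splits on a heavily inserted index is to leave free space on each leaf page via a fill factor; a minimal sketch (the table and index names are hypothetical):

```sql
-- Rebuild the index leaving 20% free space per leaf page,
-- so new rows can be absorbed without splitting the page.
ALTER INDEX IX_Orders_CustomerId ON dbo.Orders
    REBUILD WITH (FILLFACTOR = 80);
```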
There are two types of page faults: Soft and Hard. Soft are the best; thus making hard the worst (speaking in performance terms).
In paged memory systems there are three levels of memory pages: In memory, in cache, and on disk. Most people are familiar with the in memory / on disk concept. However, most paged memory implementations keep a few recently used memory blocks in a page pool that gets written to disk when they are untouched for a certain amount of time. Basically, it's a collection of pages that are on their way out. When requesting a bit of memory that is not in the main memory banks, it goes to the page pool to see if it is a recent main memory visitor. If so, a soft page fault occurs, and the page is placed back into memory from memory (a quick operation). Pages not even in the pool have to be fetched from disk, causing a hard fault. This is the slowest and most horrible of operations for a computer and should be avoided at all costs. Too many hard page faults are often referred to as "thrashing".
Max capacity specifications, SQL Server 2008
Clustered indexes per table: 1
Nonclustered indexes per table: 999
XML indexes per table: 249
Foreign keys per table: 253
Columns per FK, PK, or index key: 16
Databases per instance, filegroups per database, files per database: 32,767 each
Max database size: 524,272 TB
Instances per computer: 50 standalone, 25 on a failover cluster
Max number of objects in a database (views, procs, indexes, keys, triggers): 2,147,483,647 (triggers are limited only by the max number of objects)
For more details: http://msdn.microsoft.com/en-us/library/ms143432.aspx
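To see how close a database's tables come to the nonclustered index limit, a query like the following can help (a sketch against the standard catalog views):

```sql
-- Nonclustered indexes per table (type = 2); the SQL Server 2008 limit is 999.
SELECT OBJECT_NAME(object_id) AS table_name,
       COUNT(*) AS nonclustered_index_count
FROM sys.indexes
WHERE type = 2
GROUP BY object_id
ORDER BY nonclustered_index_count DESC;
```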
Wednesday, August 11, 2010
Transaction Logs
Transaction log full:
If you reach 100% of the log size and 100% of the disk size, you'll need to clear the log and shrink it using DBCC SHRINKFILE.
If you reach 100% of the log size but not 100% of the disk size, and the database is in the simple recovery model, you either need to issue a database checkpoint (CHECKPOINT) or increase the log file size (though the latter is a temporary solution).
If you reach 100% of the log size but not 100% of the disk size, and the database is in the full or bulk-logged recovery model, you (most likely) need to take a log backup (and keep doing so regularly).
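The remedies above look roughly like this (the database name, log file logical name, and backup path are all hypothetical):

```sql
-- Full/bulk-logged recovery: back up the log so the inactive portion can be reused.
BACKUP LOG MyDb TO DISK = N'D:\Backups\MyDb_log.trn';

-- Simple recovery: a checkpoint allows the inactive portion of the log to be truncated.
CHECKPOINT;

-- If the physical log file itself must also be made smaller:
USE MyDb;
DBCC SHRINKFILE (MyDb_log, 512);  -- target size in MB
```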
Tuesday, June 1, 2010
List all columns in all tables in a database using FOR XML PATH
SELECT db_name() AS databaseName,
       table_name = sysobjects.name,
       (SELECT syscolumns.name + ','
        FROM syscolumns
        WHERE sysobjects.id = syscolumns.id
        ORDER BY syscolumns.name
        FOR XML PATH('')) AS columnlist
FROM sysobjects
WHERE sysobjects.xtype = 'U'
ORDER BY sysobjects.name
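On SQL Server 2005 and later, the same list can be produced from the newer catalog views, which are the recommended replacement for sysobjects/syscolumns:

```sql
SELECT db_name() AS databaseName,
       t.name AS table_name,
       (SELECT c.name + ','
        FROM sys.columns c
        WHERE c.object_id = t.object_id
        ORDER BY c.name
        FOR XML PATH('')) AS columnlist
FROM sys.tables t
ORDER BY t.name;
```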
Friday, May 14, 2010
Passing variables to a .sql file from a bat file
sqlcmd -S %1 -i select.sql -v dbname=%2 tablename=%3 -o %4
select.sql
use $(dbname)
select * from $(tablename)
Alternatively, in select.sql you can capture the value into a T-SQL variable:
DECLARE @dbname1 AS varchar(150)
SET @dbname1 = '$(dbname)'
and then use it throughout the script to build dynamic SQL.
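A select.sql can also declare defaults with the :setvar sqlcmd directive, so it still runs on its own (the values here are just for illustration; note that a :setvar inside the script may override a -v value passed on the command line):

```sql
-- sqlcmd scripting variables with in-script defaults.
:setvar dbname master
:setvar tablename spt_values

USE $(dbname);
SELECT TOP (5) * FROM dbo.$(tablename);
```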
Friday, March 5, 2010
Dynamically passing variables to a .sql file through a bat file
runscript.bat
REM echo %2 >> log\all.log
sqlcmd -S -d -E -i upgrade.sql -olog\upgrade%2.log -v servername=%1 sitecode=%2
type log\upgrade%2.log >> log\all.log
:END
upgrade.sql:
DECLARE @vsql NVARCHAR(4000)
SET @vsql = ''
Declare @serverName NVARCHAR(255)
Declare @sitecode NVARCHAR(50)
set @serverName = '$(servername)'
set @sitecode = '$(sitecode)'
Print @serverName
Print @sitecode
SET @vsql = 'SET IDENTITY_INSERT dbo.[] ON
insert into dbo.[GL_Site_SourceTests]
(,)
select distinct,
from ['+@serverName +'].'+ @sitecode +'_dmss.dbo.[] i
where not exists (select 1 from dbo.[] s
where i.site_tk = s.site_tk
and i.gl_site_sourceTest_tk = s.gl_site_sourceTest_tk)
SET IDENTITY_INSERT dbo.[] OFF'
--Print @vsql
EXEC sp_executesql @vsql
Monday, March 1, 2010
Drop all constraints on a table
-- t-sql scriptlet to drop all constraints on a table
SELECT
'ALTER TABLE ' + object_name(parent_obj) + ' DROP CONSTRAINT ' + [name]
AS Script into #temp
from sysobjects where --xtype IN ('F','TR')
object_name(parent_obj) like '%%'
and (name not in (',')
and name not like '%%')
--drop table #temp
SELECT * FROM #temp
DECLARE @vsql NVARCHAR(4000) -- holds each generated ALTER TABLE statement
DECLARE get_sqlstmt CURSOR FOR
select Script
FROM #temp
OPEN get_sqlstmt
FETCH NEXT FROM get_sqlstmt
INTO @vsql
WHILE @@FETCH_STATUS = 0
BEGIN
Print (@vsql)
EXEC(@vsql)
FETCH NEXT FROM get_sqlstmt INTO @vsql
END
CLOSE get_sqlstmt
DEALLOCATE get_sqlstmt
Drop table #temp
Monday, February 8, 2010
Asp.net and SQL privileges
When a .net app stops working and it has connections to database.
e.g.: If ExecuteReader works but ExecuteNonQuery fails, it means either the user does not have write privileges, or the process runs under an identity (usually ASPNET) that has only read access to the database. To fix this, right-click the database file in Windows Explorer, go to the Security tab, and give the ASPNET user write and modify rights to the file.
Tuesday, January 12, 2010
Generate Insert statement for multiple tables dynamically
IF OBJECT_ID('dbo.InsertGenerator', 'P') IS NOT NULL
DROP PROC InsertGenerator
go
CREATE PROC InsertGenerator
(@tableName varchar(100)) as
--Declare a cursor to retrieve column specific information for the specified table
DECLARE cursCol CURSOR FAST_FORWARD FOR
SELECT column_name,data_type,table_schema FROM information_schema.columns WHERE table_name = @tableName
OPEN cursCol
DECLARE @string nvarchar(3000) --for storing the first half of INSERT statement
DECLARE @stringData nvarchar(3000) --for storing the data (VALUES) related statement
DECLARE @dataType nvarchar(1000) --data types returned for respective columns
DECLARE @schemaType nvarchar(100) --table schema value for correct schema execution
SET @stringData=''
DECLARE @Query nvarchar(4000)
DECLARE @colName nvarchar(50)
FETCH NEXT FROM cursCol INTO @colName,@dataType,@schemaType
IF @@fetch_status<>0
begin
print 'Table '+@tableName+' not found, processing skipped.'
close curscol
deallocate curscol
return
END
WHILE @@FETCH_STATUS=0
BEGIN
SET @stringData=@stringData + @colName+','
--SET @string=@string+@colName+','
FETCH NEXT FROM cursCol INTO @colName,@dataType,@schemaType
END
set @stringData = substring(@stringData,0,len(@stringData))
SET @string='INSERT INTO '+@schemaType+'.'+@tableName+'(' +@stringData +')'
SET @query =@string +'SELECT '+@stringData+' FROM '+@schemaType+'.'+@tableName
Print @query
--exec sp_executesql @query
--select @query
CLOSE cursCol
DEALLOCATE cursCol
GO
Run it as: EXEC InsertGenerator @tableName