Thursday, October 29, 2009

Dimension Processing OLAP --- Processing Options

--Processing options for OLAP dimensions

The ProcessUpdate option is a lighter-weight processing option: it picks up updates, inserts, and deletes in the dimension data, and the cube itself does not have to be fully reprocessed.

The ProcessDefault option does not detect data changes; it only brings unprocessed or partially processed objects up to a processed state.

ProcessAdd detects member additions only (new rows in the dimension table).

The ProcessFull option fully reprocesses the dimension and requires the cube to be reprocessed as well.

Creating an Index, Rebuilding an Index

--drop an existing index
drop index idx_repid on dbo.tblrepsmaster
--create the index; it is advisable to rebuild indexes periodically, e.g. once a month
create unique index idx_repid on dbo.tblrepsmaster(repid)
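As an alternative to dropping and re-creating the index, it can be rebuilt in place with ALTER INDEX; a minimal sketch using the same index and table names:

--rebuild the index in place without dropping it first
alter index idx_repid on dbo.tblrepsmaster rebuild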

Monday, October 26, 2009

Execute Packages with DTExecUI and DTExec

An SSIS package can be executed using DTExecUI or DTExec.

DTExecUI can be called from the command prompt and brings up a Windows interface for running the SSIS package.

DTExec is run directly from the command prompt. The syntax looks like this:

DTExec /F mypackage.dtsx /SET \Package.Variables[User::PackagePath].Value;"c:\ssis"
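A package can also be launched from T-SQL by shelling out to DTExec; a minimal sketch, assuming xp_cmdshell is enabled and a package file at the path shown:

--run a package from T-SQL via the command shell (xp_cmdshell must be enabled)
exec master.dbo.xp_cmdshell 'DTExec /F "c:\ssis\mypackage.dtsx"'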

Maximum error count property - Control flow

The MaximumErrorCount control flow property allows a package to continue executing even after errors have occurred; it specifies the maximum number of errors allowed before the package fails.

For example, the MaximumErrorCount can be set on a Sequence container or a For Loop container.

Report Builder Model

There are two types of Report Builder models:
1. Relational report models
2. SSAS report models

Relational report models are best created using BIDS (Business Intelligence Development Studio).

SSAS report models are best created using Reporting Services and SSMS; they cannot be created using BIDS.

Creating a relational report model in BIDS is a three-step process:
1. create the data source, 2. create the data source view, and 3. create the report model.

To create an SSAS report model using Reporting Services: log in to http://localhost/reports, select the data source folder, create a data source, click the Generate Model button, change the location of the model path if required, and click OK to create the model.

Thursday, October 22, 2009

High Availability - Data

Level 1
Our company has certain data that requires high availability, so we use database mirroring along with a witness (debatable) to make sure we can switch over with no data loss.
Having said that, the challenge here is to maintain not only the database but also the SSIS packages and SSRS reports hosted on the server.
Since this information is stored in msdb, it becomes very critical: as far as I know, we cannot replicate the msdb database, as it holds references to the original server.
So we planned to replicate the entire set of SSIS packages and SSRS reports onto the mirror server.
The next challenge is that the principal server has certain data integrations to another SQL Server. These integrations have to be replicated, but the server reference has to be changed if linked servers are used.
The next challenge was doing database mirroring between different locations, e.g. the US and Europe. We achieved this by creating a common domain for the SQL Servers and it worked like a charm. Speed could be an issue; I will keep this posted based on the performance.
Level 1 gives the comfort of high availability of data with automatic failover, even in the case of a server failure.
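For reference, the mirroring partnership itself is established with ALTER DATABASE ... SET PARTNER; a minimal sketch, where SalesDB and the endpoint addresses are placeholders:

--on the mirror server, point back to the principal
ALTER DATABASE SalesDB SET PARTNER = 'TCP://principalserver.mydomain.com:5022'
--then on the principal server, point to the mirror and add the witness
ALTER DATABASE SalesDB SET PARTNER = 'TCP://mirrorserver.mydomain.com:5022'
ALTER DATABASE SalesDB SET WITNESS = 'TCP://witnessserver.mydomain.com:5022'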

The following levels may be expensive, but they can be worth it depending on the data requirements.


Level 2

The data is backed up in full daily and transaction log backups are taken every 2 hours.
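A minimal sketch of the corresponding backup commands, with the database name and paths as placeholders (each would typically be scheduled as a SQL Server Agent job):

--nightly full backup
BACKUP DATABASE SalesDB TO DISK = 'd:\backups\SalesDB.bak'
--transaction log backup, run every 2 hours
BACKUP LOG SalesDB TO DISK = 'd:\backups\SalesDB_log.trn'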
Level 3

The backup data is stored off-site, external to the building.

Fail over --- Role Switching - Database Mirroring

1. Automatic failover using a witness server.

2. Manual failover

  • This can be done in SQL Server Management Studio
  • Right-click the database --- Properties --- Mirroring --- click Failover (or use the T-SQL sketch after this list)

3. Forced Failover

Use this query for a forced failover (run on the mirror server when the principal is unavailable):

ALTER DATABASE [database_name] SET PARTNER FORCE_SERVICE_ALLOW_DATA_LOSS
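The manual failover in item 2 also has a T-SQL equivalent; a minimal sketch, run on the principal server, with the database name as a placeholder:

--manual failover; run on the principal while the mirror is synchronized
ALTER DATABASE [database_name] SET PARTNER FAILOVER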

Thursday, October 8, 2009

Copy Data with Headings from the Query Results Grid

-- To copy the data, with column headers, from the results grid returned by a query:

In SQL Server Management Studio --- click Query --- Query Options --- under Results --- click Grid --- check the box "Include column headers when copying the results".

Replace String Function

-- to replace characters in a field, the following query can be used
-- in this example 'Quarter' was misspelt as 'Quartor', so we use this query to correct it
-- quarter_name is the field name
-- 'Quartor' is the string to be replaced
-- 'Quarter' is the replacement string
update tblbsbcalendar set quarter_name = replace(quarter_name,'Quartor','Quarter')
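It is worth previewing the change with a SELECT before running the UPDATE; a minimal sketch against the same table:

-- preview the affected rows before updating
select quarter_name, replace(quarter_name,'Quartor','Quarter') as corrected_name
from tblbsbcalendar
where quarter_name like '%Quartor%'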

Thursday, October 1, 2009

SQL Server service pack versions for SQL 2005

--select @@version

Release                          Sqlservr.exe
RTM                              2005.90.1399
SQL Server 2005 Service Pack 1   2005.90.2047
SQL Server 2005 Service Pack 2   2005.90.3042
SQL Server 2005 Service Pack 3   2005.90.4035
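The build can also be read in a structured form with SERVERPROPERTY; a minimal sketch:

-- returns the build number, service pack level, and edition
select SERVERPROPERTY('ProductVersion') as version,
       SERVERPROPERTY('ProductLevel') as service_pack,
       SERVERPROPERTY('Edition') as edition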

SQL SSRS Subscriptions and schedule

-- ssrs subscriptions list from the ReportServer database
select * from dbo.Subscriptions


-- ssrs schedules list from the ReportServer database
select * from dbo.Schedule
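The two lists can be tied together through the dbo.ReportSchedule linking table; a minimal sketch, assuming the standard ReportServer catalog schema:

-- each subscription with the schedule that triggers it
select sub.Description, sub.LastStatus, sch.Name as schedule_name
from dbo.Subscriptions sub
join dbo.ReportSchedule rs on rs.SubscriptionID = sub.SubscriptionID
join dbo.Schedule sch on sch.ScheduleID = rs.ScheduleID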

SQL DTS Jobs and schedules - Maintenance

--to list the SQL Agent jobs on a server; run this in msdb
select * from dbo.sysjobs

--to list the job-to-schedule mappings; run this in msdb
select * from dbo.sysjobschedules

--to view the job execution history; run this in msdb
select * from dbo.sysjobhistory


--the complete list of schedules defined under the SQL Agent jobs; run this in msdb
select * from dbo.sysschedules
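These tables can be joined to list each job alongside the schedule attached to it; a minimal sketch over the msdb tables above:

--each job with the name of its schedule
select j.name as job_name, s.name as schedule_name
from dbo.sysjobs j
join dbo.sysjobschedules js on js.job_id = j.job_id
join dbo.sysschedules s on s.schedule_id = js.schedule_id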

Shrink SQL Transaction Log file ldf

--If the transaction log file has grown large, the following steps can be taken.
Option 1
--Back up the transaction log with the truncate-only option and shrink the file. This is not
--recommended because there could be data loss (it breaks the log backup chain); this can be
--avoided by taking a full backup immediately after the shrink is completed. I have the code posted in my blog.
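A minimal sketch of option 1 on SQL Server 2005, with the database name, logical log file name, and backup path as placeholders:

--truncate the log (not recommended; breaks the log backup chain)
BACKUP LOG SalesDB WITH TRUNCATE_ONLY
--shrink the log file down to about 1 GB (target size is in MB)
DBCC SHRINKFILE (SalesDB_log, 1024)
--take a full backup immediately afterwards to restart the backup chain
BACKUP DATABASE SalesDB TO DISK = 'd:\backups\SalesDB.bak'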

Option 2
Check the database properties --- Options --- set the Recovery model to Simple
and Auto Shrink to True.
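The same settings have a T-SQL equivalent; a minimal sketch with the database name as a placeholder:

--switch to the simple recovery model and enable auto shrink
ALTER DATABASE SalesDB SET RECOVERY SIMPLE
ALTER DATABASE SalesDB SET AUTO_SHRINK ON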

We had a scenario where the ldf file had grown to 600 GB. We had nightly full and transaction log backups set up, but since the recovery model was set to Full and Auto Shrink was off, the file kept growing to 600 GB.
With the settings from option 2 this worked and brought the log file back down to 1 GB.