Posted by Barac in Azure, SQL Server
on Oct 30th, 2017 | 0 comments
First of all, we need to log in to the Azure Portal and create a logical SQL Server on Azure. Next, we need to enter the name of the server, as well as a server admin login account for this Azure logical SQL Server. After that, we can create a new resource group or use an existing one. When we finish this part of the configuration, we can start creating the Azure SQL Database. We need to define the database name and the subscription we want to use for this SQL Database. After that, we need to select the source of our database (blank, sample, or restore from backup). Then we need to choose...
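Once the logical server exists, the database itself can also be created with plain T-SQL against the master database, instead of clicking through the portal. A minimal sketch, assuming placeholder values for the database name, edition, and service objective:

-- Run against the master database of the Azure logical SQL Server.
-- Database name, edition, and service objective are example values.
CREATE DATABASE [MyAppDb]
(
    EDITION = 'Standard',        -- pricing tier
    SERVICE_OBJECTIVE = 'S0',    -- performance level within the tier
    MAXSIZE = 10 GB
);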
Posted by Barac in Azure, SQL Server
on Oct 26th, 2017 | 6 comments
You want to create a new login on an Azure logical SQL Server using your Azure admin account. To do that, you must be connected to the master database on SQL Azure with the administrative login.

CREATE LOGIN [azure_db_user] WITH PASSWORD = N'password'
GO

After that, you need to create a new DB user for that login (db_owner in this example). To be able to do that, you must be connected as an admin to the database where you want to create that user.

CREATE USER azure_db_user FOR LOGIN azure_db_user
GO
EXEC sp_addrolemember N'db_owner', N'azure_db_user'
GO

Now, if you try to connect to that db using...
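Worth noting: Azure SQL Database also supports contained database users, which skip the server-level login in master entirely. A small sketch, assuming a placeholder user name and password:

-- Run in the target user database, not in master.
-- Creates a contained user that authenticates at the database level.
CREATE USER [contained_db_user] WITH PASSWORD = N'StrongP@ssw0rd!'
GO
EXEC sp_addrolemember N'db_owner', N'contained_db_user'
GO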
Posted by Barac in SQL Server, SQL Tips and Tricks
on Oct 11th, 2017 | 2 comments
“Operating system error 32 (The process cannot access the file because it is being used by another process)” message in the SQL Server error log:

BackupDiskFile::OpenMedia: Backup device ‘\\NAS\Shared_LogShip\DB_log.trn’ failed to open. Operating system error 32 (The process cannot access the file because it is being used by another process.).

If you are using a shared NAS as the log backup location for the primary server and you are restoring those logs directly to the secondary server (without a copy job), you can face this kind of issue. If the backup and restore jobs are scheduled at the...
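One quick way to spot colliding schedules is to list the log shipping jobs and their start times straight from msdb. A rough sketch, assuming the default LSBackup/LSRestore job name prefixes:

-- Lists SQL Agent jobs matching the default log shipping name prefixes,
-- together with their schedule start times (HHMMSS integer format).
SELECT j.name AS job_name,
       s.name AS schedule_name,
       s.active_start_time
FROM msdb.dbo.sysjobs j
JOIN msdb.dbo.sysjobschedules js ON js.job_id = j.job_id
JOIN msdb.dbo.sysschedules s ON s.schedule_id = js.schedule_id
WHERE j.name LIKE 'LSBackup%' OR j.name LIKE 'LSRestore%';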
Posted by Barac in SQL Server, SQL Tips and Tricks
on Oct 9th, 2017 | 0 comments
If you are planning to upgrade to a new SQL Server version (2016 or 2017) and there is log shipping in place between those servers, you should follow a specific upgrade order. Always upgrade the secondary SQL Server instances first, before upgrading the primary server instance. If you do the opposite, log shipping will fail, because a log backup created on a newer version of SQL Server cannot be restored on an older version of SQL Server. But this is not the reason why I am writing this post. I am writing to alert you that in case you have a secondary database configured by using RESTORE WITH STANDBY,...
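If you are unsure which version produced a given log backup, the backup header records it. A small sketch (the path is a placeholder):

-- SoftwareVersionMajor in the output shows which SQL Server version
-- wrote the backup (11 = 2012, 13 = 2016, 14 = 2017).
RESTORE HEADERONLY
FROM DISK = N'\\NAS\Shared_LogShip\DB_log.trn';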
Posted by Barac in SQL Monitoring, SQL Server
on Oct 9th, 2017 | 0 comments
Recently I upgraded from SQL Server 2012 to version 2016, and I had some PRTG maps in place for two really important queries. After the upgrade, monitoring failed: the query hash had changed for the exact same query. Honestly, I had not paid much attention to that before, since the majority of those PRTG maps were checking the same version of SQL Server. Also, I was under the impression that the query optimizer computes the query hash during query compilation based on the query logic. In that case, if the query hash changes, that also means the query has changed. I calculate query speed based on...
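For reference, that hash comes from sys.dm_exec_query_stats. A minimal sketch for looking up the hash of a known query, with a placeholder LIKE filter:

-- Shows the query_hash the engine assigned to matching cached statements.
SELECT qs.query_hash,
       qs.execution_count,
       st.text
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
WHERE st.text LIKE N'%YourQueryText%';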
Posted by Barac in SQL Monitoring, SQL Server
on Oct 8th, 2017 | 0 comments
Yesterday, one of my SQL alerts sent me an email about my daily backup size. I noticed that my backup had doubled in size. What happened is that I changed the Task Scheduler task which runs my PowerShell backup script and accidentally ran the backup twice for the same backup set. Nevertheless, SQL Server is smart enough: instead of overwriting your existing backup file, it just appends the new one. You can see from the picture below that the default for SQL Server (2012 in this case) is “append to the existing backup set”. This can be confusing, but it is really the safest option for the user. In this...
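That append/overwrite choice maps to the NOINIT and INIT backup options in T-SQL. A hedged sketch, with placeholder database name and path:

-- NOINIT (the default) appends to an existing backup set in the file;
-- INIT would overwrite it instead.
BACKUP DATABASE [MyDb]
TO DISK = N'C:\Backup\MyDb.bak'
WITH NOINIT;

-- RESTORE HEADERONLY lists every backup set inside the file,
-- so a doubled file shows two entries.
RESTORE HEADERONLY FROM DISK = N'C:\Backup\MyDb.bak';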
Posted by Barac in Index Usage Report Project, SQL Monitoring, SQL Server
on Oct 8th, 2017 | 0 comments
I created this ETL to fetch and save expensive execution plans daily. When the ETL process runs again, it first creates a directory and copies all plans from the previous day into it, and only after that does it fetch the new execution plans. This way I can look at an old plan later if I like. SSIS Package Description. Control Flow Screen. I am using a File System Task to create a new directory, named after the previous day, to store yesterday's execution plans. As you can see, I am using the variable FolderPath to build the folder name from the previous day's date. This is the expression I used. From the evaluated value...
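The expression itself is SSIS syntax, but the date logic behind the folder name is easy to prototype in T-SQL first. A small sketch, with a placeholder base path:

-- Builds a folder name from yesterday's date, e.g. '\\Server\Plans\2017-10-07'.
DECLARE @FolderPath nvarchar(260) =
    N'\\Server\Plans\' + CONVERT(nvarchar(10), DATEADD(DAY, -1, GETDATE()), 120);
SELECT @FolderPath AS folder_path;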
Posted by Barac in Index Usage Report Project, SQL Monitoring, SQL Server
on Oct 8th, 2017 | 0 comments
Index Usage Report Project / Expensive Queries Table
I created this ETL process (SSIS) to populate my WH Expensive Queries table, so I can later join to this table on plan_handle to determine whether an index is used within those expensive queries. Every time I run this ETL, I truncate this WH table and insert the new rows. Control Flow > SQL Task. After that, under the Data Flow, I created the data source and destination of my ETL process. For the OLE DB Source I used this SQL command:

SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
WITH XMLNAMESPACES (DEFAULT...
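The excerpt cuts off mid-query, but the general shape of an expensive-queries extract over the plan cache looks roughly like this. A hedged sketch, not the post's exact query: the TOP count, the CPU metric, and the column list are assumptions; the WITH XMLNAMESPACES clause uses the standard showplan namespace:

SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
-- Default namespace so XQuery over the plan XML needs no prefixes.
WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT TOP (50)
       qs.plan_handle,
       qs.execution_count,
       qs.total_worker_time / qs.execution_count AS avg_cpu_time,
       qp.query_plan
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
ORDER BY avg_cpu_time DESC;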
Posted by Barac in Index Usage Report Project, SQL Monitoring, SQL Server
on Oct 5th, 2017 | 0 comments
Index Usage Report Project
The general project idea was to create a report about the usage of every index within a DB: to be able to see the daily number of scans, seeks, and page splits for all indexes, or key lookups for clustered ones; to change the date range to see how an index behaved in the past; and to see all expensive queries using that index (the red ones are expensive), and to view and save their execution plans as well. Project concept. Index Usage Stats system views: [sys].[indexes] – basic index information (name, type, etc)...
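The core of such a report is joining sys.indexes to sys.dm_db_index_usage_stats. A minimal sketch for the current database (the counters reset on instance restart):

-- Per-index seek/scan/lookup/update counters for user tables.
SELECT OBJECT_NAME(i.object_id) AS table_name,
       i.name AS index_name,
       us.user_seeks,
       us.user_scans,
       us.user_lookups,
       us.user_updates
FROM sys.indexes i
LEFT JOIN sys.dm_db_index_usage_stats us
       ON us.object_id = i.object_id
      AND us.index_id = i.index_id
      AND us.database_id = DB_ID()
WHERE OBJECTPROPERTY(i.object_id, 'IsUserTable') = 1
ORDER BY table_name, index_name;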
Posted by Barac in Azure, Powershell, SQL Server
on Oct 4th, 2017 | 5 comments
With this simple script, you can back up all databases to Azure daily; you can hardcode the parameter values or use Task Scheduler to pass them to the PS script. You need to have the Azure PowerShell modules installed on your machine (how to). First of all, let's try just to connect to your Azure Storage Account. You can use an Azure publish settings file to do that, or set the variables with the access key, subscription, and storage account name as in this example:

#Variables
param (
    [string]$AzureKey = "AzureStorageAccessKey",
    [string]$AzureSubs = "YourSubscription",
    [string]$AzureStor=...
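For comparison, the same end result can be reached from T-SQL with backup to URL. A hedged sketch, with placeholder storage account, container, credential, and database names, using the access-key style credential that matches the script's variables:

-- One-time setup: a credential holding the storage account name and access key.
CREATE CREDENTIAL AzureBackupCred
WITH IDENTITY = 'yourstorageaccount',   -- storage account name
     SECRET = 'YourStorageAccessKey';   -- storage account access key

-- Back up a database straight to blob storage.
BACKUP DATABASE [MyDb]
TO URL = N'https://yourstorageaccount.blob.core.windows.net/backups/MyDb.bak'
WITH CREDENTIAL = 'AzureBackupCred', COMPRESSION;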