How to determine Azure SQL Database Tier

Recently, I got a lot of questions about how to determine the tier and performance level for Azure SQL Database, so I decided to write my next post about this topic.

To determine the correct tier for a specific workload, you have to take multiple parameters into account.

First, let's talk about Azure SQL Database tiers. Azure SQL comes in three tiers: Basic, Standard and Premium. The Basic tier has only one performance level, for Standard you can choose from S0 to S3, and for Premium you can choose from P1 to P15. Every tier has its limitations, and every performance level adds further limits.
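If you want to check which tier and performance level a database is currently running on, you can query it with DATABASEPROPERTYEX. This is only a minimal sketch; 'MyDatabase' is a placeholder for your own database name.

-- Run against the Azure SQL database you want to inspect ('MyDatabase' is a placeholder).
SELECT
    DATABASEPROPERTYEX('MyDatabase', 'Edition') AS CurrentTier,               -- Basic, Standard or Premium
    DATABASEPROPERTYEX('MyDatabase', 'ServiceObjective') AS PerformanceLevel; -- e.g. S0, S3, P1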

The Basic tier is suited for small databases used for development or small applications with low activity. The Standard tier is for applications with low to medium IO performance needs. The Premium tier is for high transactional volume with high IO performance and supports many concurrent queries.

The first limit is database size. For the Basic tier, the maximum database size is 2 GB. The Standard tier has a maximum size of 250 GB, and the Premium tier has a maximum size of 500 GB (P1-P6) or 1024 GB (P11-P15).
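To see how close your database is to the size limit, you can compare the allocated file size with the configured maximum. A minimal sketch, run inside the database itself:

-- Allocated size of all data and log files (size is in 8 KB pages) versus the configured cap.
SELECT
    SUM(size) * 8.0 / 1024 / 1024 AS AllocatedSizeGB,
    CAST(DATABASEPROPERTYEX(DB_NAME(), 'MaxSizeInBytes') AS bigint) / 1024.0 / 1024 / 1024 AS MaxSizeGB
FROM sys.database_files;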

There are limits on concurrent logins and sessions as well. The Basic tier supports up to 30 concurrent logins and up to 300 concurrent sessions. For the Standard and Premium tiers, the login and session limits depend on the performance level. Standard is limited to 60 (S0) – 200 (S3) concurrent logins and 600 (S0) – 2,400 (S3) concurrent sessions. Premium is limited to 200 (P1) – 6,400 (P15) concurrent logins and 30,000 concurrent sessions.
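To get a feel for how many sessions and requests your workload actually opens, you can count them with the standard DMVs. A rough sketch, assuming you run it against the existing server during peak hours:

-- Current user sessions and active requests at the moment the query runs.
SELECT
    (SELECT COUNT(*) FROM sys.dm_exec_sessions WHERE is_user_process = 1) AS UserSessions,
    (SELECT COUNT(*)
     FROM sys.dm_exec_requests r
     JOIN sys.dm_exec_sessions s ON r.session_id = s.session_id
     WHERE s.is_user_process = 1) AS ActiveRequests;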

The most confusing parameter in Azure SQL Database is DTUs, or Database Transaction Units. The Basic tier is limited to 5 DTUs, the Standard tier goes from 10 DTUs for S0 to 100 DTUs for S3, and Premium goes from 125 DTUs for P1 to 4,000 DTUs for P15.

As I already said, this is the most confusing part, because most people don't understand what DTUs are and what impact they have on performance. A DTU is a blended measure of CPU, memory, data I/O and transaction log I/O. Since I/O mostly depends on disk speed, not much can be changed in that department; there are only two options, Standard and Premium storage. What can be changed in many ways, however, are CPU and memory. So changing the performance level to increase DTUs basically gives your Azure SQL database more computing power (cores and memory). Changing your performance level from S0 (which has 10 DTUs) to S3 (with 100 DTUs) changes your database performance 10x by giving it 10x more computing power.
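You can see which of those components your database is actually burning through by looking at sys.dm_db_resource_stats, which reports each one as a percentage of the current DTU limit in 15-second intervals. A minimal sketch; the scale-up statement at the end uses 'MyDatabase' as a placeholder name:

-- Last few 15-second intervals: each column is a percentage of the current DTU limit.
SELECT TOP (20)
    end_time,
    avg_cpu_percent,
    avg_data_io_percent,
    avg_log_write_percent,
    avg_memory_usage_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;

-- If one component sits near 100% most of the time, move to a higher performance level, e.g.:
ALTER DATABASE [MyDatabase] MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S3');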

So, all parameters must be taken into account: the size of the database, how that size changes over time, the number of concurrent users, the workload…

The correct tier is especially hard to determine if you choose to migrate your database from an on-premises SQL Server to Azure SQL. The amount of memory and CPU isn't expressed in the traditional way but in DTUs. Fortunately, there is a tool that can help you there. The Azure SQL Database DTU Calculator is a tool that you can run on your local server to get an estimate of the DTUs needed for your database. The trick is that it monitors the performance of the whole SQL Server instance and not of a single database, so if you have more than one database on your server, the estimate can be very wrong. There is also the option of an Elastic Database Pool, where resources are limited at the server level and not per database. In this case, you can move all your databases into a single elastic pool with enough DTUs to cover the estimated performance of your local server.
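Once the databases are running in Azure, you can sanity-check that estimate: the master database of the logical server keeps per-database history in sys.resource_stats, with averages over roughly 5-minute intervals, which is handy when sizing an elastic pool. A minimal sketch using only a subset of its columns:

-- Run in the master database of the logical server; peak consumption per database over the last week.
SELECT
    database_name,
    MAX(avg_cpu_percent) AS peak_cpu_percent,
    MAX(avg_data_io_percent) AS peak_data_io_percent,
    MAX(avg_log_write_percent) AS peak_log_write_percent
FROM sys.resource_stats
WHERE start_time >= DATEADD(day, -7, GETUTCDATE())
GROUP BY database_name
ORDER BY peak_cpu_percent DESC;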

Once again, when determining the Azure SQL Database tier and performance level, you have to take into account: the size of the database (and how it changes over time), the number of users (connections/sessions) and the DTUs.

Article by Mustafa Toroman

I'm a System Engineer at Authority Partners. I love to test and explore new technologies. I have over 20 active MS certificates, such as MCSA for Windows Server and SQL Server, and MCSE Private Cloud, Data Platform, Business Intelligence and Server Infrastructure. Lately I'm most interested in BI and Azure solutions. MCT since 2012. MVP for Microsoft Azure since 2016.