Cloud technologies provide many benefits as an infrastructure for your data tier. The Cloud is an ideal platform for development and offers greater flexibility and scalability for production workloads too. Service providers can take care of routine management tasks such as patching, backups, security and disaster recovery (DR), leaving data centre staff free to tackle other, potentially more interesting, projects.
Many of our customers have already relocated their database(s) to the Cloud while others are still in the planning stages. Some are simply moving their on-premises database to a Cloud Infrastructure as a Service (IaaS) platform while others are migrating to a native Cloud database such as Azure SQL and using Platform as a Service (PaaS) to build next generation solutions.
A question that crops up frequently relates to the preparatory work that needs to be done before moving an on-premises SQL Server database to the Cloud. Much depends on whether the migration introduces incompatibilities, which might be the case if you’re moving from SQL Server to Azure SQL. That aside, there are some more general preparatory steps that can be taken, and the Grey Matter Services team can guide you through the logistics.
I’m going to deal with one aspect of this that can make a real difference to the success of the project. It will help if the database and the systems it supports have been optimised before migration, and here’s why:
If you migrate your on-premises chaos to the Cloud you end up with chaos in the Cloud (plus an extra layer of complexity to deal with)!
Databases are no exception and there are a number of useful things that can be done to get the database in good shape prior to migration. These mostly involve optimisation and performance tuning which covers a broad range of things such as queries, memory, stored procedures, execution plans and more. Migrations also provide an opportunity to look beyond the database itself and consider other aspects of your IT that can be optimised or re-thought in order to maximise ROI from the migration project.
If you’re already working with SQL Server you’ll be familiar with the tools Microsoft provides for monitoring database events and fine-tuning the physical design to optimise performance. These tools are useful when things go wrong, but they can also be used to give the database a pre-flight check. If the database hasn’t received routine check-ups in the past, or you’re less familiar with these tools, the following is a brief overview.
Microsoft Performance and Monitoring tools
Getting things finely tuned requires an understanding of the steps involved in executing a database process such as a transaction. SQL Server Profiler allows you to track and record the steps that a process executes, and the data can be stored in a table or file for subsequent analysis. The captured information can be used to run a step-by-step replay to establish exactly what occurred. The SQL Server Distributed Replay tool will let you replay the trace data on multiple machines to simulate a production workload and provide a better indication of real-life conditions.
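As a minimal sketch of the analysis side of this, a saved Profiler trace file can be loaded into a table with the built-in `sys.fn_trace_gettable` function and queried like any other data. The file path and table name below are illustrative assumptions:

```sql
-- Load a saved Profiler trace file into a table for analysis.
-- The trace file path and target table name are placeholders.
SELECT *
INTO dbo.MigrationTraceAnalysis
FROM sys.fn_trace_gettable(N'C:\Traces\PreMigration.trc', DEFAULT);

-- Example: find the longest-running statements captured in the trace.
SELECT TOP (20) TextData, Duration, CPU, Reads, Writes
FROM dbo.MigrationTraceAnalysis
ORDER BY Duration DESC;
```

Once the trace is in a table, the usual T-SQL aggregation and filtering tools are at your disposal for spotting the queries most in need of attention before migration.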
Performance and resource utilisation are closely related, and you can gain insight into this using System Monitor, which will track counters such as Buffer Manager page requests. This tool mainly deals with the statistical side of things, allowing you to set thresholds for the number of active transactions, CPU activity and the like. SQL Server Management Studio (SSMS) has an Activity Monitor which is useful for getting a view into current activity. It provides real-time feedback about query execution steps and will help you pinpoint and debug the cause of performance degradation.
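The same counters that System Monitor exposes can also be read from within SQL Server via the `sys.dm_os_performance_counters` dynamic management view, which is handy for scripted pre-migration checks. A brief sketch:

```sql
-- Snapshot a few Buffer Manager counters from inside SQL Server.
SELECT object_name, counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%Buffer Manager%'
  AND counter_name IN (N'Page reads/sec', N'Page writes/sec',
                       N'Page life expectancy');
```

Capturing a baseline like this before migration gives you something concrete to compare against once the workload is running in the Cloud.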
A number of System Stored Procedures will help with monitoring tasks too. Database Console Command (DBCC) statements allow you to check performance statistics and the logical and physical consistency of the database. Some of SQL Server’s built-in functions can display snapshot statistics about SQL Server activity since the server was started up, including CPU execution time, SQL Server connection counts and network packet errors. For diagnosing other problematic issues, you can use Trace Flags.
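To make this concrete, here is a brief sketch combining a DBCC consistency check with the built-in functions that report cumulative statistics since start-up (the database name is a placeholder):

```sql
-- Check the logical and physical integrity of the database.
DBCC CHECKDB (N'YourDatabase') WITH NO_INFOMSGS;

-- Snapshot statistics accumulated since SQL Server last started.
SELECT @@CPU_BUSY      AS cpu_busy_ticks,        -- time the CPU spent working
       @@CONNECTIONS   AS connection_attempts,   -- login attempts since start-up
       @@PACK_RECEIVED AS packets_received,      -- network packets read
       @@PACKET_ERRORS AS network_packet_errors; -- network errors encountered
```

Running checks like these on a schedule, and keeping the results, makes it much easier to tell whether a post-migration anomaly is new or long-standing.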
There’s also the Database Engine Tuning Advisor which analyses the performance effects of T-SQL statements executed against databases you want to tune. Database Engine Tuning Advisor provides recommendations to add, remove, or modify indexes and assist with partitioning.
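Alongside the graphical interface, the Tuning Advisor also ships as the `dta` command-line utility, which suits repeatable pre-migration analysis. As a hedged sketch (server, database, workload file and session name are all assumptions for illustration):

```
dta -S YourServer -E -D YourDatabase -if workload.sql -s PreMigrationTuning
```

Here `-E` requests a trusted (Windows) connection, `-if` points at a workload script to analyse, and `-s` names the tuning session so its recommendations can be reviewed later.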
Tools that are part of the Windows Server operating system are a further resource that can help to get things into shape ahead of migration. Resources such as the SQL Server error log can provide information about events in SQL Server that isn’t available elsewhere. The Windows application event log is another useful tool that can help build an overall picture of events occurring on the Windows operating system as a whole, as well as events in SQL Server and SQL Server Agent.
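The error log can also be read from a query window using the `sp_readerrorlog` procedure. It is widely used but undocumented, so treat this as a convenience rather than a guaranteed interface:

```sql
-- Read the current SQL Server error log (log 0, type 1 = SQL Server).
EXEC sp_readerrorlog 0, 1;

-- Optionally filter for a search string, e.g. entries containing 'Error'.
EXEC sp_readerrorlog 0, 1, N'Error';
```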
There are plenty of third party tools that can help improve performance, availability and reliability throughout the database lifecycle and we would be happy to discuss these with you. Please call us to discuss your needs on: +44 (0)1364 654100.
A database migration project is a good opportunity to consider optimisation in a wider context too, including hardware, licensing and other aspects of the system that interact with it. I mention hardware because it is still likely to feature in your data centre architecture somewhere, as hybrid is the most common deployment model for data centre operations today. Licensing and hardware go hand in hand in the database world, so it’s worth talking to our database licensing experts to ensure the most economic solution is arrived at for your particular set of circumstances. Please contact us on +44 (0)1364 654100, firstname.lastname@example.org or fill out the contact form on our website.
Migration may well put new features at your disposal. Indeed, this may be the driving force behind the project in the first place. Optimisation and some rejigging may be necessary in order to take advantage of new features. For instance, Stretch Database, introduced with SQL Server 2016, allows you to expand your on-premises storage options into the Cloud. It’s not suitable for all use cases but can be invaluable in other situations, even though it may require some rethinking around the database structure and overall system design. The same might be true of other features such as In-Memory OLTP if your database isn’t set up for this in its current form. A more general review of both the logical and physical database model might form part of your pre-migration prep.
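As a hedged sketch of what enabling Stretch Database involves (the table name and the assumption that the whole table is eligible are illustrative, and the database must also be linked to an Azure server and credential beforehand):

```sql
-- Enable the instance-level option first (SQL Server 2016 onwards).
EXEC sp_configure 'remote data archive', 1;
RECONFIGURE;

-- Then mark a table for migration to Azure.
-- Assumes the database has already been configured for remote data archive.
ALTER TABLE dbo.SalesHistory
    SET (REMOTE_DATA_ARCHIVE = ON (MIGRATION_STATE = OUTBOUND));
```

The point is that a feature like this touches table design and eligibility rules, which is exactly why some restructuring may be needed before it pays off.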
It’s also possible certain features may have been ‘turned-off’ or may need to be handled differently in the Cloud. This sort of thing should have been established at the evaluation or proof of concept stages but it’s not unknown for projects to throw a last minute curveball! My earlier blog Modernising Your SQL Server Database dealt with the subject of SQL modernisation and tools which can help with migrations that involve different editions of SQL Server and other databases.
As part of the preparation to relocate the database, it will also be necessary to carry out some basic housekeeping tasks to cover things like backups, integrity checks, history clean-up, index maintenance and so on. Grey Matter’s migration team can advise and take care of some of this as part of the database migration service they provide. They can be contacted here or by calling +44 (0)1364 654100, or emailing email@example.com
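A brief sketch of what those housekeeping steps might look like in T-SQL (database, table and path names are placeholders for illustration):

```sql
-- Full backup before migration.
BACKUP DATABASE YourDatabase
TO DISK = N'D:\Backups\YourDatabase_PreMigration.bak';

-- Index maintenance: rebuild all indexes on a heavily-used table.
ALTER INDEX ALL ON dbo.Orders REBUILD;

-- History clean-up: purge backup history older than 90 days.
DECLARE @cutoff datetime = DATEADD(DAY, -90, GETDATE());
EXEC msdb.dbo.sp_delete_backuphistory @oldest_date = @cutoff;
```

Whether you rebuild or merely reorganise indexes, and how aggressively you prune history, will depend on fragmentation levels and your retention policy.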
Having all or part of the data tier in the Cloud has become an important component of systems modernisation projects, many of which will involve either new ‘born in the Cloud’ applications or the updating of legacy systems. Platform as a Service (PaaS) provides many of the tools and services needed for this. The coupling of Azure SQL, Azure PaaS and the DevOps tooling in Visual Studio will enable Microsoft developers to take full advantage of Cloud technologies. We will take a closer look at Azure SQL and PaaS in a follow-up blog.