Three Database Fixes Capable of Improving App Performance Overnight

By Thomas LaRock, SolarWinds® Head Geek™

Database applications have vastly improved in recent years, but despite our best efforts, they can still feel frustratingly slow. A report that's supposed to take minutes might instead (for no obvious reason) take two hours. Corporate teams complain that every time they work with the database, it gums up the works, leaving users grumpy and managers suspicious.

This is no small matter. According to the SolarWinds® Query Report 2021, nearly one-third of data pros surveyed manage more than 300 databases in their organization’s environment. Yet for database administrators (or DBAs), time is at a premium, and they’re frequently asked to do more with less.

DBAs are short-staffed, under constant pressure, and coping with a skills gap, even as their responsibilities keep growing. Most companies now recognize their data is their most important asset. Or, as Harvard Business Review puts it, data is the “centerpiece of corporate value creation.”

DBAs know data is inert if it isn’t securely provided to the right people at the right time and in the right format. By extension, DBAs touch the very core of the company’s business processes and help turn its data into useful information. This fact needs wider acknowledgment, and DBAs need more support. It should also be noted they kept performing these tasks during the worst of the pandemic—and during this time, their responsibilities only increased.

Despite DBAs’ importance to the company, databases are often still slow. There’s usually an explanation. For example, developers were given only functional requirements during the build process, with no expectations set for the performance of the application and database. Or the team who wrote the query asked for far more data than they needed. Rather than crafting a careful, selective query, they asked for something scattershot in the expectation the end user would use a spreadsheet to filter and analyze the results. They used the SELECT * syntax to return every column in the table, retrieving too much data. Rule of thumb: the more data you ask for, the longer you’ll wait.
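Here’s what the difference looks like in practice. This is only a sketch, and the Sales.Orders table and its columns are invented for the example:

    -- Scattershot: pull every column in every row, then filter in a spreadsheet
    SELECT *
    FROM Sales.Orders;

    -- Selective: ask only for the columns and rows the report actually needs
    SELECT OrderID, CustomerID, OrderDate, TotalDue
    FROM Sales.Orders
    WHERE OrderDate >= '2021-01-01'
      AND Status = 'Shipped';

The second query returns a fraction of the data, which means less I/O, less network traffic, and less time spent waiting.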

But even when users carefully craft a well-written SQL query, the database response can still be slow. In this case, the application code may be fine, and the reason for poor performance lies within the database itself.

Look first in the three areas below if you have a slow database—and if you can practically hear the squeaky wheels turning while the horse-and-buggy plugs along.

1. Review Your Indexes

Indexes matter! It’s common to hear of an important query or report that normally takes two hours to run but could instead take two seconds if the database had proper indexes. Without them, the application will be molasses-slow, and user tempers will run high.

To improve performance, look at the database design and make sure it has appropriate indexes. For example, your tables in a SQL Server® or Azure® SQL database should all have a primary key and clustered index, and you should apply non-clustered indexes to all foreign key columns as well as columns frequently used in WHERE and JOIN clauses, as in the sketch below. For custom-built applications, it’s also common to see duplicate indexes, which can hurt performance. Be aware when you buy a third-party product, the vendor makes sure it runs on specific database platforms. But they rarely do more than make sure it can run. Performance tuning, especially for third-party products built to run on multiple database platforms, is extremely rare. Your third-party product may be good enough to run on multiple platforms, but it’s unlikely to run well.
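As a minimal T-SQL sketch, assuming a hypothetical Sales.Orders table with a Sales.Customers parent table, the advice above looks like this:

    -- The primary key doubles as the clustered index
    CREATE TABLE Sales.Orders (
        OrderID    INT IDENTITY(1,1) NOT NULL,
        CustomerID INT NOT NULL,
        OrderDate  DATE NOT NULL,
        TotalDue   DECIMAL(10,2) NOT NULL,
        CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderID),
        CONSTRAINT FK_Orders_Customers FOREIGN KEY (CustomerID)
            REFERENCES Sales.Customers (CustomerID)
    );

    -- Non-clustered index on the foreign key column used in JOINs
    CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
        ON Sales.Orders (CustomerID);

    -- Non-clustered index on a column frequently used in WHERE clauses
    CREATE NONCLUSTERED INDEX IX_Orders_OrderDate
        ON Sales.Orders (OrderDate);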

Check the indexes on all your databases—you’ll likely make positive gains.

2. Check for a Data Type Mismatch

Developers and DBAs can dramatically improve database performance by understanding data types—an attribute telling the database query optimizer how the programmer intends to use the data. Too often, novice data pros choose broad, general data types; for example, a text data type with no practical limit to how much text it can hold—e.g., VARCHAR(MAX)—not realizing shortcuts like this often degrade performance.

Choosing the right data type is important, but it’s even more important to ensure the data type is used consistently throughout the database and the applications using the database. For example, assume the database architect designed a table with a column called Employee_ID defined as a five-character text data type, i.e., CHAR(5). The column itself holds numbers, such as 09508 or 00113, but the database architect deliberately chose a character data type instead of an integer data type because the CHAR data type preserves any leading zeros, whereas integer data types drop the leading zeros of an Employee_ID. However, on multi-person development efforts, it’s not uncommon to find Employee_ID defined as CHAR(5) in some tables, as INT in other tables and programs, and as SMALLINT in yet others. Everywhere there’s a mismatch in the data type of Employee_ID, whether between tables or between tables and programs, the database engine has to perform implicit conversions, and performance degrades.
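Here’s a sketch of what consistent and inconsistent typing looks like, using invented HR.Employees and HR.Timesheets tables:

    -- Employee_ID is consistently CHAR(5), so leading zeros are preserved
    CREATE TABLE HR.Employees (
        Employee_ID CHAR(5)      NOT NULL PRIMARY KEY,  -- e.g., '09508', '00113'
        Full_Name   VARCHAR(100) NOT NULL
    );

    CREATE TABLE HR.Timesheets (
        Timesheet_ID INT IDENTITY(1,1) PRIMARY KEY,
        Employee_ID  CHAR(5) NOT NULL                   -- same type as in HR.Employees
            REFERENCES HR.Employees (Employee_ID),
        Hours_Worked DECIMAL(5,2) NOT NULL
    );

    -- Mismatch: the program passes an INT, so the engine must implicitly convert
    -- the CHAR(5) column values before comparing, which can prevent an index seek
    DECLARE @emp INT = 9508;
    SELECT Hours_Worked FROM HR.Timesheets WHERE Employee_ID = @emp;

    -- Match: the parameter type agrees with the column, so the index can be used
    DECLARE @emp_ok CHAR(5) = '09508';
    SELECT Hours_Worked FROM HR.Timesheets WHERE Employee_ID = @emp_ok;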

For developers to choose the correct data type, they must ask the right questions during the design process, such as how the data will be used, how much space is needed to properly store the data, and how the data is likely to be used in the future. From there, developers and DBAs should review their database and application design to ensure a given column definition is used consistently everywhere, whether in tables or in SQL code. Doing this improves performance and leads to cost savings.

3. Investigate the Entity Framework

The Entity Framework (EF) is an open-source object-relational mapper (ORM) designed to enable .NET developers to work with a database using .NET objects. EF is our example here, but this advice applies to all ORMs, such as TypeORM, Waterline, Hibernate, and NHibernate.

EF makes building the front-end app easier and faster, especially when compared with hand-writing SQL for the simplest types of data access. Developers become much more productive for simple CRUD requirements (create, read, update, and delete). A talented developer using EF can quickly and effectively build most database-driven web applications.

Note we say “simple” more than once in the previous paragraph. ORMs are good at generating simple CRUD screens, but when you want to build sophisticated user interfaces, they tend to produce low-quality SQL compared with hand-written SQL. Though EF has significantly improved over the years, it still has several areas capable of tripping up developers. Learn to spot these traps and avoid them. Here be dragons.
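One classic trap is the “N+1 query” pattern, where lazy loading in an ORM turns one logical request into many round trips. Sketched below as the SQL the database ultimately receives (table names carried over from the hypothetical examples above):

    -- What a naive ORM loop often sends: one query for the parent rows...
    SELECT CustomerID, CustomerName FROM Sales.Customers;

    -- ...then one additional round trip per customer for the child rows
    SELECT OrderID, OrderDate, TotalDue FROM Sales.Orders WHERE CustomerID = 1;
    SELECT OrderID, OrderDate, TotalDue FROM Sales.Orders WHERE CustomerID = 2;
    -- ...and so on, once for every customer

    -- Hand-written alternative: one set-based query with a JOIN
    SELECT c.CustomerID, c.CustomerName, o.OrderID, o.OrderDate, o.TotalDue
    FROM Sales.Customers AS c
    INNER JOIN Sales.Orders AS o
        ON o.CustomerID = c.CustomerID;

Eager loading the related data in EF, or simply hand-writing the SQL for the hot paths, avoids this kind of chatter.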

If you have application performance problems, check to see whether they’re caused by EF or another ORM used during the development process.

Bonus: Get a Database Performance Monitoring Tool

A good database performance monitoring (DPM) tool doesn’t just tell you when there are error conditions or problems. Putting a DPM tool in place can improve database and app performance by providing easier and faster access to vital performance data. It also makes applications better-behaved by closely and carefully measuring performance and resource consumption during the development process and in production. By using a DPM tool during development, data pros can learn what a normal performance profile is and determine when performance becomes abnormal. This can lead to better-optimized, more efficient database design, since fixing performance problems is an order of magnitude easier during development than after the application has been deployed into production.

DPM tools speed problem resolution and troubleshooting in production environments and can strongly guide the development of database queries and applications toward better-performing alternatives. DBAs can use DPM tools to compare specific queries and database applications before and after changes with a complete view of all empirical performance metrics. DPM tools also monitor all the organization’s databases from a single pane of glass, whether they’re on-prem, in the cloud, or in a multi-cloud environment.

Because a good DPM tool uses its own database to store performance metrics produced automatically by the operating system, database server, and the application, it can also provide long-term trending information. It does this all while adding relatively little overhead to ongoing database activity. Additionally, make sure the DPM tool you choose can monitor cloud-native, on-premises, and hybrid production and test databases at scale without too much performance overhead.

Database professionals know many of these problems lessen or even disappear when their organization has enough staff members (and a narrower skills gap). While we wait for the boardroom to understand this fact, look to the fixes providing the most improvement for the least effort. And know databases don’t have to be slow—we’ve moved on to a faster era than horse-and-buggy. It’s time to speed things up.
