SELECT * FROM is a performance sin in SQL. It is definitely a bad choice if you are seriously developing a full-fledged application for a high-traffic interface. SELECT * is a hitman for the SQL optimizer: when a query asks for every column, the optimizer has little chance to satisfy it from the underlying indexes alone, because no covering index holds every column.
Area of Tuning:
SELECT only the column names needed for the data retrieval. This helps avoid disk, network, and memory contention on your server.
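A minimal before-and-after sketch is below, assuming a hypothetical dbo.Orders table with a nonclustered index on CustomerID that includes OrderDate and TotalDue; the table, index, and column names are illustrative only.

-- SELECT * has to return every column, so no covering index can satisfy
-- the query on its own and more data crosses the disk and the network:
SELECT *
FROM dbo.Orders
WHERE CustomerID = 42;

-- Naming only the needed columns lets the optimizer answer the query
-- from the covering index and ship far less data:
SELECT OrderID, OrderDate, TotalDue
FROM dbo.Orders
WHERE CustomerID = 42;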
2. Caching In Application
If your application is hit by multiple users across the globe and every request retrieves its data directly from the server, that is a bad idea. Each time a user pulls data from the database there is disk I/O and network contention, and over time this can slow the whole server down.
Area of Tuning:
Cache data in application memory to avoid constant queries to the database. This caching method is best applied to small, infrequently updated data sets such as lookup values.
3. Keep Transactions Short
This happens in many industries where queries are ad hoc. These ad hoc queries are typically written to fix application problems, such as one-off data retrieval and reporting. Some of these reporting queries can run so long that they eat up the entire server's CPU and ruin the performance of the whole server.
Area of Tuning:
Shorten query execution time and keep transactions simple. Long-running queries hold locks on tables for longer, and the more locking there is, the longer the next user waits to access the table. Along with the lock waits, CPU usage can spike when transactions run for a long time. Use a precise locking strategy to overcome blocking in SQL, for example by breaking large modifications into short batches as sketched below.
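As one minimal sketch of keeping each transaction short, the batched delete below assumes a hypothetical dbo.AuditLog table with a CreatedDate column; the table name, column name, and batch size are illustrative only.

DECLARE @rows INT = 1;

WHILE @rows > 0
BEGIN
    -- Each batch is its own short transaction, so locks are released
    -- quickly and other sessions are not blocked for minutes at a time.
    BEGIN TRANSACTION;

    DELETE TOP (5000)
    FROM dbo.AuditLog
    WHERE CreatedDate < DATEADD(YEAR, -1, GETDATE());

    SET @rows = @@ROWCOUNT;

    COMMIT TRANSACTION;
END;

Note that the loop works batch by batch rather than row by row, so it does not fall into the row-at-a-time trap discussed in point 7.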
4. Avoid NOLOCK Hint
It is very common for DBAs working in a SQL environment to use the NOLOCK hint on big tables on a production server to avoid blocking. The NOLOCK hint breaks the fundamental rules of data integrity: because it reads uncommitted data, it can produce missing or extra rows in result sets, which is a serious misuse of SQL.
Area of Tuning:
Microsoft has introduced an alternative to the NOLOCK hint: snapshot-based isolation. Turning on the READ_COMMITTED_SNAPSHOT option for the database (or using the SNAPSHOT isolation level for a session) lets readers see the last committed version of each row instead of blocking behind writers. This removes most of the overhead that pushes people toward NOLOCK, and it also avoids the missing or extra rows in data sets.
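A minimal sketch of both snapshot-based options follows, assuming a placeholder database named YourDatabase and a hypothetical dbo.Orders table.

-- Database-level option: readers under the default READ COMMITTED level see
-- the last committed version of each row instead of blocking or using NOLOCK.
-- (Switching this option needs the database free of other active connections.)
ALTER DATABASE YourDatabase SET READ_COMMITTED_SNAPSHOT ON;

-- Session-level alternative: allow snapshot isolation once, then opt in per session.
ALTER DATABASE YourDatabase SET ALLOW_SNAPSHOT_ISOLATION ON;

SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustomerID = 42;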
5. Use Variable And Parameter Data Types Same As Columns
It can be surprising how much a data type mismatch slows down a query. While testing a data import through a stored procedure on one of my optimization projects, I found a mismatch between the data types of the declared variables and the target columns. When data moves from a variable into a column of a different data type, implicit conversions have to happen, and these can hurt the SQL optimizer to a great extent.
Area of Tuning:
Always keep variable and parameter data types the same as the corresponding column data types. This helps avoid table scans and eliminates implicit data conversions.
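A minimal sketch of the mismatch is below, assuming a hypothetical dbo.Customers table whose AccountNumber column is VARCHAR(20) and is indexed; the names and types are illustrative only.

-- Mismatched: the NVARCHAR variable forces an implicit conversion of the
-- VARCHAR column, which can turn an index seek into a scan.
DECLARE @acct NVARCHAR(20) = N'AC-1001';
SELECT CustomerID
FROM dbo.Customers
WHERE AccountNumber = @acct;

-- Matched: declared exactly like the column, so no conversion is needed
-- and the optimizer can seek the index directly.
DECLARE @acct_matched VARCHAR(20) = 'AC-1001';
SELECT CustomerID
FROM dbo.Customers
WHERE AccountNumber = @acct_matched;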
6. Avoid Usage Of DISTINCT, ORDER BY, Or UNION Unnecessarily
It is very common for us to eliminate duplicates with the DISTINCT keyword, but DISTINCT can be a killer for SQL. Likewise, ORDER BY forces the result set to be sorted, and sorting is time-consuming work for the optimizer. UNION is a simple way to combine results, but because it must also remove duplicates from the combined set, heavy use of UNION can bog the server down over time.
Area of Tuning:
DISTINCT: Instead of the DISTINCT keyword, try a GROUP BY clause at the end of your query to eliminate duplicates (see the sketch after this list). The reason for the changeover is simple: SQL does a bit more work internally when DISTINCT is used, and the difference is easiest to see on big queries.
ORDER BY: It is your call whether the result set really needs sorting; where a GROUP BY or a BETWEEN range filter can return the rows you need, prefer it, and keep the query logic lean so the optimizer has as little sorting work to do as possible.
UNION: Please refer to the Advantages of UNION ALL Over UNION for the replacement; a short sketch of both rewrites follows below.
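The sketches below show both replacements, assuming hypothetical dbo.Orders, dbo.Orders_2023, and dbo.Orders_2024 tables; all names are illustrative only.

-- Duplicate elimination with GROUP BY instead of DISTINCT:
SELECT CustomerID
FROM dbo.Orders
GROUP BY CustomerID;
-- (returns the same rows as: SELECT DISTINCT CustomerID FROM dbo.Orders)

-- UNION ALL instead of UNION when the two sets cannot overlap;
-- UNION adds an extra duplicate-removal (sort or hash) step:
SELECT OrderID FROM dbo.Orders_2023
UNION ALL
SELECT OrderID FROM dbo.Orders_2024;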
7. Minimize CURSOR And WHILE LOOP Usage
Using CURSORs and WHILE LOOPs may be a matter of personal choice for database developers or DBAs, but in my own testing their performance degrades over time. A CURSOR or WHILE LOOP works on a row-by-row basis, which is a time-consuming way to process data. If you are developing an application that needs fast results and you build it around CURSORs and WHILE LOOPs, that is a bad idea.
Note: CURSORs and WHILE LOOPs can still perform well on small data sets.
Area of Tuning:
Try the sp_msforeachdb system stored procedure for work that has to run against every database. sp_msforeachdb is commonly avoided by some junior folks because writing its internal command strings feels tedious, but over the long run of your application the change will be much appreciated.
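As a closing sketch, the first statement below does with one set-based UPDATE what a cursor would do row by row, using a hypothetical dbo.Products table, and the second shows the sp_msforeachdb call described above; the table, columns, and filter values are illustrative only.

-- One set-based UPDATE replaces a cursor that touches each row in turn:
UPDATE dbo.Products
SET ListPrice = ListPrice * 1.05
WHERE CategoryID = 3;

-- sp_msforeachdb runs the same command against every database on the
-- instance; the ? placeholder is replaced with each database name:
EXEC sp_msforeachdb
    N'USE [?]; SELECT DB_NAME() AS DatabaseName, COUNT(*) AS TableCount FROM sys.tables;';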