Friday, September 30, 2011

Linux Running Too Slow? Here’s How to Find the Cause

I think it’s safe to say that most people familiar with both Linux and Windows would tell you that your average Linux install will outperform an equivalent Windows install on the same machine. That may not always be the case, and some people might have contradictory stories to share, but it’s certainly been this author’s experience. In fact, according to Top500.org, about 90% of the world’s top supercomputers run Linux. While the kernel and OS lend themselves well to high-performance computing, there are often hardware and software issues on the desktop that can cause major lag. Today, we’ll cover some diagnostic tips and tools to help you figure out where your problem might be.

CPU Load

We’ll start with the most obvious cause of PC slowness – processor overload. As you’re reading this website, it’s got various bits of JavaScript running. Each time you load the page, the JavaScript is read as text and interpreted by your browser; the browser in turn calls into system libraries, which go through the kernel and hardware drivers before anything actually runs on your CPU. We get the convenience of a single script that runs on nearly every computer, but all that interpretation and data passing can really drag down your system performance.
The most basic way to check your CPU load is with the command-line utility top. It contains a lot of information, but it really shines when trying to make comparisons between the CPU and RAM usage of various applications.
[Screenshot: top output sorted by CPU usage]

In that screenshot, you can see that top sorts the entries with the highest usage on top, so that you can see right away what’s using the most CPU or RAM, and the result is shown in percentages.
It’s worth noting that on a machine with multiple cores, it’s quite possible for the percentages top shows you to total more than 100% (i.e., if one core is at 70% of max and another at 60%, top might show 130% usage).
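If you just want a quick sanity check before reaching for top, the kernel exposes the same load information directly in /proc. Here’s a minimal sketch (Linux-specific; nproc comes from GNU coreutils):

```shell
# The first three numbers in /proc/loadavg are the 1-, 5-, and
# 15-minute load averages: roughly, the average number of processes
# running or waiting to run.
read one five fifteen rest < /proc/loadavg
echo "load averages: $one (1m) $five (5m) $fifteen (15m)"

# A load average consistently above the number of CPU cores suggests
# the processor is your bottleneck.
echo "CPU cores: $(nproc)"
```

As a rule of thumb, compare the 1-minute figure to the core count: a load of 4.0 on a four-core machine means the CPU is fully occupied.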

RAM Usage

Next to CPU, your RAM (or lack thereof) is the most likely culprit behind performance problems. Most MakeTechEasier readers are probably familiar with how RAM works, but here’s a quick primer for those who aren’t.
Let’s say you’re at a library, and the new Larry Porter and the Prince of Bologna book is out. Normally, fantasy books are kept in the basement, but these books are hugely popular, so the library staff keeps a stack of them right at the front desk. This means that the library patrons can grab their book quickly and easily without going to the basement – a win-win for everyone. That all sounds great, but you can’t do that with EVERY book in the library. Since the staff can’t keep a convenient shelf at the front desk for every book they have, most books stay in their regular sections, like the basement.
That’s similar to how hard drives and RAM work. The hard drive, in this analogy, would be the basement shelves. It’s well suited to long-term, organized storage. The RAM is the smaller area by the front desk, a space specifically suited to holding the most-needed items so that they can be retrieved quickly.
If you’ve got too much in your RAM (too many programs and services running), the computer starts shuffling data out to the much slower hard drive (swap space), and its ability to retrieve the needed information quickly is drastically reduced. Suddenly it’s got to sort through a giant stack of stuff instead of just grabbing what it needs.
While it’s true that the free command will give you basic memory info, this is another case where top can come in very handy. Instead of simply showing “X MB free”, top will give you detailed numbers, percentages, and swap usage information.
Take note of the swap usage information. On an average desktop, the percentage of used swap space should generally be very low. If it’s not, you may have to just go out and buy more RAM (or seriously reduce the number of running programs).
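For a quick look without top, the free command summarizes the same figures, and the raw numbers come straight from the kernel. A minimal sketch (Linux-specific; /proc/meminfo values are in kB):

```shell
# Human-readable summary of RAM and swap, in megabytes
# (free comes from the procps package):
free -m

# The same figures, straight from the kernel (values in kB).
# SwapTotal minus SwapFree is how much swap is actually in use.
grep -E '^(MemTotal|MemFree|SwapTotal|SwapFree):' /proc/meminfo
```

If SwapTotal and SwapFree are nearly equal, your swap is sitting idle – which is exactly what you want on a healthy desktop.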

Overworked Hard Drive

Is your hard drive light constantly chugging along, yet you have no idea what it’s doing? Mysterious input/output can sure be a problem, so there is a top-like tool called iotop specifically meant to help diagnose this kind of problem.
It is not, however, built into many distributions, so you’ll likely have to install it separately. It should be available in your distro’s repositories, but if not, you can download it from the project’s website.
A normal, idle system should be mostly zeros across the board, sometimes with a few small bursts while data is being written, as in the screenshot below.
If, however, you run a disk-intensive utility like find, you’ll see its name and throughput listed clearly in iotop.
[Screenshot: iotop showing a running find process and its disk throughput]
Now, you can easily find out what program is using your I/O (in this case I ran “find / -name chicken”), who ran it, how fast the data is being read, and more.
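Note that iotop itself needs root, since it reads every process’s I/O counters. Those counters live in /proc/<pid>/io, which you can inspect for your own processes without any privileges. A minimal sketch (assumes a kernel built with per-task I/O accounting, which nearly all distro kernels are):

```shell
# Run iotop showing only processes currently doing I/O (needs root):
#   sudo iotop -o

# The raw data iotop aggregates: cumulative bytes this shell has
# actually pushed to and pulled from the storage layer.
grep -E '^(read_bytes|write_bytes):' /proc/self/io
```

The -o (--only) flag is handy on a busy system, since it hides the long tail of idle processes and leaves just the I/O offenders.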

GUI Tools

The author of this post chose command-line tools to gather this information for two main reasons. First, CLI tools generally require fewer resources than GUI tools, and second, tools like top can be found on just about any Linux system, whereas GUI tools can be hit-or-miss.
Many people do not like the command line, and there are several GUI tools that perform system monitoring, but this author recommends Gnome System Monitor. It’s already available on just about any Gnome-based distribution, and includes a lot of useful information, including real-time graphs for CPU, memory, and network.
[Screenshot: Gnome System Monitor resource graphs]

Conclusion

While there are many things that can potentially cause system slowness, these three (CPU, RAM, and disk I/O) are behind the vast majority of performance problems. Using the tools described here won’t solve your performance problem by itself, but it will make the cause of the problem a whole lot easier to find.
