The major factors that make code slow (in order of how easy they are to remedy):

  • Redundancy
  • Overhead
  • Using the "wrong" algorithm

Redundancy is doing the same thing over and over. Why would you recompute a constant value like pi every time you use it? The remedy for this kind of redundancy is to precompute constants at the start of your program and reuse them. There are other forms of redundancy as well, such as recomputing a value inside a loop when it never changes between iterations.
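
To make that concrete, here is a minimal C sketch of precomputing a redundant value. The degree-to-radian conversion is just an illustrative stand-in; the point is that pi/180 never changes, so there is no reason to recompute it on every pass:

    #include <math.h>
    #include <stdio.h>

    #define N 5

    int main(void)
    {
        double in[N] = { 0.0, 30.0, 45.0, 90.0, 180.0 };
        double out[N];
        int i;

        /* Redundant: pi/180 is recomputed on every pass through the loop. */
        for (i = 0; i < N; i++)
            out[i] = in[i] * (4.0 * atan(1.0) / 180.0);

        /* Better: precompute the loop-invariant factor once and reuse it. */
        const double deg_to_rad = 4.0 * atan(1.0) / 180.0;
        for (i = 0; i < N; i++)
            out[i] = in[i] * deg_to_rad;

        for (i = 0; i < N; i++)
            printf("%g degrees = %g radians\n", in[i], out[i]);
        return 0;
    }

An optimizing compiler will often hoist a constant expression like this on its own, but it can't when the computation involves a call it can't see into or data it can't prove unchanged, so making the precomputation explicit removes the guesswork.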

Overhead is the part of a system that spends a (usually) constant amount of time doing things you don't care about. These operations might include allocating, copying, and freeing RAM; pushing and popping stacks; or loading and saving files on disk. Reducing overhead requires both skill and compromise. First you must identify the sources of overhead; then you have to decide which matters most: how fast the code runs, how much RAM it consumes, or how easy it will be to maintain. Sometimes the overhead depends on the hardware. This part of optimization is grist for this forum.
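
As one concrete illustration, here is a C sketch of trimming allocation overhead. The names process_slow and process_fast are hypothetical, and memset stands in for whatever real work the loop would do:

    #include <stdlib.h>
    #include <string.h>

    #define ITERATIONS 100000
    #define BUF_SIZE   4096

    /* Heavy overhead: one allocate/free pair on every iteration. */
    void process_slow(void)
    {
        for (int i = 0; i < ITERATIONS; i++) {
            char *buf = malloc(BUF_SIZE);
            if (buf == NULL)
                return;
            memset(buf, 0, BUF_SIZE);   /* stand-in for the real work */
            free(buf);
        }
    }

    /* Less overhead: allocate once, reuse the buffer, free once. */
    void process_fast(void)
    {
        char *buf = malloc(BUF_SIZE);
        if (buf == NULL)
            return;
        for (int i = 0; i < ITERATIONS; i++)
            memset(buf, 0, BUF_SIZE);   /* the same stand-in work */
        free(buf);
    }

    int main(void)
    {
        process_slow();
        process_fast();
        return 0;
    }

Note the compromise: process_fast holds onto the buffer for the whole loop, trading a little RAM for a lot of saved allocator time. That is exactly the kind of speed-versus-memory decision described above.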

Choosing the right tool for the job is one of the toughest choices to make when writing code, and in this case the "tools" are algorithms. Your choice might depend on dozens of factors: how the program is used, what the data looks like, how the data is stored, how "sensitive" the data is, and so on. The decision may come down to testing and statistical analysis. It usually helps to own some computer science textbooks, and there are good resources on the Web for doing research. Discussions of this type can become long and involved, but they should help many others.
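
As a small example of how the data shapes the choice, here is a C sketch comparing a linear scan against binary search. Here linear_search is a hypothetical helper written for illustration, while bsearch is the real C standard library routine; binary search only wins if the data is already sorted (or is searched often enough to justify sorting it first):

    #include <stdio.h>
    #include <stdlib.h>

    /* O(n): fine for small or unsorted arrays. */
    int linear_search(const int *a, int n, int key)
    {
        for (int i = 0; i < n; i++)
            if (a[i] == key)
                return i;
        return -1;
    }

    /* Comparison callback required by bsearch. */
    static int cmp_int(const void *a, const void *b)
    {
        int x = *(const int *)a, y = *(const int *)b;
        return (x > y) - (x < y);
    }

    int main(void)
    {
        int a[] = { 2, 3, 5, 7, 11, 13, 17, 19 };  /* already sorted */
        int n = (int)(sizeof a / sizeof a[0]);
        int key = 13;

        /* O(log n): bsearch demands sorted input in exchange for speed. */
        int *hit = bsearch(&key, a, (size_t)n, sizeof a[0], cmp_int);

        printf("linear search: index %d\n", linear_search(a, n, key));
        printf("binary search: %s\n", hit ? "found" : "not found");
        return 0;
    }

For eight elements the linear scan is perfectly fine, and may even be faster; the algorithmic choice starts to matter as the data grows, which is why knowing how the program will actually be used comes first.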