The computational cost of classification is as important as accuracy in online classification systems. This cost is usually dominated by computing implicit features of the raw input data. Few efforts have been made to design classifiers that perform effectively under limited computational power; instead, feature selection is typically employed as a pre-processing step to reduce the cost of running traditional classifiers. We present CoCoST, a novel and effective approach for building classifiers that achieve state-of-the-art classification accuracy while keeping the expected computational cost of classification low, even without feature selection. CoCoST employs a wide range of novel cost-aware decision trees, each tuned to specialize in classifying instances from a subset of the input space, and consults them judiciously, depending on the input instance, via a cost-aware meta-classifier. Experimental results on a network flow detection application show that our approach can achieve better accuracy than classifiers such as SVMs and random forests, while reducing computational cost by 75%-90%.
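The routing idea described above can be illustrated with a minimal sketch. This is not the authors' implementation: the `Specialist` class, the region function, and the feature costs are all hypothetical stand-ins showing how a cost-aware meta-classifier might send each instance to a specialist that pays only for the features it inspects.

```python
# Hypothetical sketch of cost-aware routing (not CoCoST itself):
# each specialist classifies using a subset of features and incurs
# only the cost of the features it actually reads; a meta-classifier
# routes each instance to the specialist tuned for its region.

class Specialist:
    """A toy specialist classifier over a subset of features."""
    def __init__(self, feature_ids, cost_per_feature, rule):
        self.feature_ids = feature_ids
        # Cost paid whenever this specialist is consulted.
        self.cost = sum(cost_per_feature[f] for f in feature_ids)
        self.rule = rule  # maps selected feature values to a label

    def classify(self, x):
        values = [x[f] for f in self.feature_ids]
        return self.rule(values), self.cost

def meta_route(x, specialists, region_of):
    """Cost-aware meta-classifier: pick the specialist assigned to
    the region of the input space this instance falls into."""
    return specialists[region_of(x)].classify(x)

# Toy setup: feature 0 is cheap to compute, feature 1 is expensive.
costs = {0: 1.0, 1: 10.0}
cheap = Specialist([0], costs, lambda v: int(v[0] > 0.5))
rich = Specialist([0, 1], costs, lambda v: int(v[0] + v[1] > 1.0))
# Route "easy" instances (extreme values of feature 0) to the cheap
# specialist; ambiguous ones go to the expensive specialist.
region = lambda x: 0 if x[0] < 0.2 or x[0] > 0.8 else 1

label, paid = meta_route({0: 0.9, 1: 0.3}, [cheap, rich], region)
```

Easy instances are resolved for a cost of 1.0 here, while only the ambiguous ones incur the expensive feature, which is the intuition behind keeping the *expected* cost of classification low.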