In multi-stage processes, decisions occur in an ordered sequence of stages. Early stages usually have more observations with general information (easier and cheaper to collect), while later stages have fewer observations but more specific data. This situation can be represented as a dual funnel structure, in which the sample size decreases from one stage to the next while the information available about each instance increases. Training classifiers in this scenario is challenging: the information in early stages may lack distinct patterns to learn from (underfitting), while the small sample size in later stages can cause overfitting. We address both cases by introducing a framework that combines adversarial autoencoders (AAE), multitask learning (MTL), and multi-label semi-supervised learning (MLSSL). We improve the decoder of the AAE with MTL so that it jointly reconstructs the original input and uses feature nets to predict the features of the next stages. We also introduce a sequence constraint on the output of the MLSSL classifier to enforce the sequential pattern in the predictions. Using datasets from different domains (selection processes, medical diagnosis), we show that our approach outperforms other state-of-the-art methods.
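As a rough illustration of the sequence constraint idea, the sketch below assumes it means that an instance can only be predicted positive at stage t if it is also positive at all earlier stages (e.g., a candidate cannot pass a later interview round without passing the earlier ones). The function name and the running-minimum post-processing are hypothetical, not the paper's exact formulation:

```python
def apply_sequence_constraint(stage_probs):
    """Clip per-stage probabilities so they are monotone non-increasing
    across stages: each stage's score is capped by the running minimum
    of all earlier stages, so positive predictions form a prefix of
    the stage sequence."""
    constrained = []
    running_min = 1.0
    for p in stage_probs:
        running_min = min(running_min, p)
        constrained.append(running_min)
    return constrained

# Raw per-stage scores from a multi-label classifier; stage 3's score
# (0.7) exceeds stage 2's (0.4), violating the sequential pattern.
raw = [0.9, 0.4, 0.7, 0.2]
print(apply_sequence_constraint(raw))  # [0.9, 0.4, 0.4, 0.2]
```

The running minimum is only one simple way to impose such a monotonicity constraint; the framework described above could equally implement it as a penalty term during training rather than a post-processing step.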