The US military will use digital video compression for many different operations and applications. System requirements demand transmitting video streams within limited bandwidth, often over harsh transmission environments. To deliver video of sufficient quality, modern video encoders and decoders (codecs) must be employed with parameters optimized for high compression ratios and tolerance to network loss. Optimizing video codec parameters is complex, and many parameters can be varied when encoding video.

This paper presents a methodology to quantifiably determine and optimize video quality under different codec parameters for different bit rates and operations. Video quality was calculated numerically with two methods of machine-based (objective) scoring, and the objective scores were spot-checked by human (subjective) scoring. This rigorous procedure yields optimized values for several codec parameters across different bit rates and military applications.

End-to-end video codec performance was analyzed and optimized for two main application areas: tele-operations (teleops) and reconnaissance. Network conditions, including throughput and packet loss, were considered. H.264 (also known as MPEG-4 Part 10) compression was used in this study because it is the highest-performing, widely accepted emerging standard for video coding. The adaptation parameters evaluated include video compression profile, resolution, frame rate, and group of pictures (GOP) structure. This study assumes that the bit rate is determined by the network, and that the video system must then adapt to that target bit rate. For each bit rate, objective quality scores were calculated by two different systems over a range of video resolutions, frame rates, and codec parameters. The combinations of resolution and frame rate identified as optimal are presented in this paper.
Different GOP structures and several packet-loss conditions were also varied, and their scores are presented.
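The parameter sweep described above can be sketched in code. The following is a minimal, hypothetical illustration (not the authors' actual test harness) that builds ffmpeg/libx264 command lines covering the adaptation parameters named in the abstract: profile, resolution, frame rate, and GOP length, at one network-determined target bit rate. The file names and the specific parameter values are assumptions for illustration only.

```python
from itertools import product

def ffmpeg_cmd(src, out, bitrate_kbps, profile, resolution, fps, gop):
    """Build an ffmpeg/libx264 command that encodes `src` at a fixed
    target bit rate with the given adaptation parameters."""
    return [
        "ffmpeg", "-y", "-i", src,
        "-c:v", "libx264",
        "-profile:v", profile,       # H.264 compression profile
        "-b:v", f"{bitrate_kbps}k",  # network-determined target bit rate
        "-s", resolution,            # spatial resolution, e.g. "640x480"
        "-r", str(fps),              # frame rate
        "-g", str(gop),              # GOP length (I-frame interval)
        out,
    ]

# Hypothetical sweep over the adaptation parameters for one bit rate;
# each resulting encoding would then be scored objectively.
profiles = ["baseline", "main"]
resolutions = ["320x240", "640x480"]
frame_rates = [15, 30]
gop_lengths = [15, 30]

sweep = [
    ffmpeg_cmd("source.yuv", f"out_{p}_{res}_{fps}fps_gop{g}.mp4",
               bitrate_kbps=512, profile=p, resolution=res, fps=fps, gop=g)
    for p, res, fps, g in product(profiles, resolutions,
                                  frame_rates, gop_lengths)
]
print(len(sweep))  # 16 parameter combinations to encode and score
```

Each command in `sweep` would be executed and its output scored by the objective quality systems; the best-scoring combination per bit rate is what the study reports.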