We present contrast information, a novel application of specific cases of relative entropy measures, designed for cognitive modelling of the sequential perception of continuous signals. We explain the relevance of entropy to cognitive modelling of music and language. Then, as a first step towards demonstrating the utility of contrast information for that purpose, we show empirically that its discrete case correlates well with existing successful cognitive models in the literature. We describe some interesting properties of contrast information. Finally, we propose future work towards a cognitive architecture that uses it.