Correlativity of perception is defined as the capacity to discover similar configurations of stimuli and to form high-level configurations from them. It is equivalent to describing information in terms of generative elements and their transformations. Such a representation saves memory and reveals causality in data generation. This approach is implemented in a model of artificial perception in which data are self-organized so as to segregate patterns before recognizing them. Input information is described in terms of generative patterns and their transformations, and the least complex data representation that leads to a causally related semantic description is chosen, with (Kolmogorov) complexity defined by the amount of memory storage required. The model is applied to voice separation and to rhythm/tempo tracking. Chord spectra are described by generative subspectra, which correspond to tone spectra, and by their translations, which coincide with the intervals of the chord. Time events are likewise described by generative rhythmic patterns. The interdependence of tempo and rhythm is overcome by optimally sharing complexity between the representations of the rhythmic patterns and the tempo curve. The model explains the function of interval hearing, certain statements of music theory, and some effects of rhythm perception. Applications to image processing and to the modeling of abstract thinking are also discussed.
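As a toy illustration of the complexity criterion described above (a minimal sketch under simplifying assumptions, not the paper's implementation), the following code represents a chord spectrum as a set of partials on a log-frequency axis measured in semitones, so that a translation corresponds to a musical interval. It searches by brute force for a generative pattern plus a set of translations whose union reproduces the spectrum, and prefers the description requiring the least memory, counted as one cell per stored partial plus one per translation. All function names are illustrative.

```python
from itertools import combinations

def translate(pattern, shift):
    """Shift every partial of a pattern along the log-frequency (semitone) axis."""
    return frozenset(p + shift for p in pattern)

def complexity(pattern, shifts):
    """Memory required: one cell per stored partial plus one per translation."""
    return len(pattern) + len(shifts)

def generate(pattern, shifts):
    """Reconstruct a spectrum from a generative pattern and its translations."""
    return frozenset().union(*(translate(pattern, s) for s in shifts))

def best_representation(spectrum):
    """Brute-force search (small inputs only) for the least complex description
    of `spectrum` as a generative pattern plus a set of translations."""
    spectrum = frozenset(spectrum)
    partials = sorted(spectrum)
    # Trivial description: store the raw spectrum with a single zero translation.
    best, best_cost = (spectrum, frozenset({0})), complexity(spectrum, {0})
    for r in range(1, len(partials) + 1):
        for subset in combinations(partials, r):
            # Anchor candidate patterns at the lowest partial of the spectrum.
            if subset[0] != partials[0]:
                continue
            pattern = frozenset(subset)
            # A translation is admissible if the shifted pattern stays inside
            # the spectrum; its shift must land the anchor on some partial.
            valid = [p - partials[0] for p in partials
                     if translate(pattern, p - partials[0]) <= spectrum]
            for k in range(1, len(valid) + 1):
                for shifts in combinations(valid, k):
                    if (generate(pattern, shifts) == spectrum
                            and complexity(pattern, shifts) < best_cost):
                        best = (pattern, frozenset(shifts))
                        best_cost = complexity(pattern, shifts)
    return best, best_cost

# Example: a C major chord of three tones, each tone carrying partials at
# 0, 12, and 19 semitones above its fundamental (fundamental, octave, twelfth).
chord = {0, 4, 7, 12, 16, 19, 23, 26}        # union of the three tone spectra
(pattern, shifts), cost = best_representation(chord)
print(sorted(pattern), sorted(shifts), cost)
```

Storing the eight partials directly costs 9 memory units (8 partials plus one zero shift), whereas a generative description with three stored partials and three translations costs 6; note that both "tone spectrum repeated at the chord's intervals" and "chord pattern repeated at harmonic positions" attain this optimum, which is the kind of ambiguity the model relates to interval hearing.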