Abstract
Previous research has established that people can implicitly learn chunks, which do not require a memory buffer to process. The present study explores the implicit learning of nonlocal dependencies generated by higher-than-finite-state grammars, specifically Chinese tonal retrogrades and inversions, which do require buffers. People were asked to listen to and memorize artificial poetry instantiating one of the two grammars; after this training phase, they were informed of the existence of rules and asked to classify new poems, while providing attributions of the basis of their judgments. People acquired unconscious structural knowledge of both tonal retrogrades and inversions. Moreover, inversions were implicitly learnt more easily than retrogrades, a result that constrains the nature of the memory buffer in computational models of implicit learning.
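The following minimal Python sketch illustrates the structural contrast behind the buffer claim. The tone coding ('P' for level, 'Z' for oblique) and the exact mappings are illustrative assumptions, not the study's stimulus specification: a retrograde yields nested (mirror-image) dependencies, processable with a last-in-first-out stack, whereas an inversion yields crossed dependencies, requiring a first-in-first-out queue.

```python
# Illustrative sketch only; tone labels 'P'/'Z' and the mappings below
# are assumptions, not the study's actual stimulus construction.

def retrograde(line):
    """Second half mirrors the first in reverse order: the last tone
    heard is the first one matched, so a last-in-first-out stack
    suffices as the memory buffer."""
    return line[::-1]

def invert(line):
    """Second half replaces each tone with its opposite category in
    the same order: the first tone heard is the first one matched,
    so a first-in-first-out queue is needed as the memory buffer."""
    flip = {'P': 'Z', 'Z': 'P'}
    return [flip[t] for t in line]

first_half = ['P', 'Z', 'Z', 'P', 'P']
print(retrograde(first_half))  # ['P', 'P', 'Z', 'Z', 'P']
print(invert(first_half))      # ['Z', 'P', 'P', 'Z', 'Z']
```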