Semantics is the study of meaning in language — how words, phrases, and sentences encode meaning, and how listeners and readers extract meaning from linguistic input. It encompasses word meaning (lexical semantics), sentence meaning (compositional semantics), and the relationships between linguistic meaning and the real world (referential semantics). Understanding meaning is arguably the ultimate goal of language processing.
Lexical Semantics
Word meanings are organized in the mental lexicon through multiple types of relationships: synonymy (couch/sofa), antonymy (hot/cold), hyponymy (robin is a type of bird), meronymy (wheel is part of car), and associative relations (doctor/nurse). These relationships form a complex semantic network. Collins and Loftus's spreading activation model proposes that accessing one word's meaning automatically activates related meanings, producing semantic priming effects measured in reaction time experiments.
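Spreading activation lends itself to a toy graph-search sketch. The network below is invented for illustration (the words, links, and weights in LINKS are assumptions, not measured association norms): activation starts at a primed word, attenuates at each link, and leaves related words partially active.

```python
# Toy semantic network: nodes are words, weighted edges are
# associative/semantic links. All words and weights here are
# invented for illustration, not measured association norms.
LINKS = {
    ("doctor", "nurse"): 0.8,    # associative relation
    ("doctor", "hospital"): 0.7,
    ("nurse", "hospital"): 0.6,
    ("robin", "bird"): 0.9,      # hyponymy: a robin is a kind of bird
    ("car", "wheel"): 0.7,       # meronymy: a wheel is part of a car
    ("couch", "sofa"): 0.95,     # synonymy
}

def neighbors(word):
    """Yield (neighbor, weight) pairs for every link touching `word`."""
    for (a, b), w in LINKS.items():
        if a == word:
            yield b, w
        elif b == word:
            yield a, w

def spread_activation(source, decay=0.5, threshold=0.1):
    """Propagate activation outward from `source`, attenuating by
    `decay` at every link and stopping below `threshold`: the
    Collins and Loftus idea that accessing one word pre-activates
    related words, the basis of semantic priming."""
    activation = {source: 1.0}
    frontier = [source]
    while frontier:
        node = frontier.pop()
        for nbr, weight in neighbors(node):
            new_act = activation[node] * weight * decay
            if new_act >= threshold and new_act > activation.get(nbr, 0.0):
                activation[nbr] = new_act
                frontier.append(nbr)
    return activation

# Priming "doctor" leaves "nurse" and "hospital" partially active,
# so a later lexical decision on those words should be faster.
print(spread_activation("doctor"))
# {'doctor': 1.0, 'nurse': 0.4, 'hospital': 0.35}
```

On this picture, the residual activation on "nurse" after priming "doctor" is what shows up behaviorally as a faster reaction time in priming experiments.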
Compositional Semantics
The meaning of a sentence is built from the meanings of its words and the way they are syntactically combined — the principle of compositionality. "The dog bit the man" means something different from "The man bit the dog" because the syntactic structure assigns different roles to the same words. Compositional semantics explains how we can understand sentences we have never heard before — a novel combination of known words and known structures yields a novel but computable meaning.
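The principle can be made concrete with a minimal sketch, assuming a toy mapping from subject-verb-object structure to thematic roles (the Proposition type and compose function are invented for illustration, not a real semantic formalism): the same three word meanings compose into different propositions depending on which structural position each word occupies.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Proposition:
    """A schematic sentence meaning: a predicate plus thematic roles."""
    predicate: str  # the verb's meaning
    agent: str      # who acts (the subject, in a simple active clause)
    patient: str    # who is acted on (the object)

def compose(subject, verb, obj):
    """Map an SVO parse onto thematic roles. Syntax, not word
    identity, decides who is agent and who is patient."""
    return Proposition(predicate=verb, agent=subject, patient=obj)

p1 = compose("dog", "bite", "man")  # "The dog bit the man"
p2 = compose("man", "bite", "dog")  # "The man bit the dog"
print(p1)        # Proposition(predicate='bite', agent='dog', patient='man')
print(p1 == p2)  # False: same words, different composed meanings
```

Because compose works for any subject, verb, and object it is given, it also illustrates why novel combinations of known words yield computable meanings.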
The N400
Marta Kutas and Steven Hillyard (1980) discovered that semantically unexpected words in sentences elicit a large negative-going event-related potential (ERP) component, the N400, peaking around 400 ms. "He spread the warm bread with socks" produces a much larger N400 to "socks" than the expected completion "butter" would. N400 amplitude indexes the difficulty of semantic integration and has become the primary neural measure of semantic processing, used in thousands of studies of word meaning, sentence comprehension, discourse processing, and semantic memory.
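A common way to summarize the finding is that N400 amplitude grows as a word becomes less predictable in its context. The toy model below makes that relation concrete under loudly labeled assumptions: the CLOZE probabilities and the surprisal-style mapping with its gain parameter are invented for illustration, not fitted to any ERP data.

```python
import math

# Hypothetical cloze probabilities: how often speakers would complete
# "He spread the warm bread with..." with each word. Invented values.
CLOZE = {
    "butter": 0.90,  # highly expected completion
    "socks": 0.01,   # semantically anomalous completion
}

def n400_amplitude(word, gain=5.0):
    """Return a schematic N400 size (arbitrary units; larger means a
    bigger negativity) as a surprisal-like function of cloze
    probability: less expected words yield larger amplitudes."""
    return gain * -math.log(CLOZE.get(word, 0.01))

for w in ("butter", "socks"):
    print(f"...bread with {w}: simulated N400 = {n400_amplitude(w):.2f}")
# butter yields ~0.53, socks ~23.03: the anomalous word dominates
```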