
Paul Smolensky

Born: May 5, 1955 (age 69)
Nationality: American
Alma mater: Harvard University, Indiana University
Known for: Optimality theory, phonology, syntax, language acquisition, learnability, artificial neural networks, restricted Boltzmann machines
Awards: Rumelhart Prize (2005)
Scientific career
Fields: Cognitive science, linguistics, computational linguistics, artificial intelligence
Institutions: Johns Hopkins University; Microsoft Research, Redmond
Website: at JHU, at MSR

Paul Smolensky (born May 5, 1955) is Krieger-Eisenhower Professor of Cognitive Science at the Johns Hopkins University and a Senior Principal Researcher at Microsoft Research in Redmond, Washington.

Along with Alan Prince, in 1993 he developed Optimality Theory, a grammar formalism that provides a formal theory of cross-linguistic typology (or Universal Grammar) within linguistics.[1] Optimality Theory is most widely used in phonology, the subfield to which it was first applied, but it has since been extended to other areas of linguistics such as syntax[2] and semantics.[3]
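
The core evaluation mechanism of Optimality Theory can be sketched in a few lines; the following is a toy illustration (the candidate set and tableau are invented, though NoCoda and Max are standard constraint labels): each constraint assigns violation counts, constraints are strictly ranked, and the optimal candidate is the one whose violation profile is lexicographically best under the ranking.

```python
# Toy sketch of Optimality Theory evaluation (invented tableau):
# ranked, violable constraints; the winner minimizes violations
# lexicographically, so higher-ranked constraints dominate.

def no_coda(cand):
    """NoCoda: one violation if the candidate ends in a consonant."""
    return 0 if cand[-1] in "aeiou" else 1

def max_io(cand):
    """Max (faithfulness): penalize segments deleted from input 'tat'."""
    return len("tat") - len(cand)

def eval_ot(candidates, ranked_constraints):
    """Return the candidate with the lexicographically best profile."""
    return min(candidates, key=lambda c: [k(c) for k in ranked_constraints])

# With NoCoda ranked above Max, deleting the final consonant wins:
print(eval_ot(["tat", "ta"], [no_coda, max_io]))  # prints "ta"
```

Reversing the ranking (Max above NoCoda) would instead select the faithful candidate "tat", which is how constraint re-ranking models cross-linguistic variation.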

Smolensky received the 2005 Rumelhart Prize for his development of the ICS (Integrated Connectionist/Symbolic) Architecture, a model of cognition that aims to unify connectionism and symbolism, with symbolic representations and operations realized as abstractions over the underlying connectionist or artificial neural networks. This architecture rests on Tensor Product Representations,[4] compositional embeddings of symbolic structures in vector spaces. It encompasses the Harmonic Grammar framework, a connectionist-based numerical grammar formalism he developed with Géraldine Legendre and Yoshiro Miyata,[5] which was the predecessor of Optimality Theory. The ICS Architecture builds on Harmony Theory, a formalism for artificial neural networks that introduced the restricted Boltzmann machine architecture. This work, up through the early 2000s, is presented in the two-volume book The Harmonic Mind, written with Géraldine Legendre.[6]

Subsequent work introduced Gradient Symbolic Computation, in which blends of partially activated symbols occupy blends of positions in discrete structures such as trees or graphs.[7] It has been successfully applied to numerous problems in theoretical linguistics where traditional discrete linguistic structures have proved inadequate,[8] as well as to incremental sentence processing in psycholinguistics.[9] In work with colleagues at Microsoft Research and Johns Hopkins, Gradient Symbolic Computation has been embedded in deep neural networks to address a range of problems in reasoning and natural language processing.
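
The binding mechanism behind Tensor Product Representations can be illustrated with a minimal sketch (this is an invented toy example, not Smolensky's own implementation): each symbol ("filler") is bound to a structural position ("role") by an outer product, and the bindings are summed into a single tensor embedding the whole structure.

```python
import numpy as np

# Toy Tensor Product Representation: encode the two-symbol sequence
# "AB" as a sum of filler (x) role outer products.

rng = np.random.default_rng(0)

# Filler vectors for the symbols A and B; orthonormal role vectors
# for the two positions of a two-slot structure.
fillers = {"A": rng.normal(size=4), "B": rng.normal(size=4)}
roles = np.eye(2)  # role 0 = [1, 0], role 1 = [0, 1]

# Bind and superpose:  T = A (x) r0  +  B (x) r1.
T = np.outer(fillers["A"], roles[0]) + np.outer(fillers["B"], roles[1])

# Because the roles are orthonormal, unbinding is exact: multiplying
# the tensor by a role vector recovers the filler in that position.
assert np.allclose(T @ roles[0], fillers["A"])
assert np.allclose(T @ roles[1], fillers["B"])
```

The key design point is that a single fixed-size tensor holds an entire symbolic structure in superposition, which is what lets symbolic operations be recast as linear-algebraic operations on neural activation vectors.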

Among his other important contributions is the notion of local conjunction of linguistic constraints, in which two constraints combine into a single stronger constraint that is violated only when both of its conjuncts are violated within the same specified local domain. Local conjunction has been applied to the analysis of various "super-additive" effects in Optimality Theory. With Bruce Tesar (Rutgers University), Smolensky has also contributed significantly to the study of the learnability of Optimality Theoretic grammars (in the sense of computational learning theory).
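
The logic of local conjunction can be sketched concretely; in this hypothetical example (the constraint names are standard OT labels, but the word and analysis are invented), the conjoined constraint is violated only in domains, here syllables, where both conjuncts are violated together.

```python
# Toy sketch of local conjunction in Optimality Theory (invented
# example): the conjoined constraint C1 &_D C2 incurs a violation
# only in a domain D where BOTH C1 and C2 are violated.

def no_coda(syllable):
    """C1: violated if the syllable ends in a consonant."""
    return syllable[-1] not in "aeiou"

def no_complex_onset(syllable):
    """C2: violated if the syllable begins with two consonants."""
    return len(syllable) > 1 and all(c not in "aeiou" for c in syllable[:2])

def local_conjunction(c1, c2, domains):
    """Count domains in which both conjuncts are violated."""
    return sum(1 for d in domains if c1(d) and c2(d))

word = ["blat", "ta"]  # two syllable-sized domains
# "blat" violates both conjuncts; "ta" violates neither, so the
# conjoined constraint is violated exactly once.
print(local_conjunction(no_coda, no_complex_onset, word))  # prints 1
```

This captures the "super-additive" character of the device: a candidate can violate each conjunct separately in different syllables without violating the conjunction at all.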

Smolensky was a founding member of the Parallel Distributed Processing research group at the University of California, San Diego, and is currently a member of the Center for Language and Speech Processing at Johns Hopkins University and of the Deep Learning Group at Microsoft Research in Redmond, Washington.

References

  1. ^ Prince, Alan; Smolensky, Paul (2002). Optimality Theory: Constraint Interaction in Generative Grammar (Report). Rutgers University. doi:10.7282/T34M92MV. — updated version of July 1993 report
  2. ^ Legendre, Géraldine; Grimshaw, Jane; Vikner, Sten, eds. (2001). Optimality-theoretic syntax. MIT Press. ISBN 978-0-262-62138-0.
  3. ^ Legendre, Géraldine; Putnam, Michael T.; De Swart, Henriette; Zaroukian, Erin, eds. (2016). Optimality-theoretic syntax, semantics, and pragmatics: From uni- to bidirectional optimization. Oxford University Press. ISBN 978-0-19-875711-5.
  4. ^ Smolensky, Paul (November 1990). "Tensor product variable binding and the representation of symbolic structures in connectionist systems". Artificial Intelligence. 46 (1–2): 159–216. doi:10.1016/0004-3702(90)90007-M.
  5. ^ Legendre, Géraldine; Miyata, Yoshiro; Smolensky, Paul (1990). "Harmonic Grammar: A formal multi-level connectionist theory of linguistic well-formedness: Theoretical foundations" (PDF). Proceedings of the Twelfth Annual Conference of the Cognitive Science Society. Cambridge, MA: Lawrence Erlbaum. pp. 388–395. Also issued as Report CU-CS-465-90, Computer Science Department, University of Colorado at Boulder.
  6. ^
    • Smolensky, Paul; Legendre, Géraldine (2006). The Harmonic Mind: From Neural Computation to Optimality-Theoretic Grammar. Vol. 1: Cognitive Architecture. Cambridge, MA: MIT Press.
    • Smolensky, Paul; Legendre, Géraldine (2006). The Harmonic Mind: From Neural Computation to Optimality-Theoretic Grammar. Vol. 2: Linguistic and Philosophical Implications. Cambridge, MA: MIT Press.
  7. ^ Smolensky, Paul; Goldrick, Matthew; Mathis, Donald (2014). "Optimization and quantization in gradient symbol systems: A framework for integrating the continuous and the discrete in cognition". Cognitive Science. 38 (6): 1102–1138. doi:10.1111/cogs.12047. PMID 23802807 – via Rutgers Optimality Archive.
  8. ^ Smolensky, Paul; Rosen, Eric; Goldrick, Matthew (2020). "Learning a gradient grammar of French liaison". Proceedings of the 2019 Annual Meeting on Phonology. 8. doi:10.3765/amp.v8i0.4680.
  9. ^ Cho, Pyeong Whan; Goldrick, Matthew; Smolensky, Paul (2017). "Incremental parsing in a continuous dynamical system: Sentence processing in Gradient Symbolic Computation". Linguistics Vanguard. 3 (1). doi:10.1515/lingvan-2016-0105. S2CID 67362174.