  
**Distributional Semantics – A Practical Introduction**
[[http://esslli2016.unibz.it/|{{ :course:esslli2018:esslli2016_logo_outline.png?150|ESSLLI 2016 (Bolzano)}}]]
[[http://esslli2018.folli.info/|{{ :course:esslli2018:esslli2018_logo.jpg?260|ESSLLI 2018 (Sofia)}}]]
\\
//Introductory course at [[http://esslli2016.unibz.it/?page_id=242|ESSLLI 2016]], Bolzano, August 15–19, 2016 and [[http://esslli2018.folli.info/distributional-semantics-a-practical-introduction/|ESSLLI 2018]], Sofia, August 6–10, 2018//
  
  * [[course:esslli2018:schedule|Course schedule & handouts]]
  * [[course:material|Software & data sets]]
  * [[course:bibliography|Bibliography & links]]
  
===== Course description =====
  
Distributional semantic models (DSMs) – also known as “word space” or “distributional similarity” models – are based on the assumption that the meaning of a word can (at least to a certain extent) be inferred from its usage, i.e. its distribution in text. These models therefore build semantic representations dynamically – in the form of high-dimensional vector spaces – through a statistical analysis of the contexts in which words occur. DSMs are a promising technique for overcoming the lexical acquisition bottleneck through unsupervised learning, and their distributed representations provide a cognitively plausible, robust and flexible architecture for the organisation and processing of semantic information.

This course aims to equip participants with the background knowledge and skills needed to build different kinds of DSM representations – from traditional “count” models to neural word embeddings – and to apply them to a wide range of tasks. It is accompanied by practical exercises with the user-friendly [[http://www.r-project.org/|R]] software package [[http://wordspace.r-forge.r-project.org/|wordspace]] and various pre-built models.
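To illustrate the idea behind the traditional “count” models mentioned above, the sketch below builds a word–context co-occurrence matrix from a toy corpus and compares words by the cosine similarity of their context vectors. This is a minimal illustration under assumed toy data, not the course's R/wordspace setup:

```python
from collections import Counter
import math

# Toy corpus (hypothetical example, not course data)
corpus = [
    "the cat chased the mouse",
    "the dog chased the cat",
    "the mouse ate the cheese",
    "the dog ate the bone",
]

WINDOW = 2  # symmetric context window size

# Count co-occurrences: target word -> Counter of context words
counts = {}
for sentence in corpus:
    tokens = sentence.split()
    for i, word in enumerate(tokens):
        ctx = counts.setdefault(word, Counter())
        for j in range(max(0, i - WINDOW), min(len(tokens), i + WINDOW + 1)):
            if j != i:
                ctx[tokens[j]] += 1

vocab = sorted(counts)  # context dimensions of the vector space

def vector(word):
    """One row of the word-context co-occurrence matrix."""
    return [counts[word][c] for c in vocab]

def cosine(u, v):
    """Cosine similarity between two context vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# "cat" and "dog" share contexts ("the", "chased"), so their vectors
# come out more similar than those of "cat" and "cheese"
sim_cat_dog = cosine(vector("cat"), vector("dog"))
sim_cat_cheese = cosine(vector("cat"), vector("cheese"))
```

Real models add association weighting and dimensionality reduction, and are trained on corpora of millions or billions of words rather than four sentences.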