/* course:esslli2021:start — last revised 2022/08/11 12:22 by schtepf */
====== Hands-on Distributional Semantics (ESSLLI 2021 / 2022) ======
**Hands-on Distributional Semantics – From first steps to interdisciplinary applications**
[[https://
[[https://
\\
//
\\

**Hands-on Distributional Semantics for Linguistics using R**
\\
//
\\

  * [[course:
{{:
  * update of all materials
===== Course description =====
Distributional semantic models (DSM) – also known as “word space” or “distributional similarity” models – are based on the assumption that the meaning of a word can (at least to a certain extent) be inferred from its usage, i.e. its distribution in text. Therefore, these models dynamically build semantic representations through a statistical analysis of the contexts in which words occur. DSMs are a promising technique for solving the lexical acquisition bottleneck by unsupervised learning, and their distributed representation provides a cognitively plausible, robust and flexible architecture for the organisation and processing of semantic information.
In this introductory course we will highlight the interdisciplinary potential of DSM beyond standard semantic similarity tasks, with applications in cognitive modeling and theoretical linguistics. This course aims to equip participants with the background knowledge and skills needed to build different kinds of DSM representations and apply them to a wide range of tasks.
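The core idea – that words occurring in similar contexts receive similar vector representations – can be illustrated with a minimal sketch. The course materials use R; the snippet below is a language-neutral Python illustration, and the toy corpus is invented for demonstration purposes only:

```python
from collections import Counter
from math import sqrt

# Invented toy corpus (not from the course materials)
corpus = [
    "the cat chases the mouse",
    "the dog chases the cat",
    "the mouse eats cheese",
    "the dog eats meat",
]

# Count, for each target word, how often every other word
# co-occurs with it in the same sentence (bag-of-words context)
vectors = {}
for sentence in corpus:
    tokens = sentence.split()
    for i, target in enumerate(tokens):
        context = Counter(tokens[:i] + tokens[i + 1:])
        vectors.setdefault(target, Counter()).update(context)

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dims = set(u) | set(v)
    dot = sum(u[d] * v[d] for d in dims)
    norm_u = sqrt(sum(x * x for x in u.values()))
    norm_v = sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v)

# "cat" and "dog" share contexts ("the ... chases"), so their
# distributional vectors end up close to each other
print(round(cosine(vectors["cat"], vectors["dog"]), 3))
```

Real DSMs differ from this sketch mainly in scale (millions of tokens, tens of thousands of dimensions) and in the additional weighting and dimensionality-reduction steps covered on Days 2 and 3.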
**Lecturers:
===== Organizational information =====
Please make sure you have up-to-date versions of **[[https://
Additional instructions will be given in the first session on Monday. In particular, you will be asked to download and install the ''
/* We will answer questions during lectures and in the afternoon via the course' */
===== Schedule & handouts =====
=== Day 1: Introduction ===

[[http://
  * motivation and geometric intuition
  * distributional vs. semantic similarity
  * outline of the course
  * practice: //software setup, first steps with the ''
=== Day 2: Building a DSM ===

[[http://
[[http://
R code: [[http://
bonus material: [[http://
  * formal definition of a DSM, taxonomy of parameters
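One central parameter when building a DSM is the feature-scoring function applied to the raw co-occurrence counts; a widely used choice is positive pointwise mutual information (PPMI). The sketch below is a Python illustration (the course itself works in R), and the count table is invented:

```python
from math import log2

# Invented co-occurrence counts: rows are target words,
# columns are context words (not from the course data sets)
counts = {
    "cat": {"chases": 2, "the": 4, "eats": 0},
    "dog": {"chases": 1, "the": 3, "eats": 1},
}

total = sum(f for row in counts.values() for f in row.values())
row_sums = {w: sum(row.values()) for w, row in counts.items()}
col_sums = {}
for row in counts.values():
    for c, f in row.items():
        col_sums[c] = col_sums.get(c, 0) + f

def ppmi(w, c):
    """Positive pointwise mutual information for a target/context pair.

    PMI compares the observed co-occurrence frequency with the frequency
    expected under independence; negative values are clamped to zero.
    """
    f = counts[w][c]
    if f == 0:
        return 0.0
    pmi = log2((f * total) / (row_sums[w] * col_sums[c]))
    return max(pmi, 0.0)
```

PPMI down-weights uninformative high-frequency contexts (such as function words) and highlights contexts that co-occur with a target more often than chance would predict.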
=== Day 3: Which aspects of meaning does a DSM capture? ===

[[http://
[[http://
R code: [[http://
  * evaluation: conceptual coordinates
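A standard way to evaluate which aspects of meaning a DSM captures is to correlate model similarity scores with human similarity judgements, typically via a rank correlation. A self-contained Python sketch (the ratings and model scores are invented, not taken from any real benchmark; the course's own evaluation code is in R):

```python
# Invented human ratings and model cosine scores for three word pairs
human = {("cat", "dog"): 8.0, ("cat", "mouse"): 6.0, ("dog", "cheese"): 2.0}
model = {("cat", "dog"): 0.83, ("cat", "mouse"): 0.52, ("dog", "cheese"): 0.10}

def ranks(values):
    """Rank positions (1 = smallest); assumes no ties for simplicity."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    """Spearman's rho via the classic no-ties formula."""
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(ranks(x), ranks(y)))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

pairs = sorted(human)
rho = spearman([human[p] for p in pairs], [model[p] for p in pairs])
print(rho)  # the model ranks all three pairs in the same order as the humans
```

Rank correlation is preferred over Pearson here because cosine similarities and rating scales are not linearly related; only the ordering of pairs matters.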
=== Day 4: DS beyond NLP – Linguistic theory ===

[[http://
[[http://
R code: [[http://
bonus material: [[http://
  * linguistic exploitation of DSM representations
=== Day 5: DS beyond NLP – Cognitive modelling ===

[[http://
[[http://
R code: [[http://
bonus task: [[http://
bonus material: [[http://
  * DSMs for cognitive modelling
  * predicting free associations with DSMs
  * practice: //combining DSMs with first-order co-occurrence for the FAST task//
- | |||
- | <!-- | ||
- | |||
- | === Day 1: Introduction === | ||
- | |||
- | {{: | ||
- | |||
- | * motivation and brief history of distributional semantics | ||
- | * common DSM architectures & prototypical applications | ||
- | * first practical exercises with the '' | ||
- | |||
- | |||
- | === Day 2: The parameters of a DSM === | ||
- | |||
- | {{: | ||
- | |||
- | |||
- | * taxonomy of DSM parameters: context representation, | ||
- | * overview of common parameter settings & best-practice recommendations | ||
- | * practical exercises: building DSMs and exploring their parameters | ||
- | |||
- | |||
- | === Day 3: Applications and evaluation === | ||
- | |||
- | {{: | ||
- | |||
- | * attributional and relational similarity, clustering and semantic categorization, | ||
- | /* * supervised & unsupervised classification based on DSM vectors */ | ||
- | * insights from recent parameter evaluation studies | ||
- | * practical exercises: implementation and evaluation of selected tasks | ||
- | |||
- | |||
- | === Day 4: Elements of matrix algebra === | ||
- | |||
- | {{: | ||
- | |||
- | * basic matrix and vector operations, orthogonal projection & dimensionality reduction | ||
- | * singular value decomposition (SVD) | ||
- | * practical exercises: roll your own DSM with matrix operations | ||
- | |||
- | |||
- | === Day 5: Making sense of DSMs === | ||
- | |||
- | * mathematical properties of and relations between different types of DSM | ||
- | * singular value decomposition (SVD) as a latent class model | ||
- | * comparison with neural vector embeddings | ||
- | |||
- | --> | ||