<html>
<head>
<meta http-equiv="content-type" content="text/html; charset=ISO-8859-1">
</head>
<body text="#000000" bgcolor="#FFFFFF">
<meta http-equiv="content-type" content="text/html;
charset=ISO-8859-1">
<br>
<br>
<b style="font-weight:normal;"
id="docs-internal-guid-938af4d8-99a2-1242-7b5e-5b0ded273321">
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;text-align:
center;"><span style="font-size:19px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:bold;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">CALL
FOR PARTICIPATION</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;text-align:
center;"><span style="font-size:19px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:bold;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;"><br>
EACL 2014 Tutorial on Structured Sparsity in Natural Language
Processing: </span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;text-align:
center;"><span style="font-size:19px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:bold;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">Models,
Algorithms and Applications</span><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
</span></p>
<br>
<span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;"></span>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;text-align:
center;"><a href="http://www.cs.cmu.edu/%7Eafm"
style="text-decoration:none;"><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">André
F. T. Martins</span></a><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">,
</span><a href="http://www.lx.it.pt/%7Emtf"
style="text-decoration:none;"><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">Mário
A. T. Figueiredo</span></a><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">,
</span><a href="http://www.cs.cmu.edu/%7Enasmith"
style="text-decoration:none;"><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">Noah
A. Smith</span></a><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">,
and Dani Yogatama</span></p>
<br>
<span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;"></span>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;text-align:
justify;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">This
tutorial will cover recent advances in sparse modeling, with
diverse applications in natural language processing (NLP). A
sparse model is one that uses a relatively small number of
features to map an input to an output, such as a label
sequence or a parse tree. Among the advantages of sparsity are
compactness and interpretability; indeed, sparsity is
currently a major theme in statistics, machine learning, and
signal processing. The goal of sparsity can be seen as a
continuation of the earlier goals of feature selection and,
hence, model selection (Della Pietra et al., 1997; Guyon and
Elisseeff, 2003; McCallum, 2003).</span></p>
<br>
<span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;"></span>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;text-align:
justify;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">This
tutorial will focus on methods that embed sparse model
selection into the parameter estimation problem. In such
methods, learning is carried out by minimizing a regularized
empirical risk functional composed of two terms: a "loss
term," which measures the goodness of fit to the data (e.g.,
log loss or hinge loss), and a "regularizer term," which is
designed to promote sparsity.</span></p>
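As a concrete instance of this objective (an illustrative pure-Python sketch on a toy dataset, not code from the tutorial), the following evaluates an L1-regularized logistic loss:

```python
import math

def log_loss(w, X, y):
    """Average logistic loss over examples; labels y are in {-1, +1}."""
    total = 0.0
    for x_i, y_i in zip(X, y):
        margin = y_i * sum(w_j * x_j for w_j, x_j in zip(w, x_i))
        total += math.log(1.0 + math.exp(-margin))
    return total / len(X)

def l1_penalty(w):
    """The regularizer term: an L1 norm, penalizing weights individually."""
    return sum(abs(w_j) for w_j in w)

def regularized_risk(w, X, y, lam):
    """Loss term plus lam times the regularizer term."""
    return log_loss(w, X, y) + lam * l1_penalty(w)

# Toy data: two 2-dimensional examples.
X = [[1.0, 0.0], [0.0, 1.0]]
y = [1, -1]
print(regularized_risk([0.0, 0.0], X, y, lam=0.1))  # log(2): at w = 0 the penalty is zero
```

Learning then amounts to minimizing `regularized_risk` over `w`; the non-smoothness of the penalty at zero is what drives weights exactly to zero.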
<br>
<span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;"></span>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;text-align:
justify;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">The
simplest example is L1-norm regularization (Tibshirani, 1996),
which penalizes weight components individually and has been
explored in various NLP applications (Kazama and Tsujii, 2003;
Goodman, 2004). More sophisticated regularizers, which use
mixed norms over groups of weights, can promote
"structured" sparsity: they favor sparsity patterns
that are compatible with a priori knowledge about the
structure of the feature space. Regularizers of this kind have
been proposed in the statistics and signal processing
literature (Yuan and Lin, 2006; Zhao et al., 2009; Bach et
al., 2011) and are a recent topic of research in NLP
(Eisenstein et al., 2011; Martins et al., 2011; Das and Smith,
2012; Nelakanti et al., 2013). Some regularizers are even able
to encourage structured sparsity without prior knowledge
of this structure (Bondell et al., 2007; Zeng et al.,
2013). Sparsity-inducing regularizers require specialized
optimization routines for learning (Bach et al.,
2011; Wright et al., 2009; Xiao, 2009; Langford et al., 2009).</span></p>
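These regularizers are typically handled through their proximal operators. A minimal pure-Python sketch (illustrative only, not the tutorial's code): soft-thresholding for the L1 norm, and its group analogue, which zeroes a whole group of weights at once:

```python
import math

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1: shrinks each weight toward zero,
    setting it exactly to zero when its magnitude is at most t."""
    out = []
    for v_j in v:
        if v_j > t:
            out.append(v_j - t)
        elif v_j < -t:
            out.append(v_j + t)
        else:
            out.append(0.0)
    return out

def group_soft_threshold(v, t):
    """Proximal operator of t*||.||_2 on one group of weights: shrinks the
    group's L2 norm, zeroing the entire group when the norm is at most t."""
    norm = math.sqrt(sum(v_j * v_j for v_j in v))
    if norm <= t:
        return [0.0] * len(v)
    scale = 1.0 - t / norm
    return [scale * v_j for v_j in v]

print(soft_threshold([1.5, -0.2, 0.75], 0.5))   # [1.0, 0.0, 0.25]: feature-level zeros
print(group_soft_threshold([0.3, -0.4], 1.0))   # [0.0, 0.0]: the whole group is dropped
```

The group operator is what produces structured sparsity: weights are discarded in blocks that mirror the assumed structure of the feature space, rather than one at a time.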
<br>
<span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;"></span>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;text-align:
justify;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">The
tutorial will consist of three parts: (1) how to formulate the
problem, i.e., how to choose the right regularizer for the
kind of sparsity pattern intended; (2) how to solve the
optimization problem efficiently; and (3) examples of the use
of sparsity within natural language processing problems. <br>
<br>
</span></p>
<span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;"></span><br>
<span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;"></span>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;"><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:italic;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">Tutorial
outline: </span></p>
<br>
<span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;"></span>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;"><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">1.
Introduction (30 minutes):</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- What is sparsity?</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Why sparsity is often desirable in NLP</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Feature selection: wrappers, filters, and embedded methods</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- What has been done in other areas: the Lasso and
group-Lasso, compressive sensing, and recovery guarantees</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
49.5pt;text-indent: -9pt;"><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">-
Theoretical and practical limitations of previous methods when
applied to typical NLP problems</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Beyond cardinality: structured sparsity</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;"><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">2.
Group-Lasso and Mixed-Norm Regularizers (45 minutes):</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Selecting columns in a grid-shaped feature space</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Examples: multiple classes, multi-task learning, multiple
kernel learning</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Mixed L2/L1 and L∞/L1 norms: the group Lasso</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Non-overlapping groups</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Example: feature template selection</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Tree-structured groups</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- The general case: a DAG</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Coarse-to-fine regularization</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Unknown structure: feature grouping</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Open problems</span></p>
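To make the mixed-norm idea above concrete, here is a small illustrative computation of an L2/L1 group-lasso penalty over feature templates (the template and feature names are made up for illustration):

```python
import math

# Hypothetical weights indexed by (feature template, feature value).
weights = {
    ("word", "dog"): 0.7, ("word", "cat"): -0.1,
    ("pos",  "NN"):  0.0, ("pos",  "VB"):  0.0,
}

def mixed_l2_l1(weights):
    """Group-lasso penalty: L2 norm within each group (template),
    summed across groups -- i.e., an L1 norm over the group norms."""
    groups = {}
    for (template, _), w in weights.items():
        groups.setdefault(template, []).append(w)
    return sum(math.sqrt(sum(w * w for w in ws)) for ws in groups.values())

print(mixed_l2_l1(weights))  # sqrt(0.49 + 0.01) + 0 = sqrt(0.5), approx. 0.7071
```

Because the "pos" group contributes nothing once all of its weights are zero, minimizing this penalty encourages dropping entire templates, which is exactly the feature template selection use case listed above.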
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;"><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">Coffee
Break (15 minutes)</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;"><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">3.
Optimization Algorithms (45 minutes):</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Non-smooth optimization: limitations of subgradient
algorithms</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Quasi-Newton methods: OWL-QN</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Proximal gradient algorithms: iterative soft-thresholding,
forward-backward and other splittings</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Computing proximal steps</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Other algorithms: FISTA, SpaRSA, ADMM, Bregman iterations</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Convergence rates</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Online algorithms: limitations of stochastic subgradient
descent</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Online proximal gradient algorithms</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Managing general overlapping groups</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Memory footprint, time/space complexity, etc.</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- The “Sparseptron” algorithm and debiasing</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Open problems (e.g., non-convex objectives)</span></p>
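As an illustration of the iterative soft-thresholding idea listed above, here is a minimal ISTA loop for the Lasso (a least-squares toy in pure Python, not code from the tutorial; the step size is assumed small enough for convergence):

```python
def ista_lasso(X, y, lam, step, iters=500):
    """Iterative soft-thresholding for min_w 0.5*||Xw - y||^2 + lam*||w||_1.
    Each iteration takes a gradient step on the smooth loss (forward step)
    followed by the L1 proximal, i.e. shrinkage, step (backward step)."""
    n_features = len(X[0])
    w = [0.0] * n_features
    for _ in range(iters):
        # Gradient of the smooth term: X^T (Xw - y).
        resid = [sum(w_j * x_j for w_j, x_j in zip(w, row)) - y_i
                 for row, y_i in zip(X, y)]
        grad = [sum(r * row[j] for r, row in zip(resid, X))
                for j in range(n_features)]
        # Forward (gradient) step, then backward (proximal/shrinkage) step.
        z = [w_j - step * g_j for w_j, g_j in zip(w, grad)]
        w = [max(abs(z_j) - step * lam, 0.0) * (1 if z_j >= 0 else -1)
             for z_j in z]
    return w

# Orthonormal toy design: the solution is soft-thresholding of X^T y.
X = [[1.0, 0.0], [0.0, 1.0]]
y = [2.0, 0.1]
print(ista_lasso(X, y, lam=0.5, step=1.0, iters=100))  # [1.5, 0.0]
```

The second weight is driven exactly to zero rather than merely close to it, which is the behavior a subgradient method cannot reproduce and the reason proximal algorithms are preferred for sparsity-inducing regularizers.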
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;"><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">4.
Applications (30 minutes):</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Sociolinguistic association discovery</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Sequence problems: named entity recognition, chunking</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Multilingual dependency parsing</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;margin-left:
36pt;"><span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">
- Lexicon expansion</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;"><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">5.
Closing Remarks and Discussion (15 minutes)<br>
</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;"><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;"><br>
</span></p>
</b><span style="font-weight:normal;">
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;"><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;"><br>
</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;"><br>
<span style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;"></span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;"><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;">Looking
forward to seeing you in the tutorial!<br>
</span></p>
<p dir="ltr"
style="line-height:1;margin-top:0pt;margin-bottom:0pt;"><span
style="font-size:15px;font-family:'Times New
Roman';color:#000000;background-color:transparent;font-weight:normal;font-style:normal;font-variant:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;"><br>
André, Mário, Noah, Dani<br>
</span></p>
</span><br class="Apple-interchange-newline">
</body>
</html>