== Deadline Extension: ICMR'14 Special Session User-centric Video Search and Hyperlinking ==

Due to the large number of requests, the ICMR'14 special session
"User-centric Video Search and Hyperlinking" deadline has been
extended to 5th of January 2014.

Changed deadlines:
Submission Abstract => December 15, 2013 (unchanged)
Submission Full Paper => Extended to 5th of January 2014

Note about anonymity:
All papers should be prepared and submitted according to the ICMR'14
guidelines, which use a double-blind review process. Authors should
not know the names of the reviewers of their papers, and reviewers
should not know the name(s) of the authors. Please prepare your
paper in a way that preserves the anonymity of the authors, e.g.

* Do not put your name(s) under the title.
* Avoid using phrases such as “our previous work” when referring to
  earlier publications by the authors.
* Remove information that may identify the authors in the
  acknowledgments (e.g., co-workers and grant IDs).
* Check supplemental material (e.g., titles in the video clips, or
  supplementary documents) for information that may reveal the
  authors' identity.
* Avoid providing links to websites that identify the authors.

Abstract and keywords:
The abstract and the keywords form the primary source for assigning
papers to reviewers. Make sure that they form a concise and complete
summary of your paper, with sufficient information to let someone
who does not read the full paper know what it is about.

Maximum paper length:
Each regular paper should not be longer than 8 pages.

Original call for papers:

== User-centric Video Search and Hyperlinking ==
(http://www.icmr2014.org/?page_id=307)

Recent years have seen extensive interest in video search focusing
on the retrieval of visual shot-level units, and in the linking of
multimedia documents by clustering faces or other properties, on the
assumption that someone wants to group those documents. While there
has been much progress in developing methods for improved search
effectiveness, this research generally focuses on technical aspects
of retrieval.

This special session will focus on video search from a user-centric
perspective and targets a real-world use-case scenario. We move
beyond the search for textual information or relevant video content
in response to a user search query, and focus on search through
video content supported by navigation in the video collection using
inter- and cross-item hyperlinks. The search and linking may be
based on both spoken and visual content, targeting diversity in the
results to satisfy the potential variety of user interests.

User studies suggest that users' interests are multimodal in nature.
Successful search and hyperlinking to relevant content thus requires
the creation and exploitation of multi-modal queries combining
visual, audio and textual features.
This special session aims to bring together researchers working on
video search and hyperlinking to find solutions for tasks motivated
by real-world use-case scenarios.

We particularly encourage papers that present methods covering the
complete search and hyperlinking use-case across all content
modalities.

Ideal submissions will cover some or all of the following aspects
in close connection to the search and hyperlinking use-case:

* Automatic multi-modal query generation
* Models to identify video segments that are usable for linking as
  source anchors and link targets
* Methods for creation of effective links
* Evaluation of user-centric search and hyperlinking
* User studies related to search and hyperlinking scenarios

Organisation:

* Maria Eskevich (Dublin City University, Ireland) meskevich@computing.dcu.ie
* Dr. Robin Aly (University of Twente, The Netherlands) r.aly@utwente.nl
* Dr. Roeland Ordelman (University of Twente, The Netherlands) roeland.ordelman@utwente.nl
* Dr. Gareth J.F. Jones (Dublin City University, Ireland) gjones@computing.dcu.ie
--
Maria Eskevich
PhD student
L2.08
School of Computing
Dublin City University
Dublin 9, Ireland

http://nclt.computing.dcu.ie/~meskevich/
http://ie.linkedin.com/pub/maria-eskevich/17/520/741

tel (Ireland): +353 87 14 23 101
tel (Russia): +7 921 915 52 54

e-mail: maria.eskevich@gmail.com,
        meskevich@computing.dcu.ie,
        maria.eskevich2@mail.dcu.ie,
        maria.eskevich@yandex.ru