[Corpora-List] MediaEval 2014 Multimedia Benchmark: Call for Task Proposals

M. Larson m.a.larson at tudelft.nl
Sun Nov 10 17:17:00 UTC 2013


MediaEval 2014 Call for Task Proposals
MediaEval Multimedia Benchmark
http://www.multimediaeval.org/mediaeval2014
***Task proposal submission deadline: 21 December 2013***
__________________________________________________________

MediaEval is a benchmarking initiative dedicated to evaluating new algorithms for multimedia access and retrieval. It emphasizes the 'multi' in multimedia and focuses on human and social aspects of multimedia tasks.

MediaEval invites researchers in academia and industry to propose tasks to run in the 2014 benchmarking season.

The proposal should contain the following elements:
- Name of the task,
- Short description of the use scenario underlying the task (Who would ultimately use the technology developed to address this task?),
- Short description of the task (What is the problem that task participants will be expected to solve?),
- Description of the data to be used, including a statement on how it is to be licensed (Note that MediaEval encourages the use of Creative Commons data wherever possible.),
- Description of how the ground truth will be obtained,
- Statement of the evaluation metric and/or methodology,
- Brief statement of how the task is different from existing tasks in other benchmarks and/or how it extends the previous year’s MediaEval task (if applicable),
- Brief statement of why the task is a MediaEval task (Does the task involve a strong social or human component?),
- Examples (2-3) of recommended reading (i.e., references to papers that you would expect participants to have read before attempting the task),
- Name and contact information for the members of the proposing team (Please include a couple of sentences about the composition/history of the team. New collaborations are explicitly encouraged.),
- Summary (200-300 words) of the motivation, task, data and evaluation in a form suitable for the survey (i.e., a condensed version including the most important points from above),
- A list of 4-7 questions that you would like to ask potential participants about the task (the survey asks whether people are interested in the task and also gathers their input on certain task design decisions).

For the last two points, it is helpful to refer to last year's survey to see the format of the task description and the type of questions.
http://www.multimediaeval.org/docs/MediaEval2013_SurveyForm_FInal.pdf

There is no particular length specification for the proposal; some tasks will require more explanation than others. However, proposals are easier to manage if they are concise: in general, they should not exceed two pages.

Please email your proposal (as a .pdf) to Martha Larson m.a.larson at tudelft.nl and Gareth Jones gareth.jones at computing.dcu.ie by December 21, 2013.

__________________________________________________________

Task proposals are accepted on the basis of the existence of a community of task supporters (i.e., researchers who are interested and would plan to participate in the task). Support is determined using a survey, which is circulated widely to the multimedia research community at the beginning of the year (January 2014). Task decisions are made in mid-February. Tasks must also be viable given their design and the resources available to the task organizing team.

We encourage task proposers to join forces with colleagues from other institutions and other projects to create an organizing team large enough to bear the burden of data set generation, results evaluation, and working notes paper review. Please contact Martha Larson m.a.larson at tudelft.nl if you have questions about task organization, or if you would like to be put in contact with other people with similar interests who could join together to form a task organizing team.

MediaEval has been experiencing steady growth since it was founded in 2008 as a track called "VideoCLEF" within the CLEF benchmark campaign. In 2010, it became an independent benchmark and in 2012 it ran for the first time as a fully "bottom-up benchmark", meaning that it is organized for the community, by the community, independently of a "parent" project. The MediaEval benchmarking season culminates with the MediaEval workshop. Participants come together at the workshop to present and discuss their results, build collaborations, and develop future task editions or entirely new tasks. Past working notes proceedings of the workshop include:

MediaEval 2012: http://ceur-ws.org/Vol-807/
MediaEval 2013: http://ceur-ws.org/Vol-1043/

Example tasks that have run in past years are:
- Placing Task: Predict the geo-coordinates of user-contributed photos.
- Tagging Task: Automatically assign tags to user-generated videos.
- Spoken Web Search: Search FOR audio content WITHIN audio content USING an audio content query.
- Search and Hyperlinking: Multi-modal search and automated hyperlinking of user-generated and commercial video.
- Social Event Detection: Find multimedia items related to a particular event within a social multimedia collection.
- Violent Scenes Detection Task: Automatically detect violence in movies.

We expect the MediaEval 2014 workshop to be held in October 2014 in Europe, possibly returning to the venue of the MediaEval 2013 workshop in Barcelona. For more information on the MediaEval Multimedia benchmark, please visit http://www.multimediaeval.org/ or contact Martha Larson m.a.larson at tudelft.nl.

Martha Larson -- m.a.larson at tudelft.nl
Multimedia Information Retrieval Lab
Delft University of Technology

