[Corpora] [Corpora-List] Automatic evaluation of human translation

John F Sowa sowa at bestweb.net
Mon Oct 27 17:14:15 UTC 2014


Since nobody responded to this question in the past two weeks,
I thought I might make some suggestions.

On 10/10/2014 6:15 AM, Emad Mohamed wrote:
> Do you know of any work on the automatic evaluation of human translation?
> ...
> Typical scenario from my school:
> 200+ students per class in a translation studies course and one
> instructor. If the students can upload their translations and be given
> automatic feedback, this will greatly facilitate the process.

One possibility is to adapt software used for evaluating answers
to essay questions.  For example,

  1. An expert on the subject writes a "gold standard" essay on
     the same topic.  The gold essay covers the points that the
     students should make for an A+ score.

  2. The software checks whether the expected points (in related
     terminology) are mentioned in students' essays, and it uses
     various methods for evaluating spelling, grammar, style, etc.

In addition to the methods for evaluating essays, MT methods for
aligning sentences and evaluating possible word and phrase choices
could be adapted to this task.
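For instance, once a student sentence is aligned with the corresponding
reference sentence, the word and phrase choices could be scored with
clipped n-gram precision in the spirit of MT metrics such as BLEU.
A minimal sketch (the function names are illustrative, not from any
library):

```python
# Score a student's translation against a reference translation
# using clipped unigram/bigram precision, as in BLEU.

from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def clipped_precision(candidate, reference, n):
    """Fraction of candidate n-grams that also occur in the reference,
    with each n-gram's count clipped at its reference count."""
    cand = Counter(ngrams(candidate, n))
    ref = Counter(ngrams(reference, n))
    if not cand:
        return 0.0
    overlap = sum(min(c, ref[g]) for g, c in cand.items())
    return overlap / sum(cand.values())

reference = "the cat sat on the mat".split()
student = "the cat is on the mat".split()
p1 = clipped_precision(student, reference, 1)   # 5/6
p2 = clipped_precision(student, reference, 2)   # 3/5
```

A full metric would combine several n-gram orders and allow multiple
reference translations, since students may produce many acceptable variants.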

The instructor could also list erroneous choices that should be
avoided.  When running the software, the instructor could check
some evaluations, modify the list of preferred and erroneous
choices, and rerun all evaluations with the updated list.
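The checking-and-rerunning loop could be as simple as matching the
instructor's lists against each submission.  A sketch, with invented
example phrases:

```python
# Flag instructor-listed erroneous choices in a student translation,
# and report which preferred choices were used.  The instructor edits
# the two lists and reruns this over all submissions.

def check_choices(translation, preferred, erroneous):
    text = translation.lower()
    return {
        "errors_found": [e for e in erroneous if e in text],
        "preferred_used": [p for p in preferred if p in text],
    }

preferred = ["commit a crime"]
erroneous = ["do a crime", "make a crime"]

report = check_choices("He did not commit a crime.", preferred, erroneous)
# {'errors_found': [], 'preferred_used': ['commit a crime']}
```

Substring matching is crude; a real tool would match on lemmas or
allow inflected forms, but the instructor workflow is the same.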

Has anybody implemented anything along these lines?  Or anything
else that might be useful for developing such software?

Software designed for evaluating human translations could also
be used to evaluate MT systems -- and vice versa.

John

_______________________________________________
UNSUBSCRIBE from this page: http://mailman.uib.no/options/corpora
Corpora mailing list
Corpora at uib.no
http://mailman.uib.no/listinfo/corpora
