Ensuring quality in collaborative translation can be a challenge. One way to approach this issue is to vet contributors instead of individual contributions. In other words, make sure that only good contributors are allowed to create or modify content on your site.
But how can you tell who the good contributors are?
One way to do this is to screen contributors beforehand, by requiring that they pass an entry exam before they are allowed to create or modify content.
Once potential contributors have passed the exam, they are usually given free rein, and can create or modify content with no, or only minimal, quality checks.
Note that it's important that the content of the Entry Exam be specific to the kind of content that the contributor will have to deal with. For example, if you are managing a translation crowdsourcing community for a non-profit humanitarian aid organization, then the documents that applicants are asked to translate in the exam should pertain to humanitarian aid.
Links to related patterns
- Automatic Reputation Management is another way of screening contributors. The main difference is that it screens them after the fact, whereas Entry Exam screens them beforehand.
- Kiva and Translators Without Borders are two not-for-profit humanitarian aid organizations that have used this practice.