HUMAN TRANSLATION (HT) INSTRUCTIONS v. 2.4
Contents
HUMAN TRANSLATION (HT) INSTRUCTIONS
PROJECT REFERENCE MATERIAL
TRANSLATION GUIDELINES
WEBAPP GUIDELINES
GENERAL QUALITY GUIDELINES
QUALITY CRITERIA
VIEW PERFORMANCE AND COUNTER-FEEDBACK
VERSION UPDATES
AIM: to train a Machine Translation (MT) engine for ISAAC. You will perform Human Translation of content into multiple languages, and this output will be used to train the MT engine and produce automatic translations.
TASK DESCRIPTION: this task provides a source sentence that you have to translate into
the target language. Source cannot be edited. The resulting translation must be accurate,
fluent, and read naturally in the target language. Avoid literal translations. Read your target
sentence on its own and double check that it sounds as though a native speaker said/wrote
that sentence. Depending on the project, 1 or 2 translations for the same source string may be
necessary.
TARGET LANGUAGES: the only target languages requested and allowed in this project
are the ones listed below. Make sure you translate into the correct target language and flavor
(example: Latin American Spanish is not allowed, Canadian French is not allowed, etc.)
ar-AE = Arabic from United Arab Emirates
de-DE = German from Germany
el-GR = Greek from Greece
en-UK = English from United Kingdom
en-US = English from United States
es-ES = Spanish from Spain
fr-FR = French from France
hi-IN = Hindi from India
id-ID = Indonesian from Indonesia
it-IT = Italian from Italy
ja-JP = Japanese from Japan
ko-KR = Korean from Korea
nl-NL = Dutch from Netherlands
pl-PL = Polish from Poland
pt-BR = Portuguese from Brazil
ru-RU = Russian from Russia
sv-SE = Swedish from Sweden
th-TH = Thai from Thailand
tr-TR = Turkish from Turkey
vi-VN = Vietnamese from Vietnam
zh-CN = Simplified Chinese from China
zh-HK = Traditional Chinese from Hong Kong S.A.R.
zh-TW = Traditional Chinese from Taiwan
zh-YUE = Cantonese from Mainland China (Guangdong province)
CHEATING BEHAVIORS: cheating behaviors are strictly prohibited and will be
punished with immediate removal/ban from the project. The affected wordcount and the
additional cost of the rework will be deducted from the PO. Examples include (but are not
limited to):
• DO NOT use Machine Translation (no matter if edited or not) or any other input from MT engines (like Google Translate, Bing Translator, DeepL, etc.) to provide translations. All edits should come from human translators.
• DO NOT translate into the wrong target language or flavor (for example, do not translate into South American Spanish when the project only has Spain Spanish as the target language).
• DO NOT use any kind of script, automation, browser add-in, or other third-party tools.
• DO NOT retain the source and/or the target for translation memory purposes or any other use case.
• DO NOT apply random nonsensical changes to Translation2 just for the sake of providing it, for example in cases where Translation2 differs just in capitalization, punctuation, or a synonym, or when Translation2 contains an explanation of the Translation1 content, etc.
• DO NOT apply unreasonable omissions (like translating only the first part of the sentence).
• DO NOT mark error categories during review without fixing the translation. If there is an error, you must both select the error category and fix the target.
ANY OTHER cheating behavior arising from circumstances or scenarios not explicitly mentioned here that may jeopardize the project and the cooperation with the customer is also strictly prohibited.
FINAL EXPECTED QUALITY: HUMAN QUALITY. Translated strings should retain
the meaning of the source and express it as fluently as possible, as a native speaker of the
target language would do.
TEAM MEMBERS: translations will be provided by native speakers of the target language who are fluent in the source language; translations will be accurate and in compliance with the project translation guidelines below.
CONTENT: general news, newspapers, instant messages or comments on social networks, strings extracted from news programs, medical publications, finance and business documents, or content concerning sports, travel, shopping, etc. There is no further reference or context beyond the string itself. Focus on each string individually, and do not try to find how the strings are related to one another; most likely, they are not related at all.
PROJECT REFERENCE MATERIAL
Besides these guidelines, make sure to check all the reference material and instructions listed
in our dedicated Sharepoint site before starting the project. See links below. In case you do
not have access, please reach out to the Sharepoint manager Lu Zhang
([email protected]).
- Isaac Sharepoint site
- Oneforma instructions
- Style guides
- Trainings
- Documents
TRANSLATION GUIDELINES
TRANSLATION SET: depending on the project, the customer will require either 1 or 2 translations for the same source string. Project managers will inform the translation team whether 1 or 2 translations are required in each handoff.
TASKS WITH 2 TRANSLATIONS NEEDED: if 2 translations are required:
• They will be paid individually (if you provide both Translation1 and Translation2 for the same source string, the source string will be paid twice).
• Translation1 needs to be the most common translation.
• It is mandatory to provide 2 valid, different translations of the same source string. Both translations need to maintain the original intent using different words, structures, and expressions, which means that Translation1 cannot be the same as, or very similar to, Translation2. Make every attempt to provide good diversity in Translation2; do not just change the articles or make other minor changes. Example:
o T1: I’m Ralf.
o T2 (valid): My name is Ralf
o T2 (not valid): I am Ralf
• This diversity can come from using a different level of formality, translating into two sentences with different meanings (if the source text is ambiguous), using synonyms in the second translation, or rephrasing the sentence (syntactic difference). Please take all these possibilities for providing diversity into account and combine them. Do not always use the same criteria (for example, do not always rephrase).
• Do not try to add a new translation from a different variant, such as Swiss German for German (this would be a cheating behavior: wrong target language). Hints and tips on how to provide valid 2nd translations are posted in the section below.
• Pactera EDGE will conduct checks to detect how similar Translation1 and Translation2 are. If there is an indication that Translation1 and Translation2 are the same or too similar (only minimal variations applied), it will be considered a cheating behavior.
• There are cases where it is not possible to provide a valid Translation2, and in these cases Translation2 should be omitted. For example, when the source consists only of:
o a number or a price
o a URL
o an email address
o a proper name
o etc.
• If only Translation1 has been provided, Pactera EDGE will conduct checks to detect whether the lack of Translation2 is justified. If there is an indication that the lack of Translation2 is not justified, the user who failed to translate it will be penalized in the QA.
• If the task requires 2 translations but it is not possible to provide a valid Translation2, click the SECOND TRANSLATION NOT POSSIBLE button. This will indicate to the reviewers that there is only 1 possible translation for the provided source.
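Pactera EDGE's actual similarity check is not documented here; purely as an illustration of the idea, a naive screen could compare character overlap between the two translations and flag near-duplicates (the 0.9 threshold and the use of difflib are assumptions for this sketch):

```python
from difflib import SequenceMatcher

def too_similar(translation1, translation2, threshold=0.9):
    """Flag a Translation2 that barely differs from Translation1.

    Illustrative only: the real project-side check is not public,
    and the threshold here is an arbitrary assumption.
    """
    ratio = SequenceMatcher(None, translation1.lower(), translation2.lower()).ratio()
    return ratio >= threshold

print(too_similar("Monday!", "Monday!"))                     # -> True (identical)
print(too_similar("Monday!", "The first day of the week!"))  # -> False (genuinely different wording)
```

A real check would likely also normalize punctuation and whitespace before comparing, which is exactly the kind of minimal variation the guidelines call out as insufficient.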
TIPS ABOUT 2ND TRANSLATIONS: the 2nd translations should be considerably different. It’s not just a matter of changing the articles or the punctuation; you should really find a different way to say the same thing (without, of course, introducing mistranslations, omissions, etc.). Here are some tips on how to achieve this diversity in the 2nd translations.
• Different levels of FORMALITY: Due to their nature, some strings should have either an informal or a formal tone. For example, business reports will usually be formal, while tweets will usually be informal. Vulgar and offensive content will more likely have an informal tone too. This perception may change from language to language, and the golden rule is to always avoid using a level of formality that would be unnatural in the context of the sentence for your language. However, if the source context is ambiguous enough to let you play with different levels of formality, you can use this to vary Translation1 from Translation2. For example:
o Translation1: Votre enfant va impatiemment se rendre sur le site Internet www.loremipsum.com
o Translation2: Ton gamin s’éclatera sur www.loremipsum.com
• Different meaning for AMBIGUOUS sources: Sometimes the source text is ambiguous and can be interpreted in different ways. You can use Translation1 and Translation2 to provide 2 different possible meanings of the same ambiguous source string. For example:
o Source: Fred previously checked on this with the gas desk because Aquila indicated interest in buying out this contract.
Here it is not clear whether “Fred” and “Aquila” are two people or two companies, so you can:
o T1: translate as if they were both people’s names
o T2: translate as if they were both company names
• Different SYNTAX: Use different syntactic structures in Translation1 and Translation2. This doesn’t only mean changing the word order, but literally using different grammar/syntactic structures. For example:
o T1: I wish you all the best on your birthday
o T2: Wishing you a happy birthday
o T1: Quería saber si te gustaría ir a ver “American Beauty” con nosotros un día de estos
o T2: ¿Te apetecería ir a ver con nosotros “American Beauty” algún día?
• Different TERMINOLOGY: Terminology between Translation1 and Translation2 can be changed, but please make sure you don’t change just one word; otherwise they would be too similar. For example:
o Source: Manual coffee crafting competition (Atlanta)
o T1: Gara di preparazione manuale del caffè (Atlanta)
o T2: Competizione di Atlanta su come fare il caffè a mano
• Different GENDERS: If the source language doesn’t support a specific gender and it’s ambiguous, you can use different genders for Translation1 and Translation2. For example:
o Source: You are really funny.
o T1: Sei molto simpatica. → This is feminine
o T2: Ti trovo davvero simpatico. → This is masculine
UNACCEPTABLE 2ND TRANSLATIONS: examples of unacceptable 2nd translations, which will be considered cheating behaviors, include (but are not limited to):
• T2 explains the meaning of T1
o Source: Lunedì! | T1: Monday! | T2: The first day of the week!
• Abbreviation is the only difference
o Source: 귀찮게 하지마라 | T1: Don't bother me. | T2: Do not bother me.
• Punctuation is the only difference
o Source: 반탕 반탕 | T1: Half and half | T2: Half and half.
o Source: 몇살이야? | T1: How old are you? | T2: How old are you??
o Source: 너는 작성 안해? | T1: You don't write? | T2: You don't write??
o Source: 건전한 안마? | T1: healthy massage? | T2: healthy massage??
• T2 contains meaningless text/comments or punctuation only
o Source: Data-data yang telah didapatkan seperti data form pendaftaran. | T1: The data which has been obtained as registration form data. | T2: .
• No Translation2
o Source: Carlos Roy Fajarta · Rabu, 08 Desember 2021 - 03:36:00 WIB | T1: Carlos Roy Fajarat · Wednesday, 08 December 2021 - 03:36:00 WIB | T2: (none)
AMBIGUOUS/UNCLEAR SOURCE: if the context is clear, keep the same meaning as the source. But if the meaning of the source is ambiguous and the context is not clear, try performing several searches on the Web to find a possible context or meaning. A search for the exact snippet can lead you to the article or the whole conversation. If there is no way to clarify the meaning of the source, provide a translation based on your best judgment, even word by word if needed. If the project allows 2 translations for the same source, you can use them to provide 2 different interpretations of the source. For example, “Rat” can be translated as referring to the animal in Translation1 and with the meaning of “traitor” in Translation2.
ERRORS IN SOURCE: if the source contains grammar or spelling mistakes but the
meaning remains clear, please provide a fluent translation - do not replicate the
ungrammatical constructs or misspellings into the target language. We always expect
grammatically correct and correctly spelled translations. Also disambiguate abbreviations
(Ex. 'how r u -> cómo estás').
NONSENSICAL SOURCE: If source text is completely nonsensical, check the
corresponding “Bad source” checkbox in Oneforma and provide a literal word by word
translation. If only a part of the source is nonsensical, but the rest of the sentence is fine,
translate the nonsensical part literally and the rest of the sentence as you would normally do.
GARBLED SOURCE: If source text is completely garbled (a series of random characters
like: “jknjknkjnknknn”), check the corresponding “Bad source” checkbox in Oneforma and
move to the next string. The strings marked like this will not be considered in the project
wordcount. If only a part of the source is garbled, do not click the “Bad source” checkbox,
leave the garbled part as it is in source and translate the rest of the sentence as you would
normally do.
INCOMPLETE/TRUNCATED STRINGS: if a string is incomplete in source, do not try to
complete it in target; translate the part that appears in source with what would be the
equivalent part in your language. Do not try to guess what the rest of the string would be.
WRONG LANGUAGE IN SOURCE:
• If the input text is completely in another language (not the announced source language), check the corresponding “Source is not in the expected language” checkbox in Oneforma and skip the sentence. The strings marked like this will not be considered in the project wordcount. Example: you are working on an EN-CN task, but the source is “Datos de mercado no son rastreados”.
• If only a part of the source is in another language but the rest is fine, leave the words in the wrong language as they are and translate the rest of the sentence as you would normally do.
INCIDENTAL WORDS: when incidental words of cultural/linguistic courtesy or habit are
generally used in the source language but not in the target, or generally used in the target
language but not in the source, translators may add or delete such words to best convey the
same meaning and tone. For example, when translating a string that discusses a future event
from Arabic into English, sometimes it may be needed to delete the word “inshallah” rather
than literally translating it to more closely convey the meaning and tone.
IDIOMS: in case of idioms, proverbs, and expressions, always retain the meaning of the
source string in your translation. A literal word-by-word translation is not acceptable if the
meaning of the source string gets lost. Example: “Break a leg” → “In bocca al lupo”.
GRAMMAR AND SPELLING: we expect grammatically correct translations without spelling errors. A spell checker is integrated in Oneforma and available directly in the webapp. To launch the check, click the spell checker button in each string you want to check.
GENDER: if the source sentence has a specific gender, it should be translated into the same gender in the target. If the source language doesn’t support a specific gender and you need to choose one in the target language (e.g. for pronouns when translating from English into German, Spanish, etc.), choose a random gender. Do not always choose male or female.
STYLE AND FORMALITY: if the source sentence contains a level of formality, try to replicate the same level of formality/politeness in the target unless this would be perceived as unnatural by the target audience. If the source sentence does not convey a level of formality (e.g. English), translate with an informal style by default, unless this would be perceived as unnatural by the target audience (as usually happens with Japanese).
POLITE WORDS: if using ‘polite’ words (like 'please') is not common or expected in the
target, please omit them (this should not be considered an Omission error during review).
SWEAR WORDS/VULGAR CONTENT: do not omit any parts of the source content and
do not change the original tone of the sentence. If source contains swear words, politically
incorrect expressions, vulgar comments, etc. please keep it in target as well by conveying
the same meaning and tone. If you do not feel comfortable with the content of the string and
do not want to translate it, feel free to skip the hit.
NUMBERS: keep numbers as is (keep the “same style”). That means written-out numbers in
the source should be translated into written-out numbers in the target and Arabic numerals
(123) in the source should be transferred as Arabic numerals (123) in the target. Please stick
to this rule even if it might be unusual/rare in the target language. Example: 123 → 123; two
plus two equals four → zwei plus zwei ist gleich vier
THOUSAND/DECIMAL SEPARATOR: any formatting of numbers, e.g. thousand and decimal separators, needs to be modified to fit the target language. For example, in an English to German translation, “123,000” should be translated to “123.000”.
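As a minimal sketch of this separator swap for the English→German example (illustrative only; not part of the project tooling, and real text may need number detection first):

```python
def en_to_de_separators(number_text):
    # Swap the English thousand separator (,) and decimal separator (.)
    # to follow the German convention, e.g. "123,000.50" -> "123.000,50".
    return number_text.translate(str.maketrans(",.", ".,"))

print(en_to_de_separators("123,000"))   # -> 123.000
print(en_to_de_separators("1,234.56"))  # -> 1.234,56
```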
DATES: dates need to be formatted based on the target language but remaining close to the
source format as much as possible. For example, given English to French translation,
“02/10/2015” should be translated to “10/02/2015”.
TIME: time indication/hours need to be formatted based on the target language but
remaining close to the source format as much as possible. For example, given English to
Italian translation, “5:30 PM” should be translated to “17:30”.
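The date and time rules above can be sketched with Python's standard datetime formatting (illustrative only; the language pairs and format strings are assumptions taken from the examples given):

```python
from datetime import datetime

def en_to_fr_date(text):
    # US order MM/DD/YYYY -> French order DD/MM/YYYY
    return datetime.strptime(text, "%m/%d/%Y").strftime("%d/%m/%Y")

def en_to_it_time(text):
    # 12-hour clock with AM/PM -> 24-hour clock
    return datetime.strptime(text, "%I:%M %p").strftime("%H:%M")

print(en_to_fr_date("02/10/2015"))  # -> 10/02/2015
print(en_to_it_time("5:30 PM"))     # -> 17:30
```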
CURRENCIES: currency names should be translated (not converted). For example, given
English to Spanish translation, “8000 crowns” would be “8000 coronas”. “18 dollars” would
be “18 dólares”.
UNITS OF MEASUREMENT: never convert units, even if the target language uses a different measurement system. Translations with converted units are not acceptable. Example: 75 Fahrenheit should not change into 25 Celsius (75 F should be translated as 75 F). 6 inches should not change into 15.24 cm.
SYMBOLS: symbols can be switched if one variant is more common in the target language.
For example, given English to German translation, “$125 dinner” might be translated to “125
USD Abendessen”.
FORMATTING: The format of the source text should be maintained in the target languages
unless target language has a pre-defined syntax. Example: “123” should be translated as
“123” and not as “one hundred and twenty-three”. Example: “I am so happy!!!!!!!!!” should
be translated by maintaining the same number of “!” in the target.
PUNCTUATION: preserve source punctuation as much as possible, consistent with common usage in the target language. For example, if the source sentence includes several question marks, exclamation marks, dots, etc., please copy this punctuation as is into the target. Example: “@BetoORourke Well bless her heart…..” (copy all dots at the end of the sentence into the target). However, where the grammar and punctuation rules of the target language require different punctuation than the source language, the translation should use punctuation correctly for the target language. For example, even if a Spanish source string begins with the ¿ symbol, the English translation would not, because that symbol is not correct or commonly used in English.
PACTERA UPDATE: Because the Isaac instruction is vague, we have received many queries and doubts about this. After checking many cases and analysing different scenarios, we have decided to add the following clarification:
- Due to the nature of the instruction provided by the customer, if the source sentence is missing a final dot at the end, it is ok to omit the final dot in the translation as well. Reviewers can choose to correct it in the review stage to make it grammatically correct if they wish, but this should not be marked as an error. You should instead select the "Preferential change" option, which does not add any penalty and does not affect the translator's QA score.
CAPITALIZATION: preserve as appropriate.
PACTERA UPDATE: Because the Isaac instruction is very vague, we have received many queries and doubts about this. After checking many cases and analysing different scenarios, we have decided to go with the following guideline:
- If the source capitalization (for example, the first word of the sentence being lowercase instead of uppercase) does not have any impact on the meaning, and the sentence can be equally understood from the user's perspective, it is ok to follow it (leaving the initial word lowercase, for example). Reviewers can choose to correct it in the review stage to make it grammatically correct if they wish, but this should not be marked as an error. You should instead select the "Preferential change" option, which does not add any penalty and does not affect the translator's QA score.
MENTIONS AND HASHTAGS: keep @mentions and #hashtags as in source. Example: I
don't like @PapaJohn pizza #cardboard #evil → no me gusta la pizza de @PapaJohn
#cardboard #evil
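As an illustrative self-check (not an official project tool), one could verify that all @mentions and #hashtags survive translation unchanged:

```python
import re

def tags_preserved(source, target):
    # Collect @mentions and #hashtags; the rule above says they must
    # appear unchanged in the target translation.
    pattern = r"[@#]\w+"
    return sorted(re.findall(pattern, source)) == sorted(re.findall(pattern, target))

print(tags_preserved("I don't like @PapaJohn pizza #cardboard #evil",
                     "no me gusta la pizza de @PapaJohn #cardboard #evil"))  # -> True
```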
EMOJIS: copy emojis directly from the source into the target. Do not change them, even if they are incomplete (like :-‘). If the source sentence uses an emoji to replace a word that is essential to the sentence meaning, then translate with the inferred word (without omitting the emoji). Example: “The 🍎 is red” should be translated as if it were “The apple is red”.
BUSINESS AND PRODUCT NAMES: keep business names and product names as-is with
proper capitalization. Examples: Facebook → Facebook, ebay → eBay. Exception: if a
company/product has been officially marketed in another country under a different name, you
should use the official name for the target country. Examples: “Diet Coke” would be “Coca-Cola Light” in Italian; “Algida” would be “Frigo”, “Miko”, or “Eskimo” depending on the
target country.
PROPER NAMES: proper names of people should NOT be changed unless they are
historical names widely recognized in each language. Example: Columbus > Colón.
Transliteration can be used as long as the names are the same as in source. Example for
Serbian: "Shakespeare" > "Šekspir". Other names (e.g. names of places and languages)
should be translated when the name is rendered differently in the source and target languages.
For example, Warsaw in Polish should be Warszawa, Cologne in German should be Köln and
Spain in Spanish should be España.
APP NAMES: keep well known app names as in source if there is no official translation.
Example: Skype → Skype
NAMES OF LAWS, ACTS, ORGANIZATIONS: keep as in source the names of laws,
acts, organizations, etc. that belong to a specific country. Example: 'Defense of Marriage
Act'. For international organizations or international documents, i.e., 'Universal Declaration
of Human Rights', please use the standard and approved translations for the target.
MOVIE/BOOK/SONG/SHOW TITLES: if there is no official/culturally relevant
translation for the title, keep the one in the source. If there is an official translation or
adaptation that is more familiar to the target locale, please translate it. Example: “Harry
Potter and the Philosopher’s Stone” should be translated into French as “Harry Potter à
l'école des sorciers” because that was the title of the French version of the same book.
“Twinkle, twinkle, little star” might be translated into Korean using the corresponding
Korean-language song “반짝 반짝 작은 별”.
QUERIES: if you have linguistic queries or doubts about the instructions, raise a ticket in our Query Management System, Global Query (C837-Isaac projects). If you do not have an account or you cannot remember your username, reach out to Zian Wang ([email protected]). If you don’t remember the password, you can reset it from the login page. If you have queries about workflow, rates, or payments, or want to report a technical bug (not related to linguistic issues), reach out to the project managers by email: [email protected]. Do not raise tickets in Global Query for non-linguistic topics. All answers provided outside this tool and by third parties/companies are neither official nor reliable for this project.
WEBAPP GUIDELINES
DESK ACCESS: go to https://desk.oneforma.com/ and login with your OneForma
credentials. In "My Tasks" there is a list of webapps that have been assigned to you.
WEBAPP ACCESS: For translation, go to Actions > Start. For review, go to QA actions > QA.
RADIO BUTTONS: currently 4 radio buttons are available in the webapp:
• The source is not in the expected language: If you click this option, the translation textbox will become grey and you won’t be able to provide a translation. You will just need to click submit. This checkbox should be used for all cases where the source string is entirely in the wrong language, for example you are working on an EN-FR webapp and the source is completely in Chinese. On the contrary, if only a part of the source is in another language but the rest is fine, do not use this radio button. Just leave the words in the wrong language as they are and translate the rest of the sentence as you would normally do.
• Bad Source: completely garbled: If you click this option, the translation textbox will become grey and you won’t be able to provide a translation. You will just need to click submit. This checkbox should be used for all cases where the source consists entirely of a series of random characters, like “nfhjsfbsjknfjksanfjkahnuifji”. On the contrary, if only a part of the source is garbled, do not use this radio button. Just leave the garbled part as it is and translate the rest of the sentence as you would normally do.
• Bad Source: completely nonsensical: If you click this option, you need to provide a translation word by word. Otherwise, you won’t be able to submit the string. This checkbox should be used for all cases where the source text is in the expected language and is not a series of random characters, but still does not make any sense. For example, “Tomorrow like flower car in the pool”. On the contrary, if only a part of the source is nonsensical, but the rest of the sentence is fine, do not use this radio button. Translate the nonsensical part literally and the rest of the sentence as you would normally do.
• Source has typos or spelling errors: If you click this option, you still need to provide a translation of the source. This checkbox should be used in all cases where the source contains typos or spelling mistakes that are clearly unintentional. Translate the sentence as you would normally do and correct any errors you see in the source in your translation, since your translation must be free of typos/spelling mistakes.
To revert any of these selections, click the “clear selection” button.
NOTE: if the source string just contains a symbol or punctuation like “>” or “[[ ]]”, this does not fall into any of the above scenarios. It should be left as in the source for T1, and T2 should be omitted.
HT WEBAPP FOR TRANSLATORS: for projects that require 1 translation, you will see only one textbox per source string. For projects that require 2 translations, you will still see only one textbox per source string, so the same hit can be translated by two different resources. The first time you translate a hit, you just need to provide Translation1 in the Proposed translation textbox.
If you translate a sentence that has been previously translated by another resource or by yourself (because you can translate one sentence and after some time get the same source to translate again), you need to provide Translation2 (if possible). In this case the Translation1 from the first translator will be blocked and can only be used for reference. You need to provide Translation2 in the Proposed translation textbox.
In line with the general guidelines, it is forbidden to provide an identical or very similar Translation2: you won’t be able to submit it, and you will get a pop-up message warning you about it.
HT WEBAPP FOR REVIEWERS: for projects that require 2 translations, you will see 2
textboxes for the same source, the upper one with Translation1 and the lower one with
Translation2 (which may be empty if the second translator didn’t provide a Translation2).
Read the source carefully, review the 2 different translations following the provided linguistic
guidelines and implement any change (if needed) in the corresponding text box. If the
reviewed translations (either 1, 2 or both) have errors, select the applicable error category
from the drop down menu. Error categories that can be selected are:
Accuracy-Mistranslation: the target content does not accurately reflect the source content, or project instructions have not been followed.
Accuracy-Omission/Addition: the target content includes something that is not present in the source, or content present in the source is missing from the translation. This also includes missing or wrong emojis.
Language-Spelling: issues related to typos and wrong orthography of words.
Language-Grammar/Syntax: issues related to the grammar or syntax of the text.
Language-Capitalisation: issues related to capitalization of words.
Language-Punctuation: issues related to punctuation and spacing.
Country-Format: content uses the wrong format for currency, date/time, addresses, measurements, telephone numbers, and other locale-specific conventions.
Terminology-Context: a term is misused within a sentence or improper for the provided context.
Readability-Style/Tone: the text does not sound natural/fluent, uses the wrong tone (formal vs. informal), is not idiomatic, or violates the provided style guidelines.
Preferential Edit: the original translation is equally correct and the edit is purely preferential. The translator is not going to be penalized in the QA score for this.
Critical-Wrong target language/country: the language used in the target translation is not the expected one (example: target is in Dutch instead of German), or the text is written for the wrong country (target is in Latin American Spanish instead of Spanish from Spain; target is in Canadian French instead of French from France; etc.). This error category has a huge penalization, so please make sure to select it only when applicable, in order not to unfairly penalize the translator.
Critical-Cheating behaviour: any severe cheating behavior that goes against the project guidelines or the signed SOW (examples include but are not limited to: systematically adding nonsensical synonyms just for the sake of providing a Translation2; adding random spaces between words just for the sake of providing a Translation2; using DeepL or any other MT output, no matter if edited or not; changing the punctuation nonsensically just for the sake of providing a Translation2; etc.). This error category has a huge penalization, so please make sure to select it only when applicable, in order not to unfairly penalize the translator.
For cases where the reviewer works on a 2nd round of QA (when additional hits from underperforming translators are sent out to QA), only the translation box coming from the underperforming translator will be editable for review. For example, if only Translation1 needs review, the “Alternative translation provided” is the Translation2 coming from a translator who doesn’t need to be reviewed. You then have to review only Translation1, considering that it will be associated with the indicated Translation2 (don’t change it to something too similar, or the system will not allow you to submit).
Likewise, if only Translation2 needs review, the “Alternative translation provided” is the Translation1 coming from a translator who doesn’t need to be reviewed. You then have to review only Translation2, considering that it will be associated with the indicated Translation1 (don’t change it to something too similar, or the system will not allow you to submit).
OTHER BUTTONS: When you are done and are sure about the proposed translation, click Submit to send it to the system. If a sentence contains adult content, insults, or other material you are not comfortable with, click SKIP HIT to have it assigned to another translator/reviewer. Skipped hits will not be paid.
If the task requires 2 translations but it is not possible to provide a Translation2, click the
SECOND TRANSLATION NOT POSSIBLE button. This will indicate to the reviewers
that there is only 1 possible translation for the provided source.
When you have translated all the strings and/or you want to exit the task, click CLOSE in the
upper right corner of the screen (next to your username) to go back to the My Task
dashboard.
A character map is available, which you can use to type into the translation box any uncommon character not available on your keyboard.
GENERAL QUALITY GUIDELINES
CLIENT GOAL AND COST RECOVERY: The client goal is 95% error-free content (no corrections required). If there is any complaint about poor quality, or a rejection from the customer, a rework will be arranged. If the underperforming translator/reviewer is asked to implement the changes required by the customer himself/herself, the required fixes will be done for free. If the feedback implementation is assigned to another linguist, the affected wordcount/additional cost of the rework will be compensated and deducted from the PO of the underperforming users.
QA CHECKS: General and task-specific quality checks will be performed by the reviewers
and the Pactera EDGE team before, during, and after the delivery to the customer. This is to
monitor the general translation/review quality and to identify cheaters or poor performers. No
matter in which stage of the process issues are detected, QA and anti-fraud protocols will
apply to all users (reviewers included).
PROTOCOL FOR MANAGING PERFORMANCE

Quality Score    Discount in the PO
>= 80%           None
60% – 79.99%     30% discount
0% – 59.99%      60% discount
< 0%             100% discount
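The discount schedule above is a simple threshold lookup. As an illustrative sketch only (the official protocol is the table itself, and Pactera EDGE may adjust it at any time), the function name and return convention below are hypothetical:

```python
def po_discount(quality_score: float) -> float:
    """Return the PO discount as a fraction for a given quality score (in percent).

    Illustrative sketch of the schedule above; not an official tool.
    """
    if quality_score >= 80:
        return 0.0   # >= 80%: no discount
    if quality_score >= 60:
        return 0.30  # 60% - 79.99%: 30% discount
    if quality_score >= 0:
        return 0.60  # 0% - 59.99%: 60% discount
    return 1.0       # below 0%: 100% discount
```

For example, a score of 79.99% falls in the second band and yields a 30% discount.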
If any reviewer unfairly penalizes any translator by applying any error category improperly,
the penalization will be removed from the translator performance report and applied to the
reviewer himself/herself.
QUALITY GROUPS: According to each translator's global quality score, there will be 3 different quality groups:
• Gold: global quality score >= 95%
• Silver: global quality score between 80% and 94.99%
• Bronze: global quality score between 70% and 79.99%
Additionally:
• Bronze users could be temporarily deactivated for further quality checks and assessments.
• Users with a global quality score below 70% will be removed from the pool for poor quality performance.
• Users with negative scores (below 0%) will be considered cheaters, and the anti-fraud protocol will apply.
• Gold suppliers will have higher priority over the rest during job assignments.
• Reviewers with a global quality score lower than 90% in Review webapps will be removed from the review pool for poor quality performance.
This is a general framework, and it is for reference only. Pactera EDGE is free to adapt and
adjust it at any time depending on customer specific requests and project specific needs. For
extraordinary circumstances or other scenarios not explicitly mentioned here, Pactera EDGE
is free to apply any necessary quality management strategy (including user removal, user
deactivation, and PO discounts) depending on the case.
QUALITY CRITERIA
ACCURACY: The translated text represents the meaning of the source text concepts and conveys them correctly. A 1-to-1 correspondence will exist between the source and target texts: all target strings are required to accurately convey the meaning of the corresponding source string. Retain the meaning of the source string in translation. A literal word-by-word translation is not acceptable if the meaning of the source string gets lost. In this context, idioms such as "he was beating around the bush" or "I'm just pulling your leg" can be especially challenging to translate, and a literal word-by-word translation is not acceptable.
LANGUAGE MECHANICS: Language is based on the standard grammar, syntax rules, and conventions of the target language, with correct spelling and punctuation and no typing errors.
LANGUAGE NATURALNESS AND HUMAN FINAL QUALITY: The language must be as natural and fluent as if written by a native speaker. Machine translation is forbidden in all cases.
VIEW PERFORMANCE AND COUNTER-FEEDBACK
FOR TRANSLATORS: It is essential that translators check their translation performance in Oneforma to learn from their errors and let us know if they disagree with any of the received feedback. From the "My Tasks" page in Oneforma, go to the webapp you want to check the feedback for and click Actions > View performance. The Performance page opens. In the upper part, called My Performance, you can find:
• Quality score: the overall quality score you got for all the work in that webapp
• Translations Reviewed: the number of your translations that have been reviewed in that specific webapp
In the lower part, called Translation feedback, you can find a table listing all the strings where reviewers have found mistakes and/or made edits.
You can use this to compare your original translation with the one submitted by the reviewer. You can also see the error category that has been assigned to each edit. If you agree with the reviewer, learn from the feedback and click the Agree button. If you disagree with the correction made by the reviewer, click Disagree. A text box will open where you can send us a comment and explain why you disagree with the reviewer. Click Send counterfeedback to share the comment with the Pactera EDGE team. This will be arbitrated by a third party, who will decide whether the error should be removed from your report. NOTE: make sure your comments in the counter-feedback box are professional and polite, precise and concise.
NOTE: "Preferential Edit" DOES NOT penalize the QA score. It has no impact on your QA
result. It is just an alternative translation for your reference. If both translations are equally
correct, click Agree. If there is a mistake in the review, click Disagree and let us know what
the error in the review is, so we can penalize the reviewer if the arbitrator agrees.
If the arbitrator agrees with the reviewer and the error is confirmed, you will see a comment
with the final arbitration under the “comment from arbitrator” column. If on the contrary
we agree with you and decide to remove the error from your report, it will disappear from the
list. Score will always be updated automatically every time an arbitration is done.
FOR REVIEWERS: It is essential that reviewers check their review performance in Oneforma to learn from their errors and see in which cases they have unfairly penalized a translator or performed the review incorrectly. This is extremely important because reviewers are the last step before delivery to the customer, and we need to make sure only qualified reviewers are working in the QA webapps. From the "My Tasks" page in Oneforma, go to the webapp you want to check the QA feedback for and click QA Actions > QA View performance. The Performance page opens. In the upper part, called My Performance, you can find:
• Quality score: the overall quality score you got for all the work in that webapp
• Translations Reviewed: the number of your translations that have been reviewed in that specific webapp
In the lower part, called Translation feedback, you can find a table listing all the strings where arbitrators have found mistakes in strings you reviewed and/or where you misjudged and assigned the wrong error category to a translator.
• "Error category from you to translator" is the error category you originally assigned to the translator and for which the translator complained.
• "Error category from arbitrator to translator" is the error category the arbitrator decided to assign to the translator after the complaint about your review. If this doesn't match what appears in the previous column, it means the arbitrator thought your review was not fair or the category you used was not correct.
• "Error category from arbitrator to you" is the error category the arbitrator assigned to you after evaluating the counter-feedback. This can be because you made a mistake in the reviewed string or because you unfairly penalized the translator in the first place.
• "Counterfeedback" contains the original complaint we received from the translator.
• "Arbitration" is the final decision made by the arbitrator.
VERSION UPDATES

Date        Version  Author            Comment
15/12/2020  1.7      Pactera EDGE LQM  Removed the below sentence from the Accuracy paragraph: "Please provide translation 2 only if it is fairly common. If there is no fairly common translation, please omit the second translation."
03/03/2021  1.8      Pactera EDGE LQM  Added clarifications about Punctuation and Capitalization instructions
02/03/2022  1.9      Pactera EDGE LQM  Complete revamp of the instructions document based on new HT webapps
08/03/2022  2.1      Pactera EDGE LQM  Table of contents added and examples about radio buttons
10/02/2022  2.2      Pactera EDGE LQM  Reformulation of ambiguous sentence
17/03/2022  2.3      Pactera EDGE LQM  Instruction about Time added
14/06/2022  2.4      Pactera EDGE LQM  Added radio button for spelling mistakes + added button for 2nd translation not possible + query contact changed