
Publications of year 2022
Articles in journals and book chapters
  1. Ergun Biçici. Machine Translation Performance Prediction System: Optimal Prediction for Optimal Translation. Springer Nature Computer Science, 3, 2022. ISSN: 2661-8907. [doi:10.1007/s42979-022-01183-0] Keyword(s): Machine Learning, Machine Translation.
    Abstract:
    The machine translation performance prediction (MTPP) system (MTPPS) is an automatic, accurate prediction model that is independent of the language and of the natural language processing (NLP) output. MTPPS is optimal in that it can predict translation performance using only the source text, without using the translation at all, bypassing MT model complexity. MTPPS has been cast for tasks involving text similarity in machine translation (MT), semantic similarity, and sentence parsing. We present large-scale modeling and prediction experiments on the MTPP dataset (MTPPDAT), covering 3800 document-level and 380000 sentence-level predictions in 7 different domains using 3800 different MT systems. We provide theoretical and experimental results, empirical lower and upper bounds on the prediction tasks, rank the features used, and present current results. We show that only 57 labeled instances at the document level and 17 at the sentence level are needed to reach the current prediction results. MTPPS achieves a 4% error rate at the document level and 45% at the sentence level relative to the magnitude of the target, which is 61% and 27% better than a mean predictor, respectively, and 40% better than the nearest-neighbor baseline. Referential translation machines use MTPPS and achieve top results.

    @article{Bicici:MTPPS:SNCS2022,
    author = {Ergun Bi\c{c}ici},
    title = {Machine Translation Performance Prediction System: Optimal Prediction for Optimal Translation},
    journal = {Springer Nature Computer Science},
    year = {2022},
    volume = {3},
    issue = {4},
    issn = {2661-8907},
    doi = {10.1007/s42979-022-01183-0},
    keywords = {Machine Learning, Machine Translation},
    abstract = {Machine translation performance prediction (MTPP) system (MTPPS) is an automatic, accurate, language and natural language processing (NLP) output independent prediction model. MTPPS is optimal by the capability to predict translation performance without even using the translation by using only the source, bypassing MT model complexity. MTPPS was cast for tasks involving similarity of text in machine translation (MT), semantic similarity, and parsing of sentences. We present large scale modeling and prediction experiments on MTPP dataset (MTPPDAT) covering $3800$ document- and $380000$ sentence-level predictions in $7$ different domains using $3800$ different MT systems. We provide theoretical and experimental results, empirical lower and upper bounds on the prediction tasks, rank the features used, and present current results. We show that we only need $57$ labeled instances at the document-level and $17$ at the sentence-level to reach current prediction results. MTPPS achieves $4\%$ error rate at the document-level and $45\%$ at the sentence-level relative to the magnitude of the target, $61\%$ and $27\%$ better than a mean predictor respectively, and $40\%$ better than the nearest neighbor baseline. Referential translation machines use MTPPS and achieve top results.},
    }
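
    A note on the metrics in the abstract above: the "error rate relative to the magnitude of the target" and the "relative improvement over a mean predictor" can be illustrated with a short sketch. The Python snippet below is only an assumed, simplified illustration of such measures, not the paper's exact formulation; the function names and the synthetic scores are invented for this example.

    # Illustrative sketch only: assumed, simplified metrics, not the paper's exact formulation.
    import numpy as np

    def relative_error(y_true, y_pred):
        """Mean absolute error scaled by the mean magnitude of the target."""
        y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
        return np.mean(np.abs(y_true - y_pred)) / np.mean(np.abs(y_true))

    def improvement_over_mean(y_true, y_pred):
        """Relative reduction in error compared to always predicting the mean target."""
        y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
        mae_model = np.mean(np.abs(y_true - y_pred))
        mae_mean = np.mean(np.abs(y_true - y_true.mean()))
        return 1.0 - mae_model / mae_mean

    # Synthetic document-level quality scores, made up for this illustration.
    y_true = np.array([0.42, 0.55, 0.61, 0.48, 0.70])
    y_pred = np.array([0.40, 0.57, 0.60, 0.50, 0.66])
    print(f"relative error: {relative_error(y_true, y_pred):.1%}")
    print(f"improvement over mean predictor: {improvement_over_mean(y_true, y_pred):.1%}")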
    







Disclaimer:

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.

The documents contained in these directories are made available by the contributing authors to ensure the timely dissemination of scholarly and technical work on a non-commercial basis. Copyright and all other rights are retained by the authors and by the copyright holders, notwithstanding that they present their works here in electronic form. Persons copying this information must adhere to the terms and constraints covered by each author's copyright. These works may not be made available elsewhere without the explicit permission of the copyright holder.




Last modified: Sun Feb 5 17:37:19 2023
Author: ebicici.


This document was translated from BibTeX by bibtex2html