Open Access
SLIP: A Sanskrit Linguistic Intelligence Pipeline for Enhanced Neural Machine Translation of Classical Texts
Author(s) - N Biraja Isac, Himansu Das
Publication year - 2025
Publication title - IEEE Access
Language(s) - English
Resource type - Magazines
SCImago Journal Rank - 0.587
H-Index - 127
eISSN - 2169-3536
DOI - 10.1109/access.2025.3613597
Subject(s) - aerospace, bioengineering, communication, networking and broadcast technologies, components, circuits, devices and systems, computing and processing, engineered materials, dielectrics and plasmas, engineering profession, fields, waves and electromagnetics, general topics for engineers, geoscience, nuclear engineering, photonics and electrooptics, power, energy and industry applications, robotics and control systems, signal processing and analysis, transportation
The translation of classical Sanskrit texts presents unique challenges due to the language's complex morpho-phonological structure, rich grammatical system, and cultural-philosophical depth. This paper introduces the Sanskrit Linguistic Intelligence Pipeline (SLIP), a comprehensive framework that integrates traditional Sanskrit grammatical analysis with modern neural machine translation architectures to enhance translation quality for classical texts. SLIP incorporates morphological feature extraction (verb tenses, case endings, prefixes), phonological analysis (vowel ratios, syllable patterns, sandhi rules), and structural processing (compound word detection, sentence analysis) specifically designed for Sanskrit linguistic characteristics. The paper evaluates SLIP across multiple neural architectures, including enhanced transformers, MBART, and retrieval-augmented generation (RAG) systems, using diverse evaluation metrics (BLEU, ROUGE, METEOR, BERTScore) on Bhagavad Gita verses spanning different thematic categories. The results demonstrate substantial improvements: SLIP-enhanced transformers achieve a statistically significant 10.20% relative improvement in BERT F1-score (57.63% to 63.51%), while RAG integration yields gains of up to 41.8% in BLEU-4 scores. Comparative analysis reveals that domain-specific linguistic intelligence significantly outperforms generic large language models: enhanced transformers show a +5.88-point BERT F1 gain, versus a 7.12-point decline for MBART. Thematic analysis indicates variable effectiveness across philosophical concepts, with austerity practices showing the highest improvement (+7.23%) compared to food classifications (+4.73%).
These findings demonstrate that Sanskrit-specific morpho-phonological feature engineering provides superior translation quality compared to generic multilingual approaches, offering a paradigm for preserving semantic accuracy and cultural nuances in automated translation of ancient languages. The SLIP framework addresses critical limitations in existing approaches and provides a foundation for broader digitization efforts of classical Indian literature.

