The cost of lexical analysis
Author(s) - Waite, W. M.
Publication year - 1986
Publication title - Software: Practice and Experience
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.437
H-Index - 70
eISSN - 1097-024X
pISSN - 0038-0644
DOI - 10.1002/spe.4380160508
Subject(s) - computer science , compiler , lexical analysis , programming language , software
Abstract - This paper examines a common design for a lexical analyser and its supporting modules. An implementation of the design was tuned to produce the best possible performance. In effect, many of the optimizations that one would expect of a production‐quality compiler were carried out by hand. After measuring the cost of tokenizing two large programs with this version, the code was ‘detuned’ to remove specific optimizations and the measurements were repeated. In all cases, the basic algorithm was unchanged, so that the difference in cost is an indication of the effectiveness of the optimization. Comparisons were also made with a tool‐generated lexical analyser for the same task. On the basis of the measurements, several specific design and optimization strategies are recommended. These recommendations are also valid for software other than lexical analysers.
