Open Access
Higher certainty of the laser-induced damage threshold test with a redistributing data treatment
Author(s) -
Lars Jensen,
Marius Mrohs,
Mark Gyamfi,
H. Mädebach,
Detlev Ristau
Publication year - 2015
Publication title -
Review of Scientific Instruments
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.605
H-Index - 165
eISSN - 1089-7623
pISSN - 0034-6748
DOI - 10.1063/1.4932617
Subject(s) - computer science , test data , calibration , observational error , statistics , scale (ratio) , experimental data , test (biology) , reliability engineering , mathematics , physics , engineering , programming language , paleontology , quantum mechanics , biology
As a consequence of its statistical nature, the measurement of the laser-induced damage threshold always carries a risk of over- or underestimating the real threshold value. For the established S-on-1 (and 1-on-1) measurement procedures outlined in the corresponding standard ISO 21254, the results depend on the number of data points and their distribution over the fluence scale. Given the limited space on a test sample as well as the requirements on test-site separation and beam size, the amount of data obtainable from a single test is restricted. This paper reports on a way to treat damage test data that reduces the statistical error and therefore the measurement uncertainty. Three simple assumptions allow one data point to be assigned to multiple data bins and thereby virtually increase the available database.
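The abstract does not spell out the three assumptions, so the sketch below is only a plausible illustration of the redistribution idea, not the authors' exact method. It assumes two common monotonicity arguments from damage-probability evaluation: a site that damaged at fluence F would also have damaged at any higher fluence, and a site that survived at F would also have survived at any lower fluence. Under these assumptions, each test site contributes counts to several fluence bins instead of one. All fluence values, bin widths, and the `damage_probability` helper are hypothetical.

```python
import numpy as np

# Hypothetical 1-on-1 test data: (fluence in J/cm^2, damaged?) per test site.
# Values are illustrative only, not from the paper.
data = [
    (2.0, False), (2.5, False), (3.0, False), (3.0, True),
    (3.5, True), (3.5, False), (4.0, True), (4.5, True),
]

BIN_WIDTH = 0.5
bins = np.arange(2.0, 5.0, BIN_WIDTH)  # lower edges of the fluence bins


def damage_probability(data, bins, redistribute=False):
    """Return the fraction of damaged sites per fluence bin.

    With redistribute=True, each data point is assigned to multiple bins:
    a damage event at fluence F also counts as damage in every bin above F,
    and a survival at F also counts as survival in every bin below F.
    """
    probs = []
    for lo in bins:
        hi = lo + BIN_WIDTH
        dmg = surv = 0
        for f, damaged in data:
            if redistribute:
                if damaged and f < hi:        # damage extends to higher bins
                    dmg += 1
                elif not damaged and f >= lo:  # survival extends to lower bins
                    surv += 1
            elif lo <= f < hi:                # conventional: one point, one bin
                dmg += damaged
                surv += not damaged
        n = dmg + surv
        probs.append(dmg / n if n else np.nan)
    return probs
```

With the redistribution enabled, every bin is populated by several sites even though only one or two were actually irradiated in its fluence range, and the resulting damage-probability curve is monotonically non-decreasing by construction, which is the sense in which the data base is "virtually" enlarged.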
