A novel large‐signal FET model considering trapping‐induced dispersions
Author(s) - Yuan Ye, Zhong Zheng, Guo Yongxin, Mu Shanxiang
Publication year - 2019
Publication title - International Journal of Numerical Modelling: Electronic Networks, Devices and Fields
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.249
H-Index - 30
eISSN - 1099-1204
pISSN - 0894-3370
DOI - 10.1002/jnm.2552
Subject(s) - high electron mobility transistor , large-signal modelling , interpolation , artificial neural network , trapping , electronic engineering , transistor , electrical engineering , voltage , computer science , algorithm , materials science , engineering
A novel large‐signal model construction technique is proposed in this paper. High‐order current and charge sources with additional dimensions, derived from pulsed small‐signal measurement data, are used to describe the dispersive behaviour of FETs. The added dimensions effectively account for trapping‐induced dispersions. Instead of look‐up tables, empirical functions and artificial neural networks are adopted to implement the model in circuit simulators, which reduces the model size, avoids the oscillations that interpolation can cause, and provides better prediction beyond the range of the measurement data. The validity of the proposed modelling technique has been verified on a 2 × 75 μm GaAs pHEMT. The technique can be readily extended to GaN devices with a similar procedure.
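To illustrate the core idea of replacing a multi-dimensional look-up table with a neural-network fit of a current source that carries extra trap-related dimensions, the following is a minimal sketch. It is not the paper's implementation: the synthetic I-V expression, the use of instantaneous bias (vgs, vds) plus quiescent bias (vgsq, vdsq) as the "additional dimensions", and the choice of scikit-learn's MLPRegressor are all assumptions made for illustration.

```python
# Minimal sketch (assumptions: synthetic stand-in for pulsed measurement data,
# quiescent bias as the extra trap dimensions, MLPRegressor as the ANN).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def synthetic_pulsed_ids(vgs, vds, vgsq, vdsq):
    """Placeholder drain current with a crude trap-dependent threshold shift,
    standing in for pulsed-measurement data (illustrative only)."""
    vth = -0.8 + 0.05 * vgsq + 0.02 * vdsq      # trap state shifts threshold
    vov = np.maximum(vgs - vth, 0.0)
    return 0.2 * vov**2 * np.tanh(3.0 * vds)    # simple saturating I-V shape

# "Measurement" grid: instantaneous bias plus quiescent (trap-setting) bias.
n = 5000
vgs  = rng.uniform(-1.5, 0.5, n)
vds  = rng.uniform(0.0, 6.0, n)
vgsq = rng.uniform(-1.5, 0.5, n)
vdsq = rng.uniform(0.0, 6.0, n)
X = np.column_stack([vgs, vds, vgsq, vdsq])     # 4-D input: two extra trap dimensions
y = synthetic_pulsed_ids(vgs, vds, vgsq, vdsq)

# The ANN replaces a 4-D look-up table: it is compact, smooth (no interpolation
# kinks that can cause convergence oscillation), and extrapolates more gracefully.
ann = MLPRegressor(hidden_layer_sizes=(16, 16), activation="tanh",
                   max_iter=5000, random_state=0)
ann.fit(X, y)

# Evaluate the fitted current source at an arbitrary bias/trap-state point.
print(ann.predict([[-0.2, 4.0, -1.0, 5.0]]))
```

In a simulator the fitted network would be evaluated analytically (weights and tanh activations exported as an empirical expression), which is what allows the model to stay small and differentiable compared with table interpolation.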