A Bayesian mixture model for differential gene expression
Author(s) - Kim-Anh Do, Peter Müller, Feng Tang
Publication year - 2005
Publication title - Journal of the Royal Statistical Society: Series C (Applied Statistics)
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.205
H-Index - 72
eISSN - 1467-9876
pISSN - 0035-9254
DOI - 10.1111/j.1467-9876.2005.05593.x
Subject(s) - Bayes' theorem, inference, Bayesian inference, nonparametric statistics, Bayesian probability, computer science, posterior probability, statistical inference, Bayes factor, artificial intelligence, prior probability, machine learning, mathematics, statistics
Summary. We propose model‐based inference for differential gene expression, using a nonparametric Bayesian probability model for the distribution of gene intensities under various conditions. The probability model is a mixture of normal distributions. The resulting inference is similar to a popular empirical Bayes approach that is used for the same inference problem. The use of fully model‐based inference mitigates some of the necessary limitations of the empirical Bayes method. We argue that inference is no more difficult than posterior simulation in traditional nonparametric mixture‐of‐normal models. The proposed approach is motivated by a microarray experiment that was carried out to identify genes that are differentially expressed between normal tissue and colon cancer tissue samples. Additionally, we carried out a small simulation study to verify the proposed methods. In the motivating case‐study we show how the nonparametric Bayes approach facilitates the evaluation of posterior expected false discovery rates. We also show how inference can proceed even in the absence of a null sample of known non‐differentially expressed scores. This highlights the difference from alternative empirical Bayes approaches that are based on plug‐in estimates.
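To illustrate the kind of inference the summary describes, the sketch below fits a simplified stand-in for the paper's model: a two-component normal mixture for gene-level difference scores (a "null" and a "differential" component), sampled by a short Gibbs sampler, from which posterior probabilities of differential expression and a posterior expected false discovery rate follow directly. This is not the authors' nonparametric (Dirichlet process) mixture; the function names, priors, and simulated data are illustrative assumptions only.

```python
# Minimal sketch (NOT the paper's nonparametric model): two-component normal
# mixture for difference scores z_i, fitted by Gibbs sampling. Component 0 is
# the null (mean pinned at 0), component 1 the differentially expressed genes.
import numpy as np

rng = np.random.default_rng(0)

def norm_pdf(x, m, v):
    return np.exp(-0.5 * (x - m) ** 2 / v) / np.sqrt(2 * np.pi * v)

def gibbs_mixture(z, n_iter=2000, burn_in=500):
    """Return the posterior probability that each gene is differentially expressed."""
    n = len(z)
    pi1 = 0.2                                   # weight of the 'differential' component
    mu = np.array([0.0, np.quantile(z, 0.95)])  # null mean fixed at 0
    sigma2 = np.array([np.var(z), np.var(z)])
    prob_diff = np.zeros(n)
    kept = 0
    for it in range(n_iter):
        # 1. Posterior probability of the 'differential' component, then sample labels.
        lik0 = (1 - pi1) * norm_pdf(z, mu[0], sigma2[0])
        lik1 = pi1 * norm_pdf(z, mu[1], sigma2[1])
        p1 = lik1 / (lik0 + lik1)
        c = rng.random(n) < p1                  # True -> differential
        # 2. Mixture weight from its Beta full conditional (Beta(1, 1) prior).
        n1 = c.sum()
        pi1 = rng.beta(1 + n1, 1 + n - n1)
        # 3. Component parameters; the null mean stays at 0 to keep labels identified.
        for k, idx in enumerate([~c, c]):
            zk = z[idx]
            nk = len(zk)
            if k == 1 and nk > 0:               # 'differential' mean, N(0, 10^2) prior
                post_var = 1.0 / (nk / sigma2[1] + 1.0 / 100.0)
                mu[1] = rng.normal(post_var * zk.sum() / sigma2[1], np.sqrt(post_var))
            ss = ((zk - mu[k]) ** 2).sum()
            # inverse-gamma(1, 1) prior on each variance
            sigma2[k] = 1.0 / rng.gamma(1.0 + nk / 2.0, 1.0 / (1.0 + ss / 2.0))
        if it >= burn_in:
            prob_diff += p1
            kept += 1
    return prob_diff / kept

def posterior_expected_fdr(prob_diff, threshold=0.8):
    """Flag genes with P(differential) > threshold; the posterior expected FDR is
    the mean posterior probability of being null among the flagged genes."""
    flagged = prob_diff > threshold
    if flagged.sum() == 0:
        return 0.0, flagged
    return float((1 - prob_diff[flagged]).mean()), flagged

# Simulated difference scores: 90% null genes near 0, 10% shifted upward.
z = np.concatenate([rng.normal(0, 1, 900), rng.normal(3, 1, 100)])
prob_diff = gibbs_mixture(z)
fdr, flagged = posterior_expected_fdr(prob_diff, threshold=0.8)
print(f"flagged {flagged.sum()} genes, posterior expected FDR ~ {fdr:.3f}")
```

The point of the sketch is the last step: once posterior simulation yields a probability of differential expression for every gene, the expected false discovery rate of any flagging rule is just the average null probability among the flagged genes, with no separate null sample of known non-differentially expressed scores required.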