Open Access
Multi-task Gaussian Process Regression-based Image Super Resolution
Author(s) -
Xinwei Jiang,
Jie Yang,
Lei Ma,
Yiping Yang
Publication year - 2015
Language(s) - English
Resource type - Conference proceedings
DOI - 10.5244/c.29.151
Subject(s) - kriging, artificial intelligence, computer science, gaussian process, feature (linguistics), interpolation (computer graphics), image (mathematics), pattern recognition (psychology), gaussian, parametric statistics, pixel, image resolution, process (computing), field (mathematics), computer vision, machine learning, mathematics, statistics, physics, quantum mechanics, operating system, linguistics, philosophy, pure mathematics
Image super resolution (SR) aims at recovering the missing high-frequency details from a single image or multiple images. Existing SR methods can be divided into three categories: interpolation-based, reconstruction-based and example learning-based. Our paper focuses on the third category. Example learning-based SR methods [6] utilize LR-HR image pairs to infer the missing high-frequency details in the LR image and achieve state-of-the-art performance. Recently, in the field of example learning-based SR, more and more researchers resort to learning the LR-HR relationship directly, i.e. y = f(x), where x is the input LR image feature, y is the targeted HR image and f is the mapping function that transforms the LR feature into the HR image. Instead of the commonly used parametric models, non-parametric methods [3], especially Gaussian process regression (GPR)-related methods [2, 4, 5], have begun to emerge in the SR field. However, previous GPR-based SR methods simply learn all the GPR models independently and ignore the correlation between them. On the other hand, each pixel prediction can be treated as a task, so that inferring an HR patch can be regarded as a multi-task problem.

In this paper, we focus on multi-task Gaussian process (MTGP) regression and apply it to the super-resolution problem. We first give a brief overview of the MTGP model proposed in [1]. Then we study how the SR problem corresponds to MTGP and propose the multi-task Gaussian process super-resolution (MTGPSR) framework.

MTGP tries to solve the following problem. Given N distinct inputs x_1, ..., x_N, we define the complete set of responses for M tasks as y = (y_{11}, ..., y_{N1}, y_{12}, ..., y_{N2}, ..., y_{1M}, ..., y_{NM})^T, where y_{ij} is the response for the j-th task on the i-th input x_i. We also denote by Y the N × M matrix such that y = vec(Y). Given a set of observations y_o, which is a subset of y, we wish to predict the unobserved values y_u at some input points for some tasks. MTGP learns M related latent functions {f_l} by placing a GP prior over {f_l} that directly induces correlations between tasks. Assuming that the GPs have zero mean, we define

⟨ f_l(x) f_k(x′) ⟩ = K^f_{lk} k^x(x, x′),   (1)

where K^f is an M × M matrix of inter-task similarities and k^x is a covariance function over the inputs.
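To make the covariance structure in (1) concrete, the following is a minimal numpy sketch of MTGP prediction under the model of [1], in which the covariance of the stacked responses vec(Y) is the Kronecker product K^f ⊗ K^x. This is not the authors' implementation: the function name mtgp_predict, the squared-exponential choice for k^x and the fixed noise level are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    """Squared-exponential input covariance k^x(x, x') (an assumed choice)."""
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-0.5 * d2 / lengthscale**2)

def mtgp_predict(X, Y, X_star, Kf, lengthscale=1.0, noise=1e-2):
    """Multi-task GP predictive mean with covariance K^f (kron) k^x.

    X      : (N, D) training inputs
    Y      : (N, M) training outputs, one column per task
    X_star : (S, D) test inputs
    Kf     : (M, M) task covariance matrix K^f
    Returns: (S, M) predictive means for all M tasks at the test inputs.
    """
    N, M = Y.shape
    Kx = rbf_kernel(X, X, lengthscale)            # (N, N) input covariance
    Kx_star = rbf_kernel(X_star, X, lengthscale)  # (S, N) test/train covariance

    # Covariance of y = vec(Y) (stacked task by task): K^f ⊗ K^x, plus noise.
    K = np.kron(Kf, Kx) + noise * np.eye(N * M)
    y = Y.T.reshape(-1)                           # (y_11..y_N1, y_12..y_N2, ...)

    alpha = np.linalg.solve(K, y)

    # Cross-covariance between test latent values and training responses.
    K_star = np.kron(Kf, Kx_star)                 # (S*M, N*M)
    mean = K_star @ alpha
    return mean.reshape(M, -1).T                  # (S, M)
```

In the SR setting of the paper, each row of X would hold an LR patch feature and each column of Y one HR pixel position (one task), so a single call would predict a whole HR patch jointly rather than regressing each pixel independently.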
