Is there a way to write a Demosaicing fragment shader?
I'm working on a realistic camera simulation, and one important effect (even if it's not the most visible one) is demosaicing (Wikipedia article: https://en.wikipedia.org/wiki/Demosaicing). Currently it is handled by a C# script that copies the RenderTexture into a Texture2D, reads each pixel of the texture, fills three 2D arrays (one per RGB channel), processes each array, and then reconstructs the image from the values in the arrays.
I'm sure there is a way to optimize the reads and the reconstruction in the C# script, but since shaders are far more efficient, I'm trying to figure out whether this can be done in one. Since it's a camera simulation, the script runs as a post-processing effect, and it is literally eating my FPS (from 70 down to 4 for an 800×500 image). More effects are still to come, and I already know they won't be easy on performance either.
Just in case, here is the code of the script (if it hurts your eyes, I'm sorry; I'm quite messy when it comes to code):
using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class Bayer : MonoBehaviour {
    private Texture2D texture;
    private float[,] rArray;
    private float[,] gArray;
    private float[,] bArray;
    private int w;
    private int h;

    // Initialise the parameters
    void Start() {
        w = Camera.main.pixelWidth;
        h = Camera.main.pixelHeight;
        rArray = new float[w, h];
        gArray = new float[w, h];
        bArray = new float[w, h];
        for (int i = 0; i < w; i++) {
            for (int j = 0; j < h; j++) {
                rArray[i, j] = 0;
                gArray[i, j] = 0;
                bArray[i, j] = 0;
            }
        }
        texture = new Texture2D(w, h);
    }

    // Called every time an image needs to be rendered
    void OnRenderImage(RenderTexture src, RenderTexture dest) {
        RenderTexture.active = src;
        texture.ReadPixels(new Rect(0, 0, src.width, src.height), 0, 0);

        // Deconstruct the image
        // Fill the red array
        for (int i = 0; i < texture.width; i += 2) {
            for (int j = 1; j < texture.height; j += 2) {
                rArray[i, j] = texture.GetPixel(i, j).r;
            }
        }
        // Fill the green array (in two loops)
        for (int k = 0; k < texture.width; k += 2) {
            for (int l = 0; l < texture.height; l += 2) {
                gArray[k, l] = texture.GetPixel(k, l).g;
            }
        }
        for (int m = 1; m < texture.width; m += 2) {
            for (int n = 1; n < texture.height; n += 2) {
                gArray[m, n] = texture.GetPixel(m, n).g;
            }
        }
        // Fill the blue array
        for (int o = 1; o < texture.width; o += 2) {
            for (int p = 0; p < texture.height; p += 2) {
                bArray[o, p] = texture.GetPixel(o, p).b;
            }
        }

        // Demosaic the 3 arrays
        // Reconstruct the red sub-image
        for (int i = 1; i < w - 1; i++) {
            for (int j = 1; j < h - 1; j++) {
                if (rArray[i, j] == 0) {
                    int count = 0;
                    float sum = 0.0f;
                    for (int k = i - 1; k < i + 2; k++) {
                        for (int l = j - 1; l < j + 2; l++) {
                            if (rArray[k, l] != 0) {
                                count++;
                                sum += rArray[k, l];
                            }
                        }
                    }
                    rArray[i, j] = sum / count;
                }
            }
        }
        // Reconstruct the green sub-image
        for (int i = 1; i < w - 1; i++) {
            for (int j = 1; j < h - 1; j++) {
                if (gArray[i, j] == 0) {
                    int count = 0;
                    float sum = 0.0f;
                    for (int k = i - 1; k < i + 2; k++) {
                        for (int l = j - 1; l < j + 2; l++) {
                            if (gArray[k, l] != 0) {
                                count++;
                                sum += gArray[k, l];
                            }
                        }
                    }
                    gArray[i, j] = sum / count;
                }
            }
        }
        // Reconstruct the blue sub-image
        for (int i = 1; i < w - 1; i++) {
            for (int j = 1; j < h - 1; j++) {
                if (bArray[i, j] == 0) {
                    int count = 0;
                    float sum = 0.0f;
                    for (int k = i - 1; k < i + 2; k++) {
                        for (int l = j - 1; l < j + 2; l++) {
                            if (bArray[k, l] != 0) {
                                count++;
                                sum += bArray[k, l];
                            }
                        }
                    }
                    bArray[i, j] = sum / count;
                }
            }
        }

        // Reconstruct the image
        for (int x = 0; x < texture.width; x++) {
            for (int y = 0; y < texture.height; y++) {
                texture.SetPixel(x, y, new Color(rArray[x, y], gArray[x, y], bArray[x, y]));
            }
        }

        // Apply the changes to the texture, then copy it into the destination RenderTexture
        texture.Apply();
        RenderTexture.active = null;
        Graphics.Blit(texture, dest);

        // Reset the sub-images
        Array.Clear(rArray, 0, rArray.Length);
        Array.Clear(gArray, 0, gArray.Length);
        Array.Clear(bArray, 0, bArray.Length);
    }
}
Answer by Remy_Unity · May 04, 2018 at 08:10 AM
Obviously, the short answer is: yes.
It's possible to do the same thing with shaders, whether with plain pixel (fragment) shaders or with compute shaders.
Instead of reading the image back on the CPU and doing the calculations there, use Graphics.Blit with a custom shader to write from texture to texture and apply your effect on the GPU (that's the pixel-shader approach), or go the more complex compute-shader route.
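With that approach, the C# side shrinks to a few lines. A minimal sketch (the shader name "Hidden/Demosaic" and the class name are placeholders, not something from the original code):

```csharp
using UnityEngine;

// Minimal post-processing driver: all per-pixel work moves into the shader.
// "Hidden/Demosaic" is a placeholder for your own demosaicing shader.
public class DemosaicEffect : MonoBehaviour {
    private Material mat;

    void Start() {
        mat = new Material(Shader.Find("Hidden/Demosaic"));
    }

    void OnRenderImage(RenderTexture src, RenderTexture dest) {
        // Graphics.Blit runs the material's fragment shader once per pixel,
        // entirely on the GPU, replacing the ReadPixels/GetPixel/SetPixel loops.
        Graphics.Blit(src, dest, mat);
    }
}
```

No ReadPixels, no Texture2D round-trip: the image never leaves GPU memory, which is where the original script loses most of its frame time.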
I'm a beginner with shaders; how do you get the surrounding pixels inside a fragment shader? Or is that done somewhere else?
Getting the surrounding pixels is just a matter of sampling your texture again with a UV offset :-) :
float2 pixelOffset = 1.0 / _ScreenParams.xy;
float4 pixel = tex2D( _Texture, uv);
float4 pixelRight = tex2D( _Texture, uv + float2( pixelOffset.x, 0 ) );
float4 pixelLeft = tex2D( _Texture, uv + float2( -pixelOffset.x, 0 ) );
float4 pixelUp = tex2D( _Texture, uv + float2( 0, pixelOffset.y ) );
float4 pixelDown = tex2D( _Texture, uv + float2( 0, -pixelOffset.y ) );
This was written blind, but it should give you an idea of how to do it.
Note: _ScreenParams comes from here: https://docs.unity3d.com/Manual/SL-UnityShaderVariables.html
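Putting the pieces together, a demosaicing fragment shader could look roughly like the sketch below. It is untested and mirrors the question's code: the Bayer layout is taken from the C# loops (red at even-x/odd-y, blue at odd-x/even-y, green elsewhere), missing channels are averaged over the 3×3 neighbourhood, and borders are left approximate, just as in the C# version:

```
Shader "Hidden/Demosaic" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader {
        Cull Off ZWrite Off ZTest Always
        Pass {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;

            fixed4 frag (v2f_img i) : SV_Target {
                float2 texel = 1.0 / _ScreenParams.xy;
                float2 p = floor(i.uv * _ScreenParams.xy); // integer pixel position

                // Sum each channel over the 3x3 neighbourhood, counting only
                // the sites where that channel exists in the Bayer mosaic
                float3 sum = 0;
                float3 cnt = 0;
                for (int dy = -1; dy <= 1; dy++) {
                    for (int dx = -1; dx <= 1; dx++) {
                        float2 q = p + float2(dx, dy);
                        float3 s = tex2D(_MainTex, i.uv + float2(dx, dy) * texel).rgb;
                        // Parity rules from the question's loops:
                        // red at (even x, odd y), blue at (odd x, even y), green elsewhere
                        bool ex = fmod(q.x, 2.0) < 0.5;
                        bool ey = fmod(q.y, 2.0) < 0.5;
                        if (ex && !ey)      { sum.r += s.r; cnt.r += 1; }
                        else if (!ex && ey) { sum.b += s.b; cnt.b += 1; }
                        else                { sum.g += s.g; cnt.g += 1; }
                    }
                }
                float3 col = sum / max(cnt, 1.0);

                // Like the C# version, keep the exact value of the channel
                // actually sampled at this pixel; only the missing ones are filled in
                float3 centre = tex2D(_MainTex, i.uv).rgb;
                bool cx = fmod(p.x, 2.0) < 0.5;
                bool cy = fmod(p.y, 2.0) < 0.5;
                if (cx && !cy)      col.r = centre.r;
                else if (!cx && cy) col.b = centre.b;
                else                col.g = centre.g;

                return fixed4(col, 1.0);
            }
            ENDCG
        }
    }
}
```

The nine texture samples per pixel run in parallel across the whole image on the GPU, which is why this scales so much better than the nested CPU loops.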