Sunday, March 11, 2018

image - Wavelet Transform


I want to perform a 2D Haar discrete wavelet transform (DWT) and the inverse DWT on an image. Will you please explain the 2D Haar DWT and inverse DWT in simple language, and give an algorithm I can use to write the code? The information I found on Google was too technical. I understood the basic idea of dividing the image into 4 sub-bands (LL, LH, HL, HH), but I can't really understand how to write a program to perform the DWT and IDWT on an image. I also read that DWT is better than DCT because it is performed on the image as a whole, and then there was some explanation that went over my head. I might be wrong here, but I think DWT and DCT are compression techniques, because the image size reduces when DWT or DCT is performed. Hoping you guys will share a part of your knowledge and enhance mine.
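For orientation, one level of the 2-D Haar DWT is just a 1-D averaging/differencing step applied to every row, and then to every column of the result; the four quadrants of the output are the four sub-bands. A minimal sketch (my own illustration, not library code; it assumes even dimensions and uses one common quadrant convention):

```java
// Conceptual sketch: one 2-D Haar level = 1-D step on rows, then on columns.
// Quadrants of the result (one common convention):
//   top-left LL, top-right HL, bottom-left LH, bottom-right HH.
public class Haar2D {
    static double[][] oneLevel(double[][] img) {
        int h = img.length, w = img[0].length;   // assumed even
        double[][] rows = new double[h][w];
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w / 2; x++) {
                rows[y][x]       = (img[y][2 * x] + img[y][2 * x + 1]) / 2; // low-pass
                rows[y][x + w/2] = (img[y][2 * x] - img[y][2 * x + 1]) / 2; // high-pass
            }
        double[][] out = new double[h][w];
        for (int x = 0; x < w; x++)
            for (int y = 0; y < h / 2; y++) {
                out[y][x]        = (rows[2 * y][x] + rows[2 * y + 1][x]) / 2;
                out[y + h/2][x]  = (rows[2 * y][x] - rows[2 * y + 1][x]) / 2;
            }
        return out;
    }

    public static void main(String[] args) {
        double[][] img = {{1, 1}, {1, 1}};   // constant image
        double[][] t = oneLevel(img);
        // A constant image has all its energy in LL; the detail bands are zero.
        assert t[0][0] == 1 && t[0][1] == 0 && t[1][0] == 0 && t[1][1] == 0;
        System.out.println("LL=" + t[0][0]); // prints LL=1.0
    }
}
```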


Thank You


Re: Does it have anything to do with the image format? What is the "value of a pixel" that is used in the DWT? I have assumed it to be the RGB value of the pixel.
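For reference, `BufferedImage.getRGB` returns a single packed ARGB int, not a plain intensity, so averaging those packed values mixes the channels together; the transform would normally be applied to each colour channel (or to a grayscale intensity) separately. A small sketch of unpacking and repacking (my own hypothetical helpers, not part of the code below):

```java
// Hypothetical helpers: split a packed ARGB int (as returned by
// BufferedImage.getRGB) into channels, and repack them afterwards.
public class PixelChannels {
    static int red(int argb)   { return (argb >> 16) & 0xFF; }
    static int green(int argb) { return (argb >> 8) & 0xFF; }
    static int blue(int argb)  { return argb & 0xFF; }

    // Repack three 0..255 channels into an opaque ARGB int.
    static int pack(int r, int g, int b) {
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }

    public static void main(String[] args) {
        int px = 0xFF336699;            // opaque pixel: R=0x33, G=0x66, B=0x99
        assert red(px) == 0x33;
        assert green(px) == 0x66;
        assert blue(px) == 0x99;
        assert pack(0x33, 0x66, 0x99) == px;
        System.out.println("ok");       // prints ok (run with java -ea)
    }
}
```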


import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

class DiscreteWaveletTransform
{
    static final int TYPE = BufferedImage.TYPE_INT_RGB;

    public static void main(String arg[])
    {
        DiscreteWaveletTransform dwt = new DiscreteWaveletTransform();
        dwt.initial();
    }

    public void initial()
    {
        try {
            BufferedImage buf = ImageIO.read(new File("lena.bmp"));
            int w = buf.getWidth();
            int h = buf.getHeight();
            BufferedImage dwtimage = new BufferedImage(w, h, TYPE);

            // pixel[row][col]; note getRGB takes (x, y) = (col, row).
            int[][] pixel = new int[h][w];
            for (int y = 0; y < h; y++) {
                for (int x = 0; x < w; x++) {
                    pixel[y][x] = buf.getRGB(x, y);
                }
            }

            int[][] mat = new int[h][w];
            int[][] mat2 = new int[h][w];

            // Horizontal pass: average/difference each pair of columns.
            for (int a = 0; a < h; a++) {
                for (int b = 0, c = 0; b < w - 1; b += 2, c++) {
                    mat[a][c] = (pixel[a][b] + pixel[a][b + 1]) / 2;
                    mat[a][c + (w / 2)] = Math.abs(pixel[a][b] - pixel[a][b + 1]);
                }
            }
            // Vertical pass: average/difference each pair of rows.
            for (int p = 0; p < w; p++) {
                for (int q = 0, r = 0; q < h - 1; q += 2, r++) {
                    mat2[r][p] = (mat[q][p] + mat[q + 1][p]) / 2;
                    mat2[r + (h / 2)][p] = Math.abs(mat[q][p] - mat[q + 1][p]);
                }
            }

            for (int y = 0; y < h; y++) {
                for (int x = 0; x < w; x++) {
                    dwtimage.setRGB(x, y, mat2[y][x]);
                }
            }
            String format = "bmp";
            ImageIO.write(dwtimage, format, new File("DWTIMAGE.bmp"));
        }
        catch (Exception e) {
            e.printStackTrace();
        }
    }
}

The output is a black image with a thin line in between; in short, nowhere near the actual output. I think I have interpreted the logic wrongly. Please point out the mistakes. Regards
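One likely issue with the averaging/differencing above, independent of the pixel-format question: storing `Math.abs(a - b)` throws away the sign, so the transform cannot be inverted. A minimal sketch (my own, under the assumption of an even-length array of plain intensities) of one integer Haar step that keeps the signed difference, together with its exact inverse:

```java
// Sketch: one level of an integer Haar transform and its exact inverse
// on a 1-D array of intensities. Keeping the signed difference (rather
// than its absolute value) is what makes reconstruction possible.
public class HaarStep {
    // Forward: first half = floored pair averages, second half = signed diffs.
    static int[] forward(int[] x) {
        int n = x.length;                 // assumed even
        int[] out = new int[n];
        for (int i = 0; i < n / 2; i++) {
            int a = x[2 * i], b = x[2 * i + 1];
            out[i] = (a + b) >> 1;        // approximation (low-pass)
            out[i + n / 2] = a - b;       // detail (sign preserved)
        }
        return out;
    }

    // Inverse: recover each pair exactly from its average and difference.
    static int[] inverse(int[] y) {
        int n = y.length;
        int[] out = new int[n];
        for (int i = 0; i < n / 2; i++) {
            int s = y[i], d = y[i + n / 2];
            int a = s + ((d + 1) >> 1);   // undoes the floored average
            out[2 * i] = a;
            out[2 * i + 1] = a - d;
        }
        return out;
    }

    public static void main(String[] args) {
        int[] row = {100, 102, 50, 200};
        int[] t = forward(row);           // {101, 125, -2, -150}
        int[] back = inverse(t);
        for (int i = 0; i < row.length; i++)
            assert back[i] == row[i];     // perfect reconstruction
        System.out.println(java.util.Arrays.toString(back));
    }
}
```

Applying this step to the rows and then the columns (and its inverse in the reverse order) gives an invertible 2-D DWT/IDWT pair.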




