DC Lab Programs

The document outlines various coding techniques, including Hamming code for error detection and correction, CRC for error checking, convolutional coding for data encoding, and Huffman coding for data compression. It provides MATLAB code snippets for each technique, demonstrating the generation of codewords, error detection, and decoding processes. The results include error positions, efficiency calculations, and decoded outputs.
HAMMING CODE

clc;
clear all;

% Parameters
n = 7; % Length of the codeword
k = 4; % Number of data bits
P = [1 1 1; 1 1 0; 1 0 1; 0 1 1]; % Parity submatrix

% Generator matrix
G = [eye(k) P];

% Parity-check matrix
H = [P' eye(n-k)];

% Data to encode
D = [1 0 1 1];

% Codeword generation
C = mod(D * G, 2);

% Received codeword (with error)
R = [1 0 1 1 1 0 0];

% Syndrome calculation
S = mod(R * H', 2);

% Error detection and correction
found = 0; % Error detection flag
for i = 1:n
    if ~found
        E = zeros(1, n); % Error vector
        E(i) = 1; % Flip one bit
        S_trial = mod(E * H', 2); % Syndrome for this error pattern
        if isequal(S_trial, S) % Check if the syndromes match
            found = 1;
            errPos = i; % Error position (avoid reusing k, the data length)
        end
    end
end

disp(['Position of error in codeword = ', num2str(errPos)]);

% Correct the error
CC = bitxor(R, E);

% Decode the corrected codeword (the first 4 bits are the data)
Decoded = CC(1:4);

Position of error in codeword = 3
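As a language-independent cross-check of the run above, here is a minimal pure-Python sketch of the same (7,4) syndrome decoding with the same P matrix and received vector; it reproduces the reported error position.

```python
# Pure-Python sketch of the (7,4) Hamming syndrome decoder above.
P = [[1, 1, 1], [1, 1, 0], [1, 0, 1], [0, 1, 1]]   # parity submatrix
# Rows of H transpose: the 4 rows of P, then the 3 rows of the identity
Ht = P + [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

R = [1, 0, 1, 1, 1, 0, 0]                          # received codeword

# Syndrome: R * H' mod 2
S = [sum(R[i] * Ht[i][j] for i in range(7)) % 2 for j in range(3)]

# A single-bit error at position i produces syndrome Ht[i]
pos = next(i + 1 for i in range(7) if Ht[i] == S)  # 1-based, as in MATLAB
corrected = R[:]
corrected[pos - 1] ^= 1                            # flip the erroneous bit
decoded = corrected[:4]                            # first 4 bits are the data
print(pos, decoded)
```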

CRC

clc;
clear;
message = [1 1 0 1 1 0 0 1 1 1 0 1 1 0 1 0]; % Message bits (MSB first)
messageLength = 16;
divisorBits = [1 1 1 1]; % Generator polynomial x^3 + x^2 + x + 1
divisorDegree = 3;

% Convert the bit vectors to scalars so the bitwise functions apply
message = sum(message .* 2.^(messageLength-1:-1:0));
divisor = sum(divisorBits .* 2.^(divisorDegree:-1:0));

% Align the divisor under the MSB of the zero-padded message
divisor = bitshift(divisor, messageLength - 1);

% Append divisorDegree zero bits to the message
remainder = bitshift(message, divisorDegree);

% Long division in GF(2): subtract (XOR) whenever the top bit is set
for k = 1:messageLength
    if bitget(remainder, messageLength + divisorDegree)
        remainder = bitxor(remainder, divisor);
    end
    remainder = bitshift(remainder, 1);
end

CRC_value = bitshift(remainder, -messageLength) % The 3-bit remainder
if remainder == 0
    disp('msg is error free')
else
    disp('msg contains error')
end

msg contains error
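The same GF(2) long division can be sketched directly on bit lists in Python; it yields the nonzero remainder that triggers the error message above.

```python
# Bit-list sketch of the same CRC division (generator x^3 + x^2 + x + 1).
msg = [1, 1, 0, 1, 1, 0, 0, 1, 1, 1, 0, 1, 1, 0, 1, 0]
gen = [1, 1, 1, 1]            # generator polynomial bits, MSB first
degree = len(gen) - 1

work = msg + [0] * degree     # append degree zero bits
for i in range(len(msg)):     # long division in GF(2)
    if work[i] == 1:
        for j in range(len(gen)):
            work[i + j] ^= gen[j]
remainder = work[-degree:]
print(remainder)              # nonzero remainder -> message fails the check
```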

CONVOLUTION

clc;
clear all;
k = 3; % Constraint length
G1 = 7; % Generator polynomials in octal (111 and 101 in binary)
G2 = 5;
msg = [1 1 0 0 1 0]
trellis = poly2trellis(k, [G1, G2])
encode_data = convenc(msg, trellis)
tblen = length(msg) % Traceback depth for the Viterbi decoder
decoded_data = vitdec(encode_data, trellis, tblen, 'trunc', 'hard')

OUTPUT
trellis = struct with fields:
numInputSymbols: 2
numOutputSymbols: 4
numStates: 4
nextStates: [4×2 double]
outputs: [4×2 double]
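The encoder can also be cross-checked outside MATLAB. Below is a minimal Python sketch of the same rate-1/2, constraint-length-3 encoder (octal generators 7 and 5), assuming the zero initial shift-register state that convenc uses.

```python
# Rate-1/2 convolutional encoder, constraint length 3, generators 7 (111)
# and 5 (101) in octal, mirroring poly2trellis(3, [7 5]) / convenc.
msg = [1, 1, 0, 0, 1, 0]
s1 = s2 = 0                      # shift-register state (previous two bits)
encoded = []
for b in msg:
    encoded.append(b ^ s1 ^ s2)  # generator 7 -> taps 1 1 1
    encoded.append(b ^ s2)       # generator 5 -> taps 1 0 1
    s1, s2 = b, s1               # shift the register
print(encoded)
```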

HUFFMAN CODING (CLASS VERSION)

x = input('enter the no of symbols:');
N = 1:x;
disp('the symbols are:');
disp(N);
P = input('enter the probabilities=');
disp('the probabilities are:');
disp(P);
S = sort(P, 'descend');
disp('the sorted probabilities are:');
disp(S);
% huffmandict pairs each symbol in N with the matching entry of P,
% so pass P (not the sorted copy) to keep the mapping correct
[dict, avglen] = huffmandict(N, P);
disp('the average length of the code is:');
disp(avglen);
H = 0;
for i = 1:x
    H = H + (P(i) * log2(1 / P(i)));
end
disp('entropy is');
disp(H);
disp('bits/msg');
E = (H / avglen) * 100;
disp('efficiency is');
disp(E);
codeword = huffmanenco(N, dict);
disp('the code words are');
disp(codeword);
decoded = huffmandeco(codeword, dict);
disp('decoded output is:');
disp(decoded)

HUFFMAN CODING (CHATGPT VERSION)

x = input('enter the no of symbols:');
N = 1:x;
disp('the symbols are:');
disp(N);
P = input('enter the probabilities=');

if abs(sum(P) - 1) > 1e-6
    disp('Probabilities do not sum to 1. Normalizing...');
    P = P / sum(P);
end

disp('The probabilities are:');
disp(P);
[dict, avglen] = huffmandict(N, P);
disp('the average length of the code is:');
disp(avglen);
H = 0;
for i = 1:x
    H = H + (P(i) * log2(1 / P(i)));
end
disp('entropy is');
disp(H);
disp('bits/msg');
E = (H / avglen) * 100;
disp('efficiency is');
disp(E);
codeword = huffmanenco(N, dict);
disp('the code words are');
disp(codeword);
decoded = huffmandeco(codeword, dict);
disp('decoded output is:');
disp(decoded);

OUTPUT
avglen = 2.16
entropy = 2.10 bits/msg
efficiency = 97.47
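The entropy, average length, and efficiency arithmetic can be sketched in Python as well. The probability vector below is an illustrative assumption (the MATLAB program reads its probabilities interactively, and the recorded run's values are not shown), so the numbers differ from the output above.

```python
import heapq
import math

# Illustrative probabilities -- an assumption for this sketch, not the
# values from the recorded MATLAB run.
P = [0.4, 0.2, 0.2, 0.1, 0.1]

# Source entropy in bits per symbol
H = sum(p * math.log2(1 / p) for p in P)

# Huffman average code length: each merge of the two smallest weights
# adds the merged probability to the total expected length.
heap = [(p, 0.0) for p in P]  # (probability, accumulated expected length)
heapq.heapify(heap)
while len(heap) > 1:
    p1, l1 = heapq.heappop(heap)
    p2, l2 = heapq.heappop(heap)
    heapq.heappush(heap, (p1 + p2, l1 + l2 + p1 + p2))
avglen = heap[0][1]

efficiency = H / avglen * 100
print(round(H, 4), round(avglen, 4), round(efficiency, 2))
```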
