Classical monocular Simultaneous Localization And Mapping (SLAM) and the recently emerging convolutional neural networks (CNNs) for monocular depth prediction represent two largely disjoint approaches towards building a 3D map of the surrounding environment. In this paper, we demonstrate that the coupling of these two by leveraging the strengths of each mitigates the other's shortcomings. Specifically, we propose a joint narrow and wide baseline based self-improving framework, where on the one hand the CNN-predicted depth is leveraged to perform pseudo RGB-D feature-based SLAM, leading to better accuracy and robustness than the monocular RGB SLAM baseline. On the other hand, the bundle-adjusted 3D scene structures and camera poses from the more principled geometric SLAM are injected back into the depth network through novel wide baseline losses proposed for improving the depth prediction network, which then continues to contribute towards better pose and 3D structure estimation ...
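The abstract does not spell out how the predicted depth enters the SLAM front end, but a minimal sketch of the pseudo RGB-D idea is to sample the CNN depth map at each detected keypoint and back-project it into a 3D camera-frame point, much as an RGB-D sensor measurement would be used. The intrinsics, depth values, and function name below are illustrative placeholders, not the paper's implementation.

```python
import numpy as np

def backproject_keypoints(keypoints_uv, depth_map, K):
    """Lift 2D keypoints to 3D camera-frame points using a predicted depth map.

    keypoints_uv : (N, 2) array of pixel coordinates (u, v)
    depth_map    : (H, W) array of CNN-predicted metric depth
    K            : (3, 3) camera intrinsic matrix
    """
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    u = keypoints_uv[:, 0]
    v = keypoints_uv[:, 1]
    z = depth_map[v.astype(int), u.astype(int)]   # sample predicted depth at each keypoint
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=1)            # (N, 3) pseudo RGB-D points

# Example with synthetic depth and illustrative intrinsics.
K = np.array([[525.0, 0.0, 319.5],
              [0.0, 525.0, 239.5],
              [0.0, 0.0, 1.0]])
depth = np.full((480, 640), 2.0)                  # stand-in for a CNN depth prediction
kps = np.array([[100.0, 120.0], [320.0, 240.0], [500.0, 400.0]])
print(backproject_keypoints(kps, depth, K))
```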
International Journal of Engineering, Science and Technology, 2012
Public Key Infrastructure (PKI) provides an extensive security mechanism for securing data communication over a network. Transferring a file over a network is generally not secure if the network is wireless or uses hubs as networking devices, because packets are then broadcast to every other computer on the network. A hub does not track which devices are attached to it; it simply forwards packets to all of its ports, and wireless networks likewise broadcast their data packets. Normally the packets are read only by the clients they are intended for, but a third party, called a "sniffer", may capture the data packets during a file transfer even though it is not the intended recipient. In this work we try to enhance the security of file transfer by combining file transfer over a secure socket with Public Key Infrastructure (PKI). If we implement file transfer along with asymmetric key crypt...
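The abstract is truncated before the protocol details, so the following is only one common way to realize "file transfer over a secure socket with PKI": a TLS client that validates the server certificate against a trusted CA (the PKI trust anchor) and then streams the file over the encrypted channel. The host name, port, CA file, and file name are placeholders.

```python
import socket
import ssl

# Client side: connect over TLS, verify the server's certificate against a trusted CA,
# then stream a file. Sniffers on the path see only ciphertext.
context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile="ca_cert.pem")

with socket.create_connection(("fileserver.example.com", 8443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname="fileserver.example.com") as tls_sock:
        with open("report.pdf", "rb") as f:
            while True:
                chunk = f.read(4096)
                if not chunk:
                    break
                tls_sock.sendall(chunk)   # encrypted on the wire
```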
Deep Neural Networks (DNNs) are often criticized for being susceptible to adversarial attacks. Most successful defense strategies adopt adversarial training or random input transformations that typically require retraining or fine-tuning the model to achieve reasonable performance. In this work, our investigations of intermediate representations of a pre-trained DNN lead to an interesting discovery pointing to intrinsic robustness to adversarial attacks. We find that we can learn a generative classifier by statistically characterizing the neural response of an intermediate layer to clean training samples. The predictions of multiple such intermediate-layer based classifiers, when aggregated, show unexpected robustness to adversarial attacks. Specifically, we devise an ensemble of these generative classifiers that rank-aggregates their predictions via a Borda count-based consensus. Our proposed approach uses a subset of the clean training data and a pre-trained model, and yet is agno...
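As a rough sketch of the consensus step only (the generative intermediate-layer classifiers themselves are out of scope here), a Borda count gives each class a score based on its rank in every classifier's ordering and picks the class with the highest total. Function and variable names below are illustrative, not the paper's code.

```python
import numpy as np

def borda_aggregate(rankings, num_classes):
    """Aggregate per-classifier class rankings with a Borda count.

    rankings : list of 1D arrays, each an ordering of class indices from most
               to least likely, one per intermediate-layer classifier.
    Returns the consensus class and the per-class Borda scores.
    """
    scores = np.zeros(num_classes)
    for ranking in rankings:
        for position, cls in enumerate(ranking):
            scores[cls] += num_classes - 1 - position   # top rank earns the most points
    return int(np.argmax(scores)), scores

# Example: three intermediate-layer classifiers ranking 4 classes.
rankings = [np.array([2, 0, 1, 3]),
            np.array([2, 1, 0, 3]),
            np.array([0, 2, 3, 1])]
winner, scores = borda_aggregate(rankings, num_classes=4)
print(winner, scores)   # class 2 wins the consensus
```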
Robust multi-model fitting problems are often solved using consensus based or preference based methods, each of which captures largely independent information from the data. However, most existing techniques still adhere to either of these approaches. In this paper, we bring these two paradigms together and present a novel robust method for discovering multiple structures from noisy, outlier corrupted data. Our method adopts a random sampling based hypothesis generation and works on the premise that inliers are densely packed around the structure, while the outliers are sparsely spread out. We leverage consensus maximization by defining the residual density, which is a simple and efficient measure of density in the 1-D residual space. We locate the inlier-outlier boundary by using preference based point correlations together with the disparity in residual density of inliers and outliers. Finally, we employ a simple strategy that uses preference based hypothesis correlation and resid...
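The abstract does not give the exact formula for residual density, so the sketch below assumes one simple instance of the idea: in the sorted 1-D residual space of a hypothesis, count how many points fall within a given residual relative to that residual's magnitude, so that tightly packed inliers score high and sparse outliers score low.

```python
import numpy as np

def residual_density(residuals):
    """Crude density estimate in the 1-D residual space of a single hypothesis.

    For each point (sorted by residual), density is the number of points with a
    residual no larger than its own, divided by that residual: inliers packed
    near zero get a much higher density than spread-out outliers.
    """
    sorted_res = np.sort(residuals)
    counts = np.arange(1, len(sorted_res) + 1)
    density = counts / np.maximum(sorted_res, 1e-9)   # avoid division by zero
    return sorted_res, density

# Example: 20 tight inliers around a structure plus 10 sparse outliers.
rng = np.random.default_rng(0)
res = np.concatenate([np.abs(rng.normal(0, 0.05, 20)),
                      rng.uniform(1.0, 10.0, 10)])
sorted_res, density = residual_density(res)
print(density[:5], density[-5:])   # density drops sharply past the inlier-outlier boundary
```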
A server that can handle only one client at a time is of limited use; servers typically need to handle multiple clients simultaneously. Three general strategies are used for this: multiplexing, forking, and threading, each with its own advantages and drawbacks. Broadly, servers fall into two categories, iterative and concurrent. An iterative server cannot process a pending client until it has completely serviced the current client, whereas a concurrent server forks a child process for every client. There are also two modified designs of concurrent servers: pre-forking and pre-threading. Pre-forking, or process preallocation, controls delay and maintains high throughput in concurrent servers by avoiding the cost of creating a new process each time a request arrives, as in plain forking. But in a multitasking environment, where pre-forked processes run as daemon processes, they pose securit...
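The mechanism described above, paying the fork cost once up front instead of on every request, can be sketched with a minimal pre-forking TCP echo server. The port, pool size, and echo behaviour are placeholders; a production server would also drop privileges and supervise its workers.

```python
import os
import socket

# Minimal pre-forking TCP echo server: the listening socket is created once, then a
# fixed pool of worker processes is forked up front, each blocking on accept().
NUM_WORKERS = 4

def worker(listener):
    while True:
        conn, addr = listener.accept()   # kernel hands connections to idle workers
        with conn:
            data = conn.recv(4096)
            if data:
                conn.sendall(data)

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(("0.0.0.0", 9000))
listener.listen(128)

for _ in range(NUM_WORKERS):
    if os.fork() == 0:                   # child: serve forever
        worker(listener)
        os._exit(0)

for _ in range(NUM_WORKERS):             # parent: wait on the pre-forked children
    os.wait()
```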
2014 International Conference on Information Technology, 2014
People often capture document images with a mobile or handheld digital camera. These cameras are handy and can capture partial images of a large document such as a chart, a newspaper, or text written on a wall. Image mosaicking is the process of reconstructing the whole image from its constituent, partially overlapping images. If the partial images of a document are captured by a flatbed scanner, the mosaic is obtained by simply estimating the homography, a 2D transformation between the partial images. This is not the case when the partial images are captured by a mobile or handheld camera: camera images suffer from illumination change, scale change, change in viewpoint, perspective distortion, and low resolution. In this paper we discuss a robust mosaicking approach for camera-captured document images and present implementation results for applications such as document digitization and OCR processing.
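The paper's full pipeline is not reproduced here, but the core step it describes, estimating a robust homography between two overlapping, perspective-distorted document images, can be sketched with standard OpenCV calls. File names, feature counts, and the RANSAC threshold are placeholder choices.

```python
import cv2
import numpy as np

# Two-image mosaicking sketch: ORB features, brute-force matching, RANSAC homography,
# and warping one partial document image onto the other.
img1 = cv2.imread("doc_part1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("doc_part2.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)   # robust to perspective distortion

# Warp image 1 into image 2's frame and paste image 2 on top to form the mosaic.
h2, w2 = img2.shape
mosaic = cv2.warpPerspective(img1, H, (w2 * 2, h2 * 2))
mosaic[:h2, :w2] = img2
cv2.imwrite("mosaic.jpg", mosaic)
```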
2016 IEEE International Conference on Image Processing (ICIP), 2016
We propose a fast and efficient two-stage hypothesis filtering technique that can improve the performance of clustering based robust multi-model fitting algorithms. Sampling based hypothesis generation is nondeterministic and permits little control over generating poor model hypotheses, often leading to a significant proportion of bad hypotheses. Our novel filtering approach leverages the asymmetry in the distributions of points around the inlier/outlier boundary via the sample skewness computed in the residual space. The output is a set of promising hypotheses which aid multi-model fitting algorithms in improving accuracy as well as running time. We validate our approach on the AdelaideRMF dataset and show favorable results along with comparisons to state-of-the-art.
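The exact two-stage criterion is not given in the abstract, so the sketch below only illustrates the core signal: residuals of a good hypothesis pile up near zero (inliers) with a long outlier tail, giving a strongly right-skewed distribution, while poor hypotheses have flatter residuals. Hypotheses are ranked by sample skewness and the weakest fraction is discarded; the keep ratio is an arbitrary placeholder.

```python
import numpy as np
from scipy.stats import skew

def filter_hypotheses(residual_matrix, keep_ratio=0.5):
    """Rank model hypotheses by the sample skewness of their point residuals.

    residual_matrix : (num_hypotheses, num_points) array of absolute residuals.
    Strongly right-skewed residuals indicate tight inliers plus an outlier tail;
    flatter distributions indicate weak hypotheses, which are discarded.
    """
    skewness = skew(residual_matrix, axis=1)
    order = np.argsort(-skewness)                    # most skewed first
    keep = order[: int(keep_ratio * len(order))]
    return keep, skewness

# Example: one "good" hypothesis with tight inliers, one "poor" one with uniform residuals.
rng = np.random.default_rng(1)
good = np.concatenate([np.abs(rng.normal(0, 0.02, 80)), rng.uniform(1, 5, 20)])
poor = rng.uniform(0, 5, 100)
keep, s = filter_hypotheses(np.stack([good, poor]), keep_ratio=0.5)
print(keep, s)   # the skewed (good) hypothesis is retained
```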