The authors propose a simple approach to clustering text documents for a given n-word input using graph mining techniques. The approach clusters text documents in three forms, based on identifying n-word, (n-1)-word, and (n-2)-word phrases respectively in the documents. These three forms of clustering are obtained by identifying all the n-word, (n-1)-word, and (n-2)-word phrases derived from the n-word input across the set of text documents. The three forms of clustering are treated as document-word relations and finally represented as bipartite graphs. To this end, the authors propose an algorithm for the three forms of clustering of text documents for a given n-word input using graph mining techniques. Finally, the paper concludes with a result analysis of the proposed algorithm, implemented in the C++ programming language, which showed satisfactory results.
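The document-phrase relation described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the function names, the toy documents, and the use of Python sets are assumptions; the paper's graph representation and C++ implementation are not reproduced.

```python
def ngrams(words, n):
    """All contiguous n-word phrases of a word list, as a set."""
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def bipartite_edges(docs, query, n):
    """Document-phrase edges for phrases of length n drawn from the query.

    Each edge (doc_id, phrase) is one edge of the document-word bipartite
    graph; calling this with n, n-1, and n-2 gives the three clustering forms.
    """
    q_phrases = ngrams(query.split(), n)
    return {(doc_id, p)
            for doc_id, text in docs.items()
            for p in ngrams(text.split(), n) & q_phrases}

docs = {"d1": "graph mining of text documents", "d2": "text documents and graphs"}
edges = bipartite_edges(docs, "mining of text documents", 3)
```

Running the same function with n-1 = 2 would also connect "d2" through the shared phrase "text documents", illustrating the second clustering form.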
The performance requirements of modern devices and systems are growing exponentially, and with them the number of components and the hardware requirements. Unfortunately, existing methodologies and technological advancements cannot provide adequate support, so a much faster and more efficient design method must be employed. Hence we introduce the concept of Network on Chip (NoC), by which we can overcome the disadvantages of the existing bus architecture. In a bus architecture, as we already know, data transfer between different parts of the system occurs via buses. This mode of communication is not very efficient, suffers from slow bus response, and has scalability problems. Network on Chip architecture, on the other hand, increases the efficiency of communication between modules: it significantly reduces the amount of wiring required to route data between modules, operates at a higher frequency, and scales better. The objective of this paper is to introduce some concepts of Network on Chip (NoC) architecture, including a comparison of different routing strategies in Network on Chip.
Fuzzy controllers are conceptually very simple. They consist of an input stage, a processing stage, and an output stage. The input stage maps sensor or other inputs, such as the distance between vehicles, road conditions, climatic conditions, and vehicle condition, to the appropriate membership functions and truth values. The processing stage invokes each applicable rule and generates a result for each, then combines the results of the rules. Finally, the output stage converts the combined result back into a specific control output value, the speed of the vehicle. In this context fuzzy logic provides a more efficient method of achieving the same goal.
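The three stages above can be sketched in a few lines. The membership functions, the two rules, and the speed constants below are invented for illustration; a real vehicle controller would use several inputs and a tuned rule base.

```python
def near(x):
    """Input stage: membership in 'near' (full at 0 m, zero beyond 40 m)."""
    return max(0.0, min(1.0, (40 - x) / 40))

def far(x):
    """Input stage: membership in 'far' (zero below 20 m, full beyond 60 m)."""
    return max(0.0, min(1.0, (x - 20) / 40))

def fuzzy_speed(distance_m):
    # Processing stage: rule strengths for "near -> slow" and "far -> fast".
    w_slow, w_fast = near(distance_m), far(distance_m)
    # Output stage: weighted-average defuzzification to a crisp speed (km/h).
    return (w_slow * 20 + w_fast * 80) / (w_slow + w_fast)
```

A vehicle 30 m behind the one ahead is partly "near" and partly "far", so the defuzzified output lands between the slow and fast set points.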
A cognitive radio wireless sensor network is one of the candidate areas where cognitive techniques can be used for spectrum access. Current research in this area is still at an early stage but is now evolving rapidly. The aim of this study is to classify the advantages of the fast-emerging application area of cognitive radio networks, to highlight the key research that has already been undertaken, and to indicate open problems. This paper describes the benefits of cognitive radio wireless sensor networks, their application areas, and current research trends in the field. Cognitive radio systems (CRS) are needed for resource utilization in leased networks, military networks, CR mesh networks, multimedia networks, cellular networks, and emergency networks. For efficient utilization of the limited spectrum, CRS plays an important role, with the capability to adapt its parameters and protocols to the surrounding environment and to its past experience. Grounded in software-defined radio technology, its main motive is to provide additional flexibility and improved effectiveness in overall spectrum deployment. But the design and implementation suffer from many issues and challenges, some of which are pointed out in this article, especially implementation challenges in Cognitive Radio (CR) concerning the RF front end, transceiver, and A/D and D/A interfaces, which still act as blockades to CRS development.
A dishwasher is a machine that facilitates the washing of utensils and cutlery. We observed that the traditional household way of washing utensils is a very cumbersome process that wastes time and energy. We conducted market research on the various dishwashers available in India and observed that the dishwashers on the market have low power efficiency. This paper explores the design decisions and steps taken in building a customised, power-efficient dishwasher.
Inventory management plays a vital role in supply chain management. The service provided to the customer is eventually enhanced once efficient and effective management of inventory is carried out throughout the supply chain. Thus, determining the inventory to be held at various levels in a supply chain becomes inevitable in order to ensure minimal cost for the supply chain. Minimizing the total supply chain cost means minimizing the holding and shortage costs in the entire supply chain, as well as the lead times at the various stages. A serious issue in implementing this is that the excess stock level and shortage level are not static from period to period. In this report, a study has been carried out on Genetic Algorithms and Particle Swarm Optimization in order to determine the most probable excess stock level and shortage level required for inventory optimization in the supply chain, such that the total supply chain cost is minimized.
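As a minimal sketch of the Particle Swarm Optimization side of this study, the snippet below minimizes a one-variable toy cost with a holding term above a base stock level and a steeper shortage term below it. The cost function, base stock of 50 units, and PSO coefficients are all invented for illustration; the paper's multi-echelon supply chain model is not reproduced.

```python
import random

def pso_min(cost, lo, hi, n_particles=20, iters=80, seed=0):
    """Minimal 1-D particle swarm optimizer; coefficients are typical values."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(n_particles)]   # positions
    v = [0.0] * n_particles                                 # velocities
    pbest = x[:]                                            # personal bests
    gbest = min(x, key=cost)                                # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            # inertia + pull toward personal best + pull toward global best
            v[i] = 0.7 * v[i] + 1.5 * r1 * (pbest[i] - x[i]) + 1.5 * r2 * (gbest - x[i])
            x[i] = min(hi, max(lo, x[i] + v[i]))
            if cost(x[i]) < cost(pbest[i]):
                pbest[i] = x[i]
        gbest = min(pbest, key=cost)
    return gbest

# Toy single-stage cost: holding cost above a base stock of 50 units,
# steeper shortage cost below it (numbers invented for illustration).
inventory_cost = lambda s: 2.0 * max(0.0, s - 50) + 5.0 * max(0.0, 50 - s)
```

In this toy setting the swarm settles near the base stock of 50, the point balancing holding and shortage costs; the report's contribution lies in the realistic multi-period cost model rather than in the optimizer itself.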
Cloud computing means the use of computing resources that are delivered as a service over the Internet. Cloud computing is causing fundamental changes to computing infrastructure, but these changes have not been matched by corresponding changes to operating systems. A new operating system called MultiLibOS is built specifically for the cloud to achieve increased efficiency, scale, and functionality. The MultiLibOS architecture is explained later in this paper. If adopted, this architecture will result in many families of libraries, each addressing different concerns for different classes of applications and systems.
In the recent past, optical networks have improved enormously, and the bandwidth demand on them keeps increasing. Network providers are moving towards a milestone in network evolution. Optical networking provides high speed and high capacity at reduced cost for new applications such as the Internet and advanced digital services. This paper briefly describes the underlying technologies and explains the connectivity. It also provides information about the transmission capacity, which is very high compared to copper wire. DWDM allows multiple signals to be transmitted simultaneously on a single fiber. The section on network management explains how such a network is managed.
Recent trends in computer science research include exponentially growing domains with interdisciplinary fields that are applied in many verticals. One such domain with golden opportunities for research is cloud computing. In this article we concentrate on various cloud services, with more emphasis on Robot as a Service and hybrid computing, which will serve as a base for our future research in cloud computing.
In the field of artificial intelligence, a genetic algorithm is a search heuristic that mimics the process of natural selection. Genetic algorithms are search and optimization techniques that generate solutions to optimization problems using mechanisms modelled on natural evolution. A genetic algorithm is a type of evolutionary algorithm in which symbols representing possible solutions are “bred.” In general software, genetic algorithms are also used in research on artificial life, cellular automata, and neural networks.
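The selection, crossover, and mutation loop described above can be sketched on a toy problem: maximizing the number of 1-bits in a bit string (the standard "OneMax" exercise). All parameters here, population size, tournament selection, 10% mutation rate, are illustrative choices, not a reference implementation.

```python
import random

def genetic_onemax(n_bits=20, pop_size=30, generations=60, seed=1):
    """Toy GA maximizing the count of 1-bits; all parameters are illustrative."""
    rng = random.Random(seed)
    fitness = sum                                   # fitness = number of 1-bits
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def select():                                   # tournament of two
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)          # single-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.1:                  # occasional point mutation
                child[rng.randrange(n_bits)] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)
```

Replacing `fitness` with a domain-specific scoring function and the bit strings with another solution encoding turns the same skeleton into an optimizer for other problems.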
To meet the demands of growing computation-intensive applications and the needs of low-power, high-performance systems, the number of computing resources on a single chip has increased enormously, because current VLSI technology can support such extensive integration of transistors. As many computing resources such as CPUs, DSPs, and specific IPs are added to build a System-on-Chip, the interconnection between them becomes another challenging issue. As an improvement on SoC, a new paradigm called NoC has been introduced. NoC is an intercommunication network implemented on an integrated circuit, typically between the IP cores of a SoC. With the development of IC technology, SoC fails to meet the increasing requirements of network communication, since it is based on a traditional bus architecture. By transplanting network technology from computer systems and replacing the traditional bus structure with a network structure, NoC solves the communication bottleneck of SoC and is rather promising. This paper is about the design and development of a simulator for NoC architectures, more specifically the butterfly and fat-tree architectures.
Information security is one of the major concerns in the present-day scenario. Everything is on digital platforms these days, which makes life simpler, but this has a dark side as well: information put on a digital platform becomes vulnerable and can easily slide into the hands of the wrong people, people who use the information for the wrong reasons. The importance of information security is generally neglected by the average person without computer knowledge, who often tends to make mistakes that can come back to haunt them badly. The paper describes the general security threats, requirements, and probable solutions.
As energy demand and environmental problems increase, natural energy sources have become very important as alternatives to conventional energy sources. The renewable energy sector is fast gaining ground as a new growth area for numerous countries, given the vast environmental and economic potential it presents. Solar energy plays an important role as a primary source of energy, especially for rural areas. The project is divided into two stages: one part uses a dual-axis solar tracker, and the other uses organic solar panels in order to reduce the cost of the panels. Our contribution is to use these panels to track the solar radiation and produce more power. Because of the reduced cost, the system can be implemented in every home and for agricultural purposes.
We use the Web within our own semantic context because we are blessed with intuition. We visually deduce and coordinate context by looking at images and text on web sites. Machines are really fast but also incredibly dumb, so they have none of these capabilities. The Semantic Web is about empowering machines by formatting data in a machine-intelligible way. The classic example of this is the effort to consolidate geographic data on the Web. Fifty years ago it appeared daunting to build a global web of information, to deploy semantics on such a scale, and to attempt inference over the resulting components. Today the Semantic Web is attainable. The goal of Semantic Web research is to allow the vast range of web-accessible information and services to be exploited more effectively by both humans and automated tools. RDF and OWL have been developed as standard formats for sharing data and knowledge in the form of rich conceptual schemas called ontologies. These languages, and the tools developed to support them, are rapidly becoming standards for ontology development and deployment.
The main objective of our project, “SURFACE FINISH DETECTOR BY USING ULTRASONIC SENSOR”, is to detect the surface finish of an object. In most industries, the surface finish of an object can be measured using many techniques; in our project an ultrasonic sensor is used to determine it. The project uses some major components: a microcontroller, a voltage regulator, a transformer, and an ultrasonic sensor. Ultrasound or ultrasonic waves are elastic waves with a frequency greater than 20,000 Hz that normally exist in solids, liquids, and gases. An ultrasonic wave moves at a velocity (the wave velocity) that is determined by the material properties and shape of the medium, and occasionally by the frequency. If any deviations are detected, the system will indicate them.
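The basic pulse-echo arithmetic behind such a sensor can be written down directly: the wave travels to the surface and back, so the distance is half the round-trip time multiplied by the wave speed. The 343 m/s default assumes air at about 20 °C, and the roughness criterion below (spread of repeated readings against a tolerance) is a hypothetical illustration, not the project's calibration.

```python
def echo_distance_m(echo_time_s, wave_speed=343.0):
    """Distance from a pulse-echo time: the wave covers the gap twice."""
    return wave_speed * echo_time_s / 2.0

def deviation_flag(distances, tolerance_m=0.001):
    """Flag a rough surface when repeated echo distances spread beyond a
    tolerance (a hypothetical criterion, not the project's calibration)."""
    return (max(distances) - min(distances)) > tolerance_m
```

A 2 ms echo in air thus corresponds to a surface 0.343 m away; scanning several points and comparing their spread gives a crude roughness indication.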
In a closed and crowded hall or room, the oxygen level sometimes decreases, making it difficult to breathe normally. We have therefore implemented an oxygen sensor to monitor the oxygen level. In this project we design and fabricate a system that automatically monitors the oxygen and carbon dioxide levels. If the oxygen level decreases below the set level, the AT89C51 microcontroller sends a signal to actuate the generator, and the oxygen generator generates oxygen for the closed hall. We use a zeolite material that serves as an air freshener and odour controller.
Artificial Neural Networks (ANNs), which are simplified models of the biological neural system, are massively parallel distributed processing systems made up of highly interconnected neural computing elements that have the ability to learn, thereby acquiring knowledge and making it available for use. The application of the ANN methodology to modelling the amount of runoff for the catchment of the Machhan river, located in the Dahod district of Gujarat (India), is presented. Dahod district is a semi-arid region with around 800 mm of average annual rainfall, which is erratic in nature. The model uses rainfall as input and gives runoff as output. On calibration, the model is found to give a good match between observed and simulated flows.
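The rainfall-in, runoff-out idea can be illustrated with the smallest possible learner: a single neuron fitted by gradient descent on made-up data. The data points, learning rate, and linear model below are all invented for illustration; the paper's actual network architecture and the Machhan catchment data are not reproduced.

```python
def train_neuron(data, lr=0.001, epochs=2000):
    """Fit runoff = w * rainfall + b by stochastic gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for rain, runoff in data:
            pred = w * rain + b          # forward pass
            err = pred - runoff          # prediction error
            w -= lr * err * rain         # gradient step on the weight
            b -= lr * err                # gradient step on the bias
    return w, b

# Made-up calibration pairs: rainfall (mm) -> runoff (mm), lying on runoff = 0.4 * rain.
data = [(10, 4), (20, 8), (30, 12)]
w, b = train_neuron(data)
```

A real rainfall-runoff ANN would add hidden layers and a nonlinear activation so it can capture the non-proportional response of a semi-arid catchment, but the calibrate-then-simulate workflow is the same.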
The objective of this paper is to study an advanced version of GPS. Using GPS, only the location of an object can be found, but this research shows that we can find the location in terms of latitude and longitude, along with its altitude, by using a device or chip. I have named this device the LLR device (Latitude and Longitudinal Ranging device). The theoretical setup is given, along with some experimental setups from various scientists. If this project comes into existence, missing objects, from airplanes to pet animals, could be found within a fraction of a second.
This paper focuses on recapturing short-circuit power, or fault power, by means of a retrieval scheme through grounding. In everyday applications, wastage of power is a routine occurrence; likewise, for fault protection we use grounding methodologies and techniques to safeguard humans and equipment. In doing so, however, the fault power is wasted without serving any purpose or application. Hence this project has been proposed for the effective utilization of this fault power. The main aim of the proposed project is that, when this method is implemented in the power system, it will certainly help to save fault power.
Cloud computing is a technology for providing computing services via the Internet. It is based on the concepts of virtualization, grid computing, and utility computing. Instead of installing and storing software on users' PCs, the cloud enables users to use resources through the Internet. It also ensures optimal usage of the available resources. Cloud computing is a completely Internet-dependent technology in which client data is stored and maintained in the data center of a cloud provider such as Google, Amazon, Salesforce.com, or Microsoft. Limited control over the data may incur various security issues and threats, including data leakage, insecure interfaces, sharing of resources, data availability, and insider attacks. Due to its low cost, most IT industries have moved to cloud technology to meet their computational requirements. It is a technology where users can consume high-end services in the form of software that resides on different servers and access data from all over the world. Recently, the availability and popularity of cloud services has increased rapidly. This development helps users who are unwilling or unable to maintain and provision their own computational infrastructure. Technological advancements in cloud computing, driven by increased connectivity and exponentially proliferating data, have resulted in migration towards cloud architectures. Cloud computing delivers services for consumer and business needs in a simplified way, providing unbounded scale and differentiated quality of service. This paper deals with these recent developments and new trends in cloud computing.
Our world has been facing a large number of threats due to insecurity in electronic devices. In this method, the smartphone has an application in which the phone keeps all its data, so that when someone tries to hack it, the attempt is prevented. In general, hacking is like laying a bridge between the hacker's computer and the computer under attack; this bridge is the path by which data is taken from the other machine. We therefore propose a defence mechanism that sends a virus whenever such a bridge is laid to steal someone's data in an unauthorized manner. Since a connection between two computers lets both exchange information, the computer under attack can send a virus to the hacker's computer so that all its data is infected. The virus carries tracking software, a program that can remain stored on the hacker's computer even if that computer has an antivirus. The person whose computer was under attack is then informed of the attack and can easily obtain the hacker's location.
Data mining (DM) is used to extract useful and non-trivial information from large amounts of data. Cluster analysis is used to form logical groups of similar data and is widely applied in many practical applications such as weather forecasting, share trading, medical data analysis, and aerial data analysis. Clustering in data mining is an unsupervised learning model. Clustering high-dimensional data is complex due to the intrinsic sparsity of such data. Existing methods to prune irrelevant clusters are based on spectral clustering and graph-based learning algorithms, which lack sparsity and have polynomial time complexity. In this paper, to cluster sparsely distributed high-dimensional data objects, a Fuzzy Relational Scattered Distance based Clustering (FRSDC) technique is developed. The main objective of the FRSDC technique is to cluster data points over sparsely distributed data within a limited processing time. The fuzzy relational step of FRSDC calculates the geometric median of the sparsely distributed high-dimensional data and determines the objects to be placed in each cluster. Initially, FRSDC identifies the geometric median of similar sparse data; then the non-selected sparse data objects take on relational fuzziness across data points, reducing the subspace of data objects in the clustered plane. Next, the scattered distance measures the distance between the geometric median (the inner object) and the positions of similar objects (the outer objects), and computes the probability distribution function while performing clustering. Finally, the scattered distance in grid form is used to compute the area of each cluster in FRSDC, thereby obtaining the clustered sparse data. The space complexity of each algorithm is analyzed and the results are compared with one another.
Comparing the results of this technique, it was found that the results obtained are more accurate and easier to understand, and, above all, the time taken to cluster the data was substantially lower with the FRSDC technique than with state-of-the-art methods.
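The first step of the abstract computes a geometric median; a standard way to do this is Weiszfeld's iteration, shown here in 2-D as a sketch. FRSDC's own formulation, fuzzy relations, and scattered-distance steps are not reproduced, and the stopping rule below is a simplification.

```python
def geometric_median(points, iters=100):
    """Weiszfeld's iteration: the point minimizing total distance to `points`."""
    # Start from the centroid.
    x = [sum(p[0] for p in points) / len(points),
         sum(p[1] for p in points) / len(points)]
    for _ in range(iters):
        num = [0.0, 0.0]
        den = 0.0
        for px, py in points:
            d = ((px - x[0]) ** 2 + (py - x[1]) ** 2) ** 0.5
            if d < 1e-12:                # landed exactly on a data point
                return x
            num[0] += px / d             # distance-weighted update
            num[1] += py / d
            den += 1.0 / d
        x = [num[0] / den, num[1] / den]
    return x
```

Unlike the centroid, the geometric median is robust to outlying objects, which is presumably why a median-style centre suits sparsely distributed data.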
This radiant energy device works on the concept of directly converting ionized particles generated by radiant matter, or radiant energy, into electrical power. It captures free-moving ionised charges in the atmosphere, together with the opposite charges that sink into the ground, and stores the charges either in a capacitor or in some other charge-storing device. The charges are collected through an antenna. The power thus produced can be used for power backup in charge-storing batteries and can be linked with solar panel setups for power backup even at night. The underlying principle follows Tesla's radiant energy patents. The modification made here is simply to connect the setup in series and, by using a rectifier, supply the DC for charging.
Wearable technology has so far been dominated by smart watches and fitness fans keen to exploit the tracking of speed, location, and body metrics to try to improve health. Yet fashion designers are also now exploring the potential of sensors and internet connectivity to create clothing and accessories that are often beautiful and intriguing as well as smart. It is not only clothing: while even the likes of Tag Heuer have announced plans to unveil an IoT wearable, there is more to connected accessories than smart watches. Research is still dancing between the two extremes, but one thing is certain: the world will be full of connected sensors. It is merely a matter of deciding how to use them, whether to count how many steps we have taken in a day or to show off our emotional state to those around us with an interactive t-shirt.
This work aims to bring out a new structure that can stand disconnected from the grid and serve the public in different ways. These structures can be described as power producers. Each is a combination of solar panels, which convert solar radiation into electricity, and antennas, which capture signals from the atmosphere and convert them into energy. The combination of both mechanisms makes solar structs reliable and efficient around the clock. These structures are sure to be the kick-starter for tomorrow's smart cities. The solar panels and the antenna produce electricity without the help of the grid, and this power is stored in batteries. It is then utilised by the various technologies and facilities within the structure to provide a sophisticated environment.
It is estimated that India has more than 100,000 brick kilns producing about 250 billion bricks annually and consuming about 35 million tons of coal annually. The brick industry is growing as the demand for bricks increases in towns and villages with fast economic growth, urbanization, and prosperity. It is alarming to note that 300 mm of fertile topsoil in India will be consumed for burnt clay brick production in about 60 years. The objective of this paper is to raise awareness of the harm done by using red bricks as a construction material. In India, more than 65% of electricity demand is met by coal-based thermal power plants, and in generating this electricity a large amount of fly ash is produced as residue. If this fly ash residue is exposed to the open air, it is a big threat to our environment; fortunately, we have the option of turning this ash residue into a useful construction material by using bed waste, fly ash, and similar materials, and such bricks are known as fly ash bricks. With pollution rising day by day, the author has concluded that people need to be made aware of the harm of using red bricks. The survey found that there are many more red brick manufacturers than ash brick manufacturers in the district, operating near residential areas, which is not legally permitted; it is a strict order of the government that brick plants must be situated at least 50 km away from residential areas. Every day, tons of waste originate from different companies and factories, and it is fortunate that this waste material can be used in construction work that benefits both mankind and the environment. The main motive of this paper is to promote the use of ash bricks (eco-friendly bricks). On the other hand, our country needs approximately 250 billion bricks per year for all kinds of construction work, and to make 60 billion bricks, 185 million tons of topsoil are needed.
Ultimately, about 7,500 hectares of very fertile land are deliberately eroded every year to meet the demand for clay bricks for construction. This devastating practice is slowly killing our environment, and in the near future we will be left with no fertile land for agriculture; deforestation also occurs in the search for soil for clay brick manufacturing. This paper is based on a local survey conducted in the district of Durg (Chhattisgarh).
Web page optimization streamlines content to increase display speed. With average internet speeds increasing globally, it is appropriate for website administrators and webmasters to consider the time it takes for websites to render for the visitor. Fast performance is the key to a website's success: it increases profits, decreases costs, and improves customer satisfaction. Reorganization converts pages to display navigable content sooner and to defer or delay off-site content. Ranking functions are usually learned to rank search results based on features of individual documents, i.e., point-wise features. This work will increase website visibility and help users find the information they are actually looking for. Web services have emerged as a new Web-based technology paradigm for exchanging information on the Internet with high speed and quality. They have become a promising technology for designing and building complex inter-enterprise business applications.
Two water samples, inlet and outlet, were taken from a dyeing factory (input to the dyeing unit and output from the effluent treatment plant) in Sukkaliyur, Karur, Tamil Nadu, India. The water samples were analyzed according to standard water testing methods, and the results of both samples were compared with CPHEEO (Central Public Health and Environmental Engineering Organisation) standards. Both samples were found not potable with respect to the tested water parameters. The outlet of the dye effluent treatment unit was used for microbial decolorization. In this study, the bacterial strains Bacillus subtilis and Pseudomonas aeruginosa and the fungal strains Aspergillus flavus and Candida albicans were used for decolorization of azo dyes (which account for the majority of textile dyestuffs). The bacterial and fungal strains were isolated from the dye effluent itself. Decolorization of the textile dye effluent was analysed while varying parameters such as pH, temperature, and dye concentration, and the most suitable values of these parameters for maximum decolorization were reported. The variation of other water parameters, such as biological oxygen demand, chemical oxygen demand, and nitrogen content, was studied at the reported optimum levels of the basic parameters (pH, temperature, concentration). The results favour the future reliance of the textile industry on biodegradation of dye effluents. Among the fungal strains, Aspergillus flavus showed a maximum decolorization of 82%, and among the bacterial strains, Bacillus subtilis showed a maximum decolorization of 84%. This method was established to be the cheapest and most eco-friendly method of treating dye effluent.
Cloud computing is used to reduce users' expenses and the manpower they otherwise devote to maintaining servers that are mostly needed only on particular days, for important publishes or updates. The main objective of the system is to review document retrieval in the cloud. This proposed system covers the concept of the cloud and how document retrieval works within it. Document retrieval includes facilities such as securing the performance evaluation of the schemes. The documents to be retrieved, such as certificates, credit cards, and PAN cards, can be retrieved from the cloud. Resources are provided through outsourced data storage and computation. With a unique ID, one can access the services and retrieve data from the cloud. In this paper, we present a new idea to provide better document retrieval in the cloud. The system we are going to develop will be available to everyone who uses the cloud. Whenever users need their details, they can reach the server through a unique ID and access the details in the cloud.
Mobile ad hoc networks (MANETs) consist of a collection of wireless mobile nodes which dynamically exchange data among themselves without reliance on a fixed base station or a wired backbone network. MANET nodes are typically distinguished by their limited power, processing, and memory resources as well as a high degree of mobility. In mobile ad hoc networks there are several routing algorithms which utilize topology information to make routing decisions at each node. In geographic routing, nodes need to maintain up-to-date positions of their immediate neighbors for making effective forwarding decisions. Periodic broadcasting of beacon packets that contain the geographic location coordinates of the nodes is a popular method used by most geographic routing protocols to maintain neighbor positions. We contend and demonstrate that periodic beaconing, regardless of the node mobility and traffic patterns in the network, is not attractive from either the update cost or the routing performance point of view. We propose the Adaptive Position Update (APU) strategy for geographic routing, which dynamically adjusts the frequency of position updates based on the mobility dynamics of the nodes and the forwarding patterns in the network. APU is based on two simple principles: (i) nodes whose movements are harder to predict update their positions more frequently (and vice versa), and (ii) nodes closer to forwarding paths update their positions more frequently (and vice versa). Our theoretical analysis, which is validated by NS2 simulations of a well-known geographic routing protocol, Greedy Perimeter Stateless Routing (GPSR), shows that APU can significantly reduce the update cost and improve the routing performance in terms of packet delivery ratio and average end-to-end delay in comparison with periodic beaconing and other recently proposed updating schemes.
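The first APU principle, beacon only when neighbours' predictions of your position drift too far, can be sketched as a simple check. Neighbours extrapolate a node's position linearly from its last beacon (position plus velocity); the node itself runs the same extrapolation and transmits a new beacon only when its true position deviates beyond a threshold. The data layout and the 10 m threshold below are illustrative assumptions, not values from the paper.

```python
def needs_beacon(last_beacon, now_t, true_pos, threshold=10.0):
    """Mobility-prediction rule: beacon when the linear prediction errs.

    last_beacon = (t0, (x0, y0), (vx, vy)); positions in metres, time in s.
    """
    t0, (x0, y0), (vx, vy) = last_beacon
    dt = now_t - t0
    pred = (x0 + vx * dt, y0 + vy * dt)          # what neighbours assume
    err = ((true_pos[0] - pred[0]) ** 2 +
           (true_pos[1] - pred[1]) ** 2) ** 0.5  # actual drift from prediction
    return err > threshold
```

A node moving at a steady, predictable velocity thus sends almost no beacons, while one that turns or accelerates beacons as soon as the prediction error exceeds the threshold, which is exactly how APU ties update frequency to mobility dynamics.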
The benefits of APU are further confirmed by evaluations in realistic network scenarios, which account for localization error, realistic radio propagation, and sparse networks.
HBase offers the benefits of a distributed storage system in the open-source environment, and HDFS (Hadoop Distributed File System), on which it runs, is a highly fault-tolerant system that runs on commodity hardware, a feature unique among distributed systems. The HBase distributed store runs directly on top of HDFS as part of the Apache Software Foundation's Apache Hadoop project. HBase is one of the Hadoop applications that lets users perform random reads and writes over large data. HBase is modelled on Bigtable, which Google uses to manage huge amounts of structured data; HBase supports the Bigtable functionality and more. HBase is developed in the Java language and is similar to other NoSQL systems. Some column-oriented features of HBase are compression, in-memory operation, and Bloom filters, features that Hive in Hadoop is expected to take up in the future. In HBase, table input is obtained in a format that allows a MapReduce function to run over it, and the output of the function is written back to HBase tables. This paper includes an introduction to HBase, the installation of HBase, the architecture of HBase, RDBMS, and how HBase overcomes the problems of RDBMS (Relational Database Management Systems).
The main objective of this work is to determine the optimum mass flow rate at the fuel injector of a typical gas turbine engine. This paper presents the design of a combustion chamber followed by a three-dimensional simulation to investigate the velocity at the exit of the combustion chamber, which is the entry region of the turbine section. The fuel considered for the simulation is methane (CH4). CFD simulation has been performed using ANSYS CFX 14.5 software to analyze the flow pattern within the combustion chamber. A 3D combustion chamber is designed in CATIA V5 and then exported to ANSYS CFX 14.5.
An energy-efficient cooperative spectrum sensing (EE-CSS) protocol based on a trust and reputation management (TRM) unit is proposed. The protocol reduces the number of sensing reports exchanged between the secondary users and their base station. The TRM unit is introduced to mitigate malicious behaviour in the cognitive radio network (CRN) and to ensure that no secondary-user links in the network are disconnected. Experimental results show that the energy consumption of the proposed protocol is much lower than that of traditional spectrum sensing methods.
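One common way a TRM unit counters malicious reports is trust-weighted fusion: the base station weights each secondary user's report by its trust score, then nudges the scores based on agreement with the fused decision. The sketch below is a generic illustration under that assumption; the function name, the additive update rule, and the 0.1 learning rate are not from the paper.

```python
def fuse_and_update(reports, trust, learning_rate=0.1):
    """Illustrative trust-weighted fusion (assumed scheme, not the paper's
    exact TRM): decide channel occupancy by a trust-weighted vote over
    boolean reports, then raise the trust of users who agreed with the
    decision and lower the trust of those who disagreed."""
    weighted = sum(trust[u] * (1 if r else -1) for u, r in reports.items())
    decision = weighted >= 0
    for u, r in reports.items():
        if r == decision:
            trust[u] = min(1.0, trust[u] + learning_rate)
        else:
            trust[u] = max(0.0, trust[u] - learning_rate)
    return decision, trust
```

Over repeated sensing rounds, a persistently lying user's weight decays toward zero, so its reports stop influencing the fused decision.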
As the use of wireless sensor networks (WSNs) has grown enormously, the need for energy-efficient management has also risen, and clustering algorithms have been widely used to reduce energy consumption. This paper presents a new approach to clustering wireless sensor networks and determining cluster heads: an energy-balanced on-demand clustering algorithm based on LEACH-C. The algorithm adopts centralized cluster formation and distributed CH selection. Minimum-energy clustering is used to divide the network into clusters, while residual energy and total communication distance are considered as secondary criteria to select the optimal CH. Simulation results show that the proposed algorithm outperforms LEACH-C in network lifetime, stability period, and overall efficiency.
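The two secondary criteria the abstract names, residual energy and total communication distance, can be combined in a small selection sketch. The eligibility rule (at least average residual energy, as in LEACH-C) and the node-dictionary layout below are illustrative assumptions, not the paper's exact algorithm.

```python
import math

def select_cluster_head(nodes):
    """Illustrative CH selection: nodes are dicts with 'id', 'pos' (x, y)
    and 'energy'. Only nodes holding at least the cluster's average
    residual energy are eligible (LEACH-C style); among those, the node
    minimising the total distance to all cluster members is chosen."""
    avg_energy = sum(n["energy"] for n in nodes) / len(nodes)
    eligible = [n for n in nodes if n["energy"] >= avg_energy]

    def total_distance(candidate):
        # Sum of Euclidean distances from the candidate CH to every member.
        return sum(math.dist(candidate["pos"], other["pos"])
                   for other in nodes)

    return min(eligible, key=total_distance)
```

Filtering by energy first protects nearly depleted nodes from the CH role, while the distance tie-breaker keeps intra-cluster transmission cost low.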
Tidal energy plays a vital role in power production. This paper presents a new method to produce energy using recent innovations at lower cost. High-force waves hit a piston, which pushes a vertical plate. The plate is connected to a dynamo through a shaft to ease the motion. As the force of the waves increases, the dynamo rotates and power is generated. The power generated by the dynamo is DC and is stored in a battery. The stored power is then converted to AC by an inverter built around a 2N3055 transistor, which also acts as a power booster to amplify the generated power. Once converted to AC, the power can be used to drive a load.
In this paper, we present a reformative work on an assistive aid system for visually impaired people. The device acts as an identification system and text navigator capable of assisting or guiding people with vision loss, ranging from partially sighted to totally blind, by means of sound commands. It performs the sequential operations of color identification, currency denomination recognition, and indoor obstacle detection, as well as reading newspapers and books. The text navigator and identification systems are integrated on a single board, a Raspberry Pi (Broadcom BCM2836) running at 900 MHz, providing high speed and accuracy. The text navigator captures alphabetic and numeric characters through a camera module, converts the captured image to a text file using an OCR (Optical Character Recognition) engine, and then conveys the text information through a speaker. The identification system and text-to-speech converter (text navigator) are designed specifically for visually impaired (VI) people, so that they can use the device without having to ask others for help.
Motion tracking is a major issue in the security field, whether at borders, banks, offices, or institutions; security is always a prime concern. To maintain security we deploy security guards, but human errors are common because a guard cannot watch one place all the time. Hardware-sensor-based systems are very costly, typically last only a few years, and can cover only a single location. This paper proposes a software-based motion detection system. It deals with the concept of real-time motion tracking using cameras and is designed as a visitor identification system in which, when motion is detected, the MATLAB system reads out a predefined message.
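Camera-based motion detection of this kind is commonly built on frame differencing: compare consecutive grey-scale frames and flag motion when enough pixels change. The sketch below shows that idea in Python on plain 2-D lists; both thresholds are illustrative assumptions, not values from the paper.

```python
def detect_motion(prev_frame, curr_frame, pixel_thresh=25, count_thresh=5):
    """Frame-differencing sketch: frames are 2-D lists of grey values
    (0-255). Motion is reported when more than `count_thresh` pixels
    change their grey level by more than `pixel_thresh` between frames
    (both thresholds are assumed, tunable values)."""
    changed = sum(
        1
        for row_prev, row_curr in zip(prev_frame, curr_frame)
        for p, c in zip(row_prev, row_curr)
        if abs(p - c) > pixel_thresh
    )
    return changed > count_thresh
```

In the proposed system, a positive result from such a check would trigger playback of the predefined message.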
The objective of the proposed work is to access an element through remote authentication by hiding an encrypted biometric signal within a face image. Statistical analysis: The biometric input signal is encrypted using the Arnold transform algorithm and hidden in the cover image (a face) using the Qualified Significant Wavelet Tree (QSWT). The cover image is the image of a person; it is first compressed and transmitted over the wireless channel for remote authentication. At the receiver, the cover image and the encrypted signal are separately extracted using the inverse wavelet transform, and the biometric signal is decrypted with the inverse Arnold transform. Findings: The Arnold transform increases the Normalized Cross Correlation (NCC) value, improving the quality of the reconstructed image. The proposed work achieves lower error than existing methods by increasing the Peak Signal-to-Noise Ratio (PSNR) and minimizing the Mean Square Error (MSE), and it overcomes problems such as data loss, complexity, and limited accuracy of the biometric signal. Application: confidential, secure transmission of data in military areas.
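The Arnold transform used for the encryption step is a standard pixel-scrambling map on an N x N image: pixel (x, y) moves to ((x + y) mod N, (x + 2y) mod N), and the exact inverse map restores the image. This is a minimal sketch of that textbook form on plain 2-D lists; the paper's parameterisation (iteration count, image size) may differ.

```python
def arnold_scramble(image, iterations=1):
    """Arnold cat-map scrambling of an N x N image (list of lists):
    each pass sends pixel (x, y) to ((x + y) mod N, (x + 2y) mod N)."""
    n = len(image)
    for _ in range(iterations):
        out = [[0] * n for _ in range(n)]
        for x in range(n):
            for y in range(n):
                out[(x + y) % n][(x + 2 * y) % n] = image[x][y]
        image = out
    return image

def arnold_unscramble(image, iterations=1):
    """Inverse Arnold map: (x, y) -> ((2x - y) mod N, (y - x) mod N),
    applied the same number of iterations to recover the original."""
    n = len(image)
    for _ in range(iterations):
        out = [[0] * n for _ in range(n)]
        for x in range(n):
            for y in range(n):
                out[(2 * x - y) % n][(y - x) % n] = image[x][y]
        image = out
    return image
```

Because the map is a bijection on the pixel grid, no information is lost: scrambling merely decorrelates neighbouring pixels before the QSWT embedding, and the receiver undoes it exactly.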