Articles

  1. SECURITY ENHANCEMENT FOR ACCESS CONTROL SYSTEMS BY USING DYNAMIC AUTHENTICATION

    ¹M.P. Aarthi, ²S.M. Karpagavalli.

    Abstract

    Access card authentication is critical for many modern access control systems, which have been widely deployed in various government, commercial, and residential environments. However, because the access cards and access control clients exchange static identification information, it is very challenging to defend against breaches caused by lost, stolen, or duplicated access cards. Although advanced biometric authentication methods such as fingerprint and iris identification can further identify the user who is requesting authorization, they incur high system costs, and access privileges cannot be transferred among trusted users. In this work, we introduce dynamic authentication with sensory information for access control systems. By combining sensory information obtained from onboard sensors on the access cards with the original encoded identification information, we are able to effectively tackle problems such as access card loss, theft, and duplication. Our solution is backward-compatible with existing access control systems and significantly increases the key space for authentication. We theoretically demonstrate the potential key space increase for different sensors and empirically demonstrate that simple rotations can increase the key space by more than 1,000,000 times with an authentication accuracy of 90 percent. We performed extensive simulations under various environment settings and implemented our design on WISP to experimentally verify the system performance.
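
    As a rough illustration of how sensory readings can enlarge the key space, the sketch below combines a card's static ID with a quantized rotation gesture before hashing (hypothetical function names; a sketch of the idea, not the authors' implementation):

    ```python
    import hashlib

    def quantize_rotation(angles_deg, step=15):
        """Map raw rotation samples (degrees) to coarse bins so small
        sensor noise does not change the derived key."""
        return tuple(int(round(a / step)) for a in angles_deg)

    def dynamic_auth_key(card_id: bytes, angles_deg) -> bytes:
        """Derive a per-gesture key from the static card ID plus the
        quantized rotation sequence from the card's onboard sensor."""
        material = card_id + repr(quantize_rotation(angles_deg)).encode()
        return hashlib.sha256(material).digest()

    # The same card yields different keys for different gestures, so a
    # stolen or cloned card alone no longer suffices to authenticate.
    k1 = dynamic_auth_key(b"CARD-0001", [0, 30, 90])
    k2 = dynamic_auth_key(b"CARD-0001", [0, 45, 180])
    assert k1 != k2
    ```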

  2. OPTIMAL BINARY DATA FUSION FOR DISTRIBUTED DETECTION IN WIRELESS SENSOR NETWORKS

    ¹F. Jessy Nirmala, ²K. P. Porkodi.

    Abstract

    Wireless sensor networks (WSNs) are vulnerable to several types of attacks, including passive eavesdropping, jamming, compromising (capturing and reprogramming) of sensor nodes, and insertion of malicious nodes into the network. Widespread adoption of WSNs, particularly for mission-critical tasks, hinges on the development of strong protection mechanisms against such attacks. Due to the scarcity of resources, traditional wireless network security solutions are not viable for WSNs. The life span of a sensor node is usually determined by its energy supply, which is mostly expended on data processing and communication; therefore, security solutions that demand excessive processing, storage, or communication overhead are not practical. In particular, due to their high computational complexity, public key ciphers are not suitable for WSNs. An important application of WSNs involves decentralized detection, whereby the sensors send their measurements to an ally fusion center (AFC), which attempts to detect the state of nature using the data received from all the sensors. Due to the broadcast nature of the wireless medium, the sensors' data are prone to interception by unauthorized parties. We consider the problem of secure detection in wireless sensor networks operating over insecure links. It is assumed that an eavesdropping fusion center (EFC) attempts to intercept the transmissions of the sensors and to detect the state of nature. The sensor nodes quantize their observations using a multilevel quantizer. Before transmission to the AFC, the sensor nodes encrypt their data using a probabilistic encryption scheme, which randomly maps each sensor's data to another quantizer output level using a stochastic cipher matrix. The communication between the sensors and each fusion center is assumed to be over a parallel access channel with identical and independent branches, each branch being a discrete memoryless channel. We employ J-divergence as the performance criterion for both the AFC and the EFC. The optimal cipher matrices are obtained by maximizing the J-divergence for the AFC while ensuring that it is zero for the EFC. With the proposed method, as long as the EFC is not aware of the specific cipher matrix employed by each sensor, its detection performance will be very poor. The cost of this method is a small degradation in the detection performance of the AFC. The proposed scheme has no communication overhead and minimal processing requirements, making it suitable for sensors with limited resources. Numerical results showing the detection performance of the AFC and EFC verify the efficacy of the proposed method.
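
    A minimal sketch of the probabilistic encryption step, assuming a 4-level quantizer and illustrative cipher-matrix values (the paper derives the optimal matrix by maximizing the J-divergence for the AFC while forcing it to zero for the EFC):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Row-stochastic cipher matrix: P[l, m] = probability of transmitting
    # level m when the quantizer output is l (illustrative values only).
    P = np.array([[0.7, 0.1, 0.1, 0.1],
                  [0.1, 0.7, 0.1, 0.1],
                  [0.1, 0.1, 0.7, 0.1],
                  [0.1, 0.1, 0.1, 0.7]])
    assert np.allclose(P.sum(axis=1), 1.0)

    def encrypt_level(quantizer_output: int) -> int:
        """Randomly map the quantizer output to a transmitted level
        according to the matching row of the cipher matrix."""
        return int(rng.choice(len(P), p=P[quantizer_output]))

    # Without knowing P, the EFC cannot relate received levels to the
    # underlying observations; the AFC, knowing P, loses little.
    sent = [encrypt_level(2) for _ in range(5)]
    ```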

  3. EFFICIENT AND PRIVACY-AWARE DATA AGGREGATION IN MOBILE SENSING

    ¹R. Sangeetha, ²K. Mubarak Ali.

    Abstract

    The proliferation and ever-increasing capabilities of mobile devices such as smartphones give rise to a variety of mobile sensing applications. This paper studies how an untrusted aggregator in mobile sensing can periodically obtain desired statistics over the data contributed by multiple mobile users, without compromising the privacy of any user. Although there are some existing works in this area, they either require bidirectional communication between the aggregator and mobile users in every aggregation period, or have high computation overhead and cannot support large plaintext spaces. Also, they do not consider the Min aggregate, which is quite useful in mobile sensing. To address these problems, we propose an efficient protocol to obtain the Sum aggregate, which employs additive homomorphic encryption and a novel key management technique to support a large plaintext space. We also extend the sum aggregation protocol to obtain the Min aggregate of time-series data. To deal with dynamic joins and leaves of mobile users, we propose a scheme that utilizes redundancy in security to reduce the communication cost of each join and leave. Evaluations show that our protocols are orders of magnitude faster than existing solutions and have much lower communication overhead.
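
    One common way to realize an additively homomorphic Sum aggregate is mask-based encryption in which per-user masks cancel only in the aggregate; a minimal sketch under that assumption (not necessarily the paper's exact construction):

    ```python
    import secrets

    M = 2**32               # plaintext space for the Sum aggregate
    readings = [17, 4, 25]  # each user's private value for this period

    # Per-period masks: the aggregator holds the negative sum of all user
    # masks, so masks cancel only in the aggregate, never individually.
    masks = [secrets.randbelow(M) for _ in readings]
    aggregator_key = (-sum(masks)) % M

    ciphertexts = [(x + k) % M for x, k in zip(readings, masks)]

    # The untrusted aggregator learns the Sum but no individual value.
    total = (sum(ciphertexts) + aggregator_key) % M
    assert total == sum(readings)
    ```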

  4. TEXT MINING-BASED SIMILARITY MEASURE FOR COLLABORATIVE POLICY ADMINISTRATION

    ¹P. Vidhya Rohini, ²A. Sudha.

    Abstract

    Policy-based management is a very efficient method of protecting sensitive information. However, the over-claiming of privileges is common in emerging applications, including mobile applications and social network services, because the users involved in policy administration have little knowledge of policy-based management. Over-claimed privileges can be leveraged by malicious applications, leading to serious privacy leakage and financial loss. To solve this issue, this project proposes a novel policy administration mechanism, referred to as Collaborative Policy Administration (CPA), to simplify policy administration. In CPA, a policy administrator can refer to their own policies as well as other administrators' policies to build a new policy that maintains data security. This project formally defines CPA and proposes its enforcement framework. To obtain similar policies more effectively, which is the key step of CPA, a text mining-based similarity measure method is also presented.
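
    To illustrate the text-mining step, a TF-IDF cosine similarity between policy texts would let an administrator retrieve the most similar existing policies (a sketch with toy policies; the paper's exact similarity measure may differ):

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    policies = [
        "allow read access to contacts for the messaging app",
        "allow read access to location for the weather app",
        "deny write access to contacts for third-party apps",
    ]
    draft = "read access to contacts"

    vec = TfidfVectorizer()
    matrix = vec.fit_transform(policies + [draft])

    # Rank existing policies by cosine similarity to the draft policy.
    scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    ranked = sorted(zip(scores, policies), reverse=True)
    ```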

  5. CLOUD BASED MULTIFACTOR AUTHENTICATION FOR PERSONAL HEALTH RECORD

    ¹S.Roobini, ¹V.Shivaganeshan, ¹B.Swathi, ²Mrs.S.Vidya

    Abstract

    Cloud-based services are increasingly being adopted by healthcare organizations. The past year alone has seen a surge of attention concerning the potential of cloud computing, with many vendors set to start moving healthcare-related applications to cloud platforms. Healthcare clouds offer new possibilities, such as straightforward, ubiquitous access to medical data, and opportunities for new business models; however, they also bring new risks and raise challenges with respect to security and privacy. The traditional way to secure a PHR is to authenticate access to the data stored in the database, and various authentication techniques are used to make this more secure. In the existing system, two-factor authentication is performed based on an OTP that is randomly generated and sent to mobile phones as a Short Message Service (SMS) message. The proposed model describes a multifactor authentication scheme in which a color scheme, a graphical password, and a One Time Password (OTP) are combined.
    KEYWORDS: Personal Health Record, Cloud Computing, Multifactor Authentication, Color Scheme, One Time Password
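
    The OTP factor can be sketched with a standard HMAC-based generator in the spirit of RFC 4226 HOTP (the abstract does not specify the exact generator; the color-scheme and graphical-password factors would be separate checks):

    ```python
    import hmac, hashlib, struct

    def otp(secret: bytes, counter: int, digits: int = 6) -> str:
        """HOTP-style OTP: HMAC the counter, dynamically truncate, and
        keep the low-order decimal digits."""
        mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = mac[-1] & 0x0F
        code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10**digits).zfill(digits)

    # Server and phone share `secret`; the SMS carries otp(secret, n),
    # and the server accepts it exactly once for counter value n.
    assert len(otp(b"shared-secret", 42)) == 6
    ```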

  6. EFFICIENT TRANSITIVE REGION APPROACH FOR SHORTEST PATH DISCOVERY IN RDBMSs USING MINIMUM SPANNING TREE

    ¹M.Helen Freda, ²S.Prema M.E.

    Abstract

    This paper aims to find shortest paths with the help of an efficient transitive-region approach to graph search queries. We propose the Efficient Transitive Region (ETR) approach, a performance-oriented technique that utilizes shortest-path (SP) transitive regions to construct the shortest path and to improve it. Shortest path construction here is dominated by path overlapping. ETR constructs an auxiliary shortest path step by step, instead of considering a single step at a time, using shortest-path trees during traversal.
    KEYWORDS: ETR approach, KWS interface, shortest path, shortest-path trees.
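
    The ETR construction itself is not reproduced here; as a point of reference, the baseline single-source shortest-path search that a transitive-region method prunes can be sketched with Dijkstra's algorithm:

    ```python
    import heapq

    def dijkstra(graph, source):
        """graph: {node: [(neighbor, weight), ...]}. Returns shortest
        distances from source; a transitive-region method would prune
        this search using precomputed SP regions."""
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue
            for v, w in graph.get(u, []):
                if d + w < dist.get(v, float("inf")):
                    dist[v] = d + w
                    heapq.heappush(heap, (d + w, v))
        return dist

    g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
    assert dijkstra(g, "a")["c"] == 3
    ```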

  7. CONTROLLING PATTERN ATTACKS AGAINST ANOMALY BASED CLASSIFICATION

    ¹S. Kalpana, ²V. Saravanakumar.

    Abstract

    Integrity violations, availability violations, and privacy violations are the major intrusions in a network environment. An attack is an integrity violation if it allows the adversary to access the service or resource protected by the classifier; an availability violation if it denies legitimate users access to it; and a privacy violation if it allows the adversary to obtain confidential information from the classifier. Pattern classification techniques are extended to adversarial settings: secure pattern classifiers are designed to limit performance degradation under potential attacks. The secure pattern classifier framework is built with model selection and a training and testing data construction method, where a TR and TS construction algorithm is used to select data for the pattern classifier. The secure pattern classifier is further improved with an attack control mechanism to handle intruders during the testing process. Training patterns are hardened with simulated attack patterns, the pattern update process is monitored and controlled during testing, and the classifier utility rate is improved in the testing process.
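
    A minimal sketch of hardening training patterns with simulated attacks, assuming a simple bounded-perturbation attack model (hypothetical; the paper's TR/TS construction algorithm is not reproduced here):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_attacks(X_malicious, eps=0.3, copies=5):
        """Generate simulated attack patterns by perturbing known
        malicious samples inside an eps-ball, mimicking an adversary
        nudging samples toward the legitimate region."""
        return np.vstack([X_malicious + rng.uniform(-eps, eps, X_malicious.shape)
                          for _ in range(copies)])

    X_mal = rng.normal(2.0, 0.5, size=(20, 4))   # known attack samples
    X_aug = simulate_attacks(X_mal)
    # Retraining on X_mal plus X_aug (all labeled as attacks) hardens the
    # classifier against small evasive movements at test time.
    ```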

  8. MULTI PARTY DATA DISTRIBUTION AND RULE MINING WITH PRIVACY

    ¹K. K. Kavishree, ²A. N. Karthikeyan.

    Abstract

    Attribute behaviors are identified using rule mining techniques. The Fast Distributed Mining (FDM) algorithm is an unsecured distributed version of the Apriori algorithm, and the Kantarcioglu and Clifton protocol is used for secure mining of association rules in horizontally distributed databases. The Unifying lists of locally Frequent Itemsets, Kantarcioglu and Clifton (UniFI-KC) protocol is used for the rule mining process in a partitioned database environment. The UniFI-KC protocol is enhanced with two methods for stronger security: a secure threshold-function computation algorithm is used to compute the union of the private subsets held by each of the interacting players, and a set inclusion computation algorithm is used to test the inclusion of an element held by one player in a subset held by another. The distributed mining model is used to fetch attribute behavior in the partitioned database environment, and the subgroup discovery process is adapted to that environment. The system can be improved to support generalized association rule mining, and it is enhanced to control security leakage in the rule mining process.
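
    Ignoring the cryptographic layer, the FDM skeleton can be sketched as each site computing its locally frequent itemsets and the sites unioning the candidate sets, which is the union the UniFI-KC protocol performs securely (toy data, itemsets up to size two):

    ```python
    from itertools import combinations

    def locally_frequent(transactions, min_support):
        """Itemsets (up to pairs here) meeting the local support threshold."""
        counts = {}
        for t in transactions:
            for k in (1, 2):
                for items in combinations(sorted(t), k):
                    counts[items] = counts.get(items, 0) + 1
        need = min_support * len(transactions)
        return {i for i, c in counts.items() if c >= need}

    site1 = [{"a", "b"}, {"a", "c"}, {"a", "b"}]
    site2 = [{"b", "c"}, {"a", "b"}]
    # Candidate set = union of locally frequent itemsets across sites; the
    # secure-union step hides which site contributed which itemset.
    candidates = locally_frequent(site1, 0.5) | locally_frequent(site2, 0.5)
    ```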

  9. CLOUD DATA SHARING SERVICES WITH PUBLIC VERIFIER

    ¹M. Nithiya, ²A. N. Karthikeyan.

    Abstract

    The data auditing process is performed by a centralized verification point called a public verifier or Third Party Auditor (TPA). The "One Ring to Rule Them All" (Oruta) scheme is used for privacy-preserving public auditing. In Oruta, homomorphic authenticators are constructed using ring signatures, which compute the verification metadata needed to audit the correctness of shared data; the identity of the signer of each block in the shared data is kept private from public verifiers. A homomorphic authenticable ring signature (HARS) scheme is applied to provide identity privacy with blockless verification. A batch auditing mechanism makes it possible to perform multiple auditing tasks simultaneously, and Oruta is compatible with random masking to preserve data privacy from public verifiers. The dynamic data management process is handled with index hash tables. However, traceability is not supported by the Oruta scheme, the data dynamism sequence is not managed, and the system incurs high computational overhead. The multi-user data auditing scheme is therefore enhanced with data dynamism and batch auditing support, and traceability features are provided alongside identity privacy: the group manager or data owner can be allowed to reveal the identity of a signer based on the verification metadata. A data version management mechanism is integrated with the system.
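
    The dynamic-data side can be sketched with an index hash table that binds each block's signature to a stable virtual index and a version rather than to its physical position, so inserts and deletions do not force re-signing of the following blocks (hypothetical fields; a sketch of the idea, not Oruta's exact structure):

    ```python
    import secrets

    class IndexHashTable:
        """Each entry is (virtual_index, version): a block's signature
        binds to this pair rather than to its physical position, so
        inserting or deleting a block leaves other signatures valid."""

        def __init__(self):
            self.entries = []

        def append_block(self):
            self.entries.append((secrets.randbits(64), 1))

        def insert_block(self, pos):
            self.entries.insert(pos, (secrets.randbits(64), 1))

        def modify_block(self, pos):
            vid, ver = self.entries[pos]
            self.entries[pos] = (vid, ver + 1)  # only this block re-signs

    t = IndexHashTable()
    for _ in range(3):
        t.append_block()
    t.insert_block(1)   # later blocks keep their (vid, version) pairs
    t.modify_block(0)   # bumps block 0's version only
    ```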

  10. AN APPROACH TO CLUSTER DOCUMENTS FOR IMPROVING COMPUTER INSPECTION IN DIGITAL FORENSIC ANALYSIS

    ¹S.Padma Sudha, ²S.Prema.

    Abstract

    In a computer forensic examination, hundreds of thousands of files are generally inspected, typically in the interest of figuring out what happened, when and how it happened, and finally who was involved in the crime. This might be done to perform a root cause analysis of a computer system that failed or is not operating properly, to figure out who is responsible for misuse of computer systems, or perhaps to find out who committed a crime using a computer system or against a computer system. Much of the data in those files consists of text without formal organization or structure, and is therefore called unstructured text. The analysis of such text by computer examiners is difficult to perform, so automated examination procedures are of great interest. In particular, document clustering algorithms can support the discovery of useful new knowledge from the documents under examination. We propose an approach that applies document clustering algorithms to the forensic examination of computer systems seized in police investigations. We illustrate the proposed approach by carrying out experiments with K-ROSE (K-Rough Outlier Set Extraction) and hierarchical agglomerative approaches (single link, complete link, average link) applied to datasets obtained from computers seized in police investigations. Experiments were performed with distinct combinations of parameters. Our experiments show that the complete link and average link algorithms produce the best results for our application domain; if suitably initialized, the partitional K-ROSE algorithm also yields very good results. Lastly, we present and discuss the modules that help investigators in forensic computing.
    Keywords: clustering; forensic examination; keyword detection; document indexing; forensic datasets
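
    The agglomerative runs can be reproduced in outline with SciPy, using stand-in vectors in place of the document representations (a sketch; the paper's pipeline also covers indexing and parameter sweeps):

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)
    # Stand-in for feature vectors of documents from a seized machine.
    docs = np.vstack([rng.normal(0, 1, (10, 5)), rng.normal(4, 1, (10, 5))])

    for method in ("single", "complete", "average"):
        Z = linkage(docs, method=method)                 # agglomerative merge tree
        labels = fcluster(Z, t=2, criterion="maxclust")  # cut into 2 clusters
    ```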

  11. ROUGH OUTLIER AGENT BASED EFFICIENT QUERY SERVICES ON CLOUDS

    ¹E. Saranya, ²C. Gayathri.

    Abstract

    With the wide deployment of public cloud computing infrastructures, using clouds to host data query services has become an appealing solution for its advantages in scalability and cost saving. However, some data might be so sensitive that the data owner does not want to move it to the cloud unless data confidentiality and query privacy are guaranteed. On the other hand, a secured query service should still provide efficient query processing and significantly reduce the in-house workload to fully realize the benefits of cloud computing. The random space perturbation (RASP) data perturbation method provides secure and efficient range query and kNN query services for protected data in the cloud. RASP combines order-preserving encryption, dimensionality expansion, random noise injection, and random projection to provide strong resilience to attacks on the perturbed data and queries. It also preserves multidimensional ranges, which allows existing indexing techniques to be applied to speed up range query processing. The kNN-R algorithm is designed to work with the RASP range query algorithm to process kNN queries. The proposed system carefully analyzes attacks on data and queries under a precisely defined threat model and realistic security assumptions. Extensive experiments have been conducted to show the advantages of the approach in efficiency and security.
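
    The flavor of the RASP perturbation can be sketched as follows (simplified: the actual scheme also applies order-preserving encryption to selected dimensions and places precise conditions on the secret matrix A):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    d = 3                                   # data dimensionality
    A = rng.normal(size=(d + 2, d + 2))     # secret projection matrix
    while abs(np.linalg.det(A)) < 1e-6:     # keep A invertible
        A = rng.normal(size=(d + 2, d + 2))

    def rasp_perturb(x):
        """Extend the record with a fresh random noise component and a
        constant 1, then project it with the secret matrix A."""
        v = rng.uniform(0, 1)
        return A @ np.concatenate([x, [v, 1.0]])

    y = rasp_perturb(np.array([0.2, 0.5, 0.9]))
    # Range queries are rewritten into conditions on the perturbed space,
    # where multidimensional ranges stay bounded and hence indexable.
    ```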

  12. AN EFFICIENT CLIENT AUTHENTICATION MECHANISM FOR REMOTE SERVICE PROVIDERS

    ¹B. Yugadharini, ²S. Lalithambikai

    Abstract

    Images and captchas are integrated to build the Captcha as graphical passwords (CaRP) scheme, which handles online guessing attacks, relay attacks, and shoulder surfing attacks. CaRP passwords are click-based graphical passwords, where a sequence of clicks on an image is used to derive a password, and a fresh captcha challenge image is used for each login attempt. Both text captchas and image-recognition captchas are used in the CaRP scheme: a text CaRP scheme constructs the password by clicking the right character sequence on CaRP images. CaRP schemes can be classified into two categories, recognition-based CaRP and recognition-recall based CaRP. Recognition-based CaRP has access to a practically unlimited number of different visual objects. Recognition-recall based CaRP requires recognizing an image and using the recognized objects as cues to enter a password; it thus combines the tasks of both recognition and cued recall. Password information is transferred and verified using hash codes, and secure channels between clients and the authentication server are established through Transport Layer Security (TLS). The image-based passwords are constructed with a strength analysis mechanism, and pattern-based attacks are handled with color and spatial patterns: pixel colors at click points are considered in the color pattern analysis model, and pixel location patterns are considered in the spatial pattern analysis model.
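
    A minimal sketch of the verification path, assuming the clicked character sequence has already been recovered from the session's CaRP image: the client derives the password from the clicks, and only a salted hash crosses the TLS-protected channel:

    ```python
    import hashlib, secrets

    def derive_code(clicked_chars: str, salt: bytes) -> bytes:
        """Derive the verification code from the character sequence the
        user clicked on this session's CaRP image."""
        return hashlib.pbkdf2_hmac("sha256", clicked_chars.encode(),
                                   salt, 100_000)

    # Registration: the server stores (salt, code). At login, the freshly
    # derived code is compared; the image (and so the click positions)
    # changes every attempt, which frustrates replay and relay attacks.
    salt = secrets.token_bytes(16)
    stored = derive_code("K7Qp", salt)
    assert derive_code("K7Qp", salt) == stored
    ```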

  13. SECURING DATA DISTRIBUTION UNDER CLOUD ENVIRONMENT

    ¹S.R.Priyadharsani, ²K. Sudhakar M.E.

    Abstract

    Data categorization methods are used to assign class labels to transactional data values, but the resource requirements of the categorization process are very high. In a cloud environment, users' data are usually processed remotely on unknown machines that users do not own or operate, so users' control over data shared on remote machines is reduced. Anomalous and normal transactions are identified using classification techniques, with neural networks used for the classification process. The Back-Propagation Neural network (BPN) is an effective method for learning neural networks, using an input layer, a hidden layer, and an output layer. Shared data values are maintained by different parties to perform the data categorization process. A trusted authority (TA), the participating parties (data owners), and the cloud servers are the entities involved in the privacy-preserving mining process. The TA is responsible only for generating and issuing encryption/decryption keys for all the other parties. Each participating party is a data owner that uploads its encrypted data for the learning process, and the cloud servers carry out the learning computation: each participant first encrypts its private data with the system public key and then uploads the ciphertexts to the cloud, the cloud servers execute most of the operations in the learning process over the ciphertexts and return the encrypted results to the participants, and the participants jointly decrypt the results, with which they update their respective weights for the BPN network. The Boneh, Goh, and Nissim (BGN) doubly homomorphic encryption algorithm is used to secure the private data values. A data splitting mechanism protects the intermediate data during the learning process: a random sharing algorithm randomly splits the data without decrypting the actual values, and secure scalar product and addition operations are used in the encryption and decryption process. The privacy-preserving data categorization scheme is constructed without a trusted authority for key management, with key generation and issuing operations carried out in a distributed manner. The cloud server is enhanced to verify user and data level details, and the privacy-preserving BPN learning process is tuned with the cloud resource allocation process.
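
    The random data-splitting step can be sketched with additive sharing modulo a large prime (a simplified stand-in; in the actual scheme the shares are manipulated alongside BGN ciphertexts):

    ```python
    import secrets

    P = 2**61 - 1  # field modulus for the shares

    def split(value: int, parties: int = 3):
        """Split an intermediate value into random additive shares;
        any strict subset of shares reveals nothing about the value."""
        shares = [secrets.randbelow(P) for _ in range(parties - 1)]
        shares.append((value - sum(shares)) % P)
        return shares

    def combine(shares):
        return sum(shares) % P

    assert combine(split(123456)) == 123456
    ```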

  14. SECURED DEDUPLICATION MECHANISM UNDER CLOUD BACKUP SERVICES

    ¹V. Gandhimathi, ²M. Shanthamani.

    Abstract

    A cloud backup service provides offsite storage for users with disaster recovery support. Deduplication methods are used to control the high data redundancy in backup datasets. Data deduplication is a data compression approach applied in communication or storage environments; limited resource levels and I/O overhead must be considered in the deduplication process. Data redundancy is controlled using the Application-aware Local-Global source Deduplication (ALG-Dedupe) scheme: a file size filter separates out small files, and an application-aware chunking strategy is used in the intelligent chunker to break up the backup data streams. The deduplication scheme is enhanced with security features as the Security-ensured Application-aware Local-Global source Deduplication (SALG-Dedupe) scheme, in which an encrypted cloud storage model is used to secure personal data values. The deduplication scheme is also adapted to control data redundancy in the smartphone environment, and a file-level deduplication scheme is designed for the global deduplication process.
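
    Source-side deduplication can be sketched in a few lines: chunk the stream, fingerprint each chunk, and upload only chunks whose fingerprints are absent from the index (fixed-size chunking here; the paper's intelligent chunker is application-aware):

    ```python
    import hashlib

    index = set()  # fingerprints of chunks already stored in the cloud

    def dedupe(stream: bytes, chunk_size: int = 4096):
        """Yield only chunks whose fingerprints are new to the index."""
        for i in range(0, len(stream), chunk_size):
            chunk = stream[i:i + chunk_size]
            fp = hashlib.sha256(chunk).digest()
            if fp not in index:
                index.add(fp)
                yield fp, chunk   # upload; duplicates send nothing new

    data = b"A" * 8192 + b"B" * 4096   # two identical "A" chunks
    unique = list(dedupe(data))
    assert len(unique) == 2            # 2 uploads instead of 3
    ```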

  15. SEARCHING ON ENCRYPTED DOCUMENTS UNDER CLOUDS

    ¹S. Keerthiga, ²S. Savitha Karpagam.

    Abstract

    Sensitive cloud data have to be encrypted to protect data privacy before being outsourced to a commercial public cloud, but encryption makes effective data utilization a very challenging task. Traditional searchable encryption techniques allow users to securely search over encrypted data through keywords. The privacy-enabled data searching scheme provides a solution for secure ranked keyword search over encrypted cloud data. Ranked search greatly enhances system usability by returning results ranked by relevance instead of undifferentiated results, and it further ensures file retrieval accuracy. A statistical measure from information retrieval, the relevance score, is explored to build a secure searchable index, and a one-to-many order-preserving mapping technique is developed to properly protect the sensitive score information. The system facilitates server-side ranking without losing keyword privacy. The system is further improved to support relevance score dynamics and search result authentication, the one-to-many order-preserving mapping technique is enhanced to be reversible, and a similarity analysis scheme is used to identify query results in the cloud data storage.
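
    The one-to-many order-preserving mapping can be sketched by giving each plaintext score a disjoint interval and sampling a fresh random point from it, so order survives while equal scores encrypt to different values (a sketch assuming small integer scores):

    ```python
    import random

    BUCKET = 1000  # interval width per distinct integer relevance score

    def encode_score(score: int) -> int:
        """One-to-many mapping: score s goes to a random point of
        [s*BUCKET, (s+1)*BUCKET), preserving order for server-side
        ranking while flattening score frequencies."""
        return score * BUCKET + random.randrange(BUCKET)

    assert encode_score(3) < encode_score(7)   # ranking still works
    # Repeated encodings of the same score almost surely differ, which
    # frustrates frequency analysis by the cloud server.
    ```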

  16. DATA PRIVACY MANAGEMENT IN NEURAL NETWORK BASED PRIVATE DATA CLASSIFICATION USING CLOUD RESOURCES

    ¹M. T. Kiruthika, ²C. Selvi.

    Abstract

    Computational and storage resources are shared in the cloud environment through the Internet. In a cloud environment, users' data are usually processed remotely on unknown machines that users do not own or operate. Cloud computing enables highly scalable services to be consumed over the Internet. Neural network techniques are used for the classification process: collaborative Back-Propagation Neural Network (BPNN) learning is applied over arbitrarily partitioned data. A Trusted Authority (TA), the participating parties, and the cloud servers are involved in the privacy-preserving mining process. The TA is responsible only for generating and providing keys for all the other parties. Each participant first encrypts its private data and then uploads the ciphertexts to the cloud; the cloud servers execute most of the operations in the learning process over the ciphertexts. Secure scalar product and addition operations are used in the encryption and decryption process. The collaborative learning process is then handled without the Trusted Authority: key generation and issuing operations are carried out in a distributed manner, the cloud server is enhanced to verify user and data level details, and the privacy-preserving BPNN learning process is tuned with the cloud resource allocation process.
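
    The underlying BPNN update that the parties jointly compute can be sketched in the clear with one hidden layer and a squared-error loss (in the protocol, the matrix products below run over ciphertexts via secure scalar products and additions):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(8, 4))                  # private training inputs
    y = rng.integers(0, 2, size=(8, 1))          # labels
    W1 = rng.normal(scale=0.1, size=(4, 5))      # input -> hidden weights
    W2 = rng.normal(scale=0.1, size=(5, 1))      # hidden -> output weights

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for _ in range(100):                         # plain BPNN training loop
        h = sigmoid(X @ W1)                      # hidden activations
        out = sigmoid(h @ W2)                    # network output
        d_out = (out - y) * out * (1 - out)      # output-layer delta
        d_h = (d_out @ W2.T) * h * (1 - h)       # hidden-layer delta
        W2 -= 0.5 * (h.T @ d_out)                # these matrix products are
        W1 -= 0.5 * (X.T @ d_h)                  # what runs over ciphertexts
    ```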

  17. EMBEDDING DATA PROTECTED DISTANCE BASED MINING IN STEGANOGRAPHY

    ¹K. Nethya, ²S. Sadesh.

    Abstract

    Steganography is the art of inconspicuously hiding data within data. Steganography's goal in general is to hide data well enough that unintended recipients do not suspect the steganographic medium of containing hidden data. The software and links mentioned in this article are just a sample of the steganography tools currently available. As privacy concerns continue to develop along with the digital communication domain, steganography will undoubtedly play a growing role in society; for this reason, it is important that we be aware of digital steganography technology and its implications. Equally important are the ethical concerns of using steganography and steganalysis. Steganography enhances rather than replaces encryption: messages are not secure simply by virtue of being hidden. Likewise, steganography is not about keeping a message from being known, but about keeping its existence from being known. This work relates the areas of steganography, network protocols, and security for data hiding in communication networks employing TCP/IP.
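
    A classic example of TCP/IP data hiding, sketched as a toy covert channel that encodes message bytes into 16-bit IP Identification values (a well-known technique used here for illustration, not necessarily the paper's specific method):

    ```python
    import random

    def hide_in_ip_id(message: bytes):
        """Encode each message byte into the low byte of a 16-bit IP
        Identification value; the high byte is random-looking filler."""
        return [(random.randrange(256) << 8) | b for b in message]

    def recover(ip_ids):
        return bytes(i & 0xFF for i in ip_ids)

    ids = hide_in_ip_id(b"hi")
    assert recover(ids) == b"hi"
    # In a real channel these values would be written into the IP headers
    # of outgoing packets; only the encoding is illustrated here.
    ```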

  18. A NEURAL NETWORK BASED EFFICIENT RESOURCE UTILIZATION IN CLOUD

    ¹S. Sriemimapriya, ²S. N. Sangeethaa.

    Abstract

    Cloud computing is the long-dreamed vision of computing as a utility, where users can remotely store their data in the cloud so as to enjoy on-demand, high-quality applications and services from a shared pool of configurable computing resources. Collaborative cloud computing operates in a large-scale environment involving thousands or millions of resources across disparate, geographically distributed areas; it is also inherently dynamic, as entities may enter or leave the system at any time, with resources owned by different parties and partitioned in arbitrary ways rather than in a single fixed way. In this work, we improve resource utilization by proposing neural network training to find the optimal time period for allocating resources, and by calculating a load factor using dynamic priorities for the nodes onto which the virtual machines are scheduled. The scheduler assigns VMs to nodes according to their priority values, which vary dynamically based on their load factors; this dynamic priority concept leads to better utilization of the resources. The priority of a node is assigned depending upon its capacity and its load factor. The algorithm strikes the right balance between performance and power efficiency: once the highest and next-highest priority nodes have been identified, scheduling is very quick, and the algorithm prevents a particular node from being overloaded by taking the load factor into account.
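
    The dynamic-priority step can be sketched directly, with an illustrative priority formula that grows with capacity and falls with load (the paper additionally trains a neural network to choose allocation periods):

    ```python
    def priority(capacity: float, load: float) -> float:
        """Dynamic priority: favors high-capacity, lightly loaded nodes."""
        return capacity * (1.0 - load)

    # node -> (capacity, current load factor in [0, 1])
    nodes = {"n1": (8.0, 0.9), "n2": (4.0, 0.2), "n3": (8.0, 0.5)}

    def schedule_vm(nodes, overload=0.85):
        """Place the next VM on the highest-priority node that is not
        already overloaded; priorities change as load factors change."""
        eligible = {n: cl for n, cl in nodes.items() if cl[1] < overload}
        return max(eligible, key=lambda n: priority(*eligible[n]))

    assert schedule_vm(nodes) == "n3"   # n1 is overloaded, n3 beats n2
    ```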

  19. ATTRIBUTE DEPENDANT DATA LINKAGE SCHEME WITH CLUSTERING TREES

    ¹M. Ramya, ²Dr. V. K. Manavalasundaram.

    Abstract

    Data linkage is the task of identifying different entries or data items that refer to the same entity across different data sources, joining data sets that lack a common identifier (foreign key). Data linkage is divided into two types, one-to-one and one-to-many: a one-to-one data linkage model associates an entity from one data set with a single matching entity in another data set, while one-to-many data linkage associates an entity from the first data set with a group of matching entities from the other data set. A clustering tree is constructed in which each leaf contains a cluster, and each cluster is generalized by a set of rules (conditional probabilities) stored in the appropriate leaf. Clustering trees are used for data leakage prevention, recommender systems, and fraud detection. The one-to-many data linkage method is used to build links between entities of different natures. The one-class clustering tree (OCCT) characterizes the entities that should be linked together, and the tree is built so that it can be transformed into association rules. Splitting and pruning operations are used for inducing the OCCT, with structure identification and split-based attribute selection tasks used in inducing the clustering tree linkage model. Probabilistic models are built to represent the leaves of the tree, and data items are linked using the induced models. The linkage model is cross-validated with test sets to produce scores with matched probability values. The OCCT model is extended to handle many-to-many relationships between data items; positive (matching) and negative (non-matching) pairs are integrated in the training process; the system is enhanced to handle binary, categorical, and continuous attributes; and an accuracy-based splitting and pruning selection process is used.
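
    The leaf-level scoring can be sketched with a toy stand-in for the OCCT: one splitting attribute for the first data set, and conditional probabilities of the linked record's attribute stored per leaf:

    ```python
    from collections import Counter, defaultdict

    # Training pairs: (splitting attribute of entity A, attribute of its
    # linked entity B), e.g. customer tier vs. purchased item.
    pairs = [("gold", "laptop"), ("gold", "laptop"), ("gold", "phone"),
             ("basic", "phone"), ("basic", "phone")]

    # One "leaf" per split value, holding conditional counts of B's attribute.
    leaves = defaultdict(Counter)
    for a, b in pairs:
        leaves[a][b] += 1

    def match_score(a_attr, b_attr):
        """Estimated probability that the pair matches under the leaf model."""
        leaf = leaves[a_attr]
        return leaf[b_attr] / sum(leaf.values())

    assert match_score("gold", "laptop") > match_score("basic", "laptop")
    ```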

  20. AN EFFICIENT ITERATIVE FRAMEWORK FOR SEMI-SUPERVISED CLUSTERING BASED BATCH SEQUENTIAL ACTIVE LEARNING APPROACH

    ¹S. Savitha, ²M. Sakthi Meena.

    Abstract

    Semi-supervised clustering is a machine learning approach that improves clustering performance using supervision in the form of point-based and pairwise constraints. The selection of pairwise must-link and cannot-link constraints for semi-supervised clustering is resolved using an active learning method in an iterative manner. The system enhances the iterative framework with a naive batch sequential active learning approach to improve clustering performance. The iterative framework requires repeated clustering of the data with an incrementally growing constraint set; to address this, a batch approach is applied that selects a set of points to query in each iteration. In the iterative algorithm, the k selected instances are matched to the best candidates in the distribution, leading to an optimization problem that we term bounded coordinated matching. Leveraging the availability of highly effective sequential active learning methods improves performance in terms of label efficiency and accuracy with fewer iterations.
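
    One way to pick informative pairwise queries, sketched here for a single query rather than the paper's batch formulation: query the point whose assignment between the two nearest cluster centers is most ambiguous:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 2))                       # unlabeled points
    centers = X[rng.choice(len(X), 2, replace=False)]  # current clustering

    # Ambiguity of each point = gap between its two center distances.
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    margin = np.abs(d[:, 0] - d[:, 1])

    # Ask the oracle for a must-link/cannot-link constraint between the
    # most ambiguous point and a confidently assigned one, then re-cluster.
    query_pair = (int(np.argmin(margin)), int(np.argmax(margin)))
    ```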
