May 2015

  1. Lydia S. Quadros and Antony Sylvan D’souza
    ABSTRACT:

    The ulnar nerve usually arises as a branch of the medial cord of the brachial plexus. In a study conducted on 20 upper limbs, a variation in its origin was observed in one right upper limb: the ulnar nerve had two roots of origin, one arising from the medial cord and another from the lateral cord. The rest of the nerve's course was as usual. In another, left, upper limb a communicating branch was observed between the musculocutaneous nerve and the median nerve. Such variations are important when planning surgery in the region of the axilla or arm, as these nerves are more liable to injury during operations.

    Pub. Date: May 31, 2015
    Paper No: 116
  2. Dr. Sundar, C.
    ABSTRACT:

    New mutual fund products have become a major component of competitive strategy for meeting investors' objectives and improving their profitability. Several mutual fund products are already available, and asset management companies are expected to customize new products based on investors' expectations and objectives. Asset management companies focus on investors' needs, framed around their objectives, through new product development. Research analysts and investment advisors influence new mutual fund product creation by creating a supportive environment and removing obstacles to the product's objectives. Internal resources such as research analysts, portfolio managers, investment advisors, and investment planning consultants all play a role in new mutual fund product development activities. Ensuring the right skills are available at the right time is essential for sustaining a tradition of new mutual fund product development.

    Pub. Date: May 31, 2015
    Paper No: 169
  3. Dr. Sreenivasa Ravi, K., Abhiram, D.V., Santosh, K.S.S.K. and Bharathi Devi, A.
    ABSTRACT:

    Context-aware web design is not a new technology; it has been in use for many years. However, the algorithms designed for this purpose are typically written in JSP or similar server-side programming languages, which involves bringing large amounts of transactional data into a single computing unit (usually one of very high computing capacity) and running the necessary analytics there. In this paper we implement a similar website that changes dynamically according to the user's latest interests, but with the web content to be displayed computed by Hadoop instead of traditional servlet programs. Hadoop takes the computation to the data instead of moving the data to the computing source, which reduces maintenance cost to roughly one tenth of the present value; processing time also grows linearly with the input data, unlike traditional processing, where time grows exponentially with the data. Although the underlying idea is already in use, this work demonstrates Hadoop's ability to adapt to traditional database structures and run efficiently enough to produce effective results.
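    The map/reduce computation the abstract alludes to can be sketched locally. On a cluster this would run as a Hadoop job (for example via Hadoop Streaming); the click-log format, user IDs, and category names below are illustrative assumptions, not details from the paper.

```python
# Local sketch of the map and reduce phases: count a user's page-category
# hits from click logs to pick the "latest interests" driving page content.
from itertools import groupby
from operator import itemgetter

clicks = [                      # (user_id, page_category) click-log records
    ("u1", "sports"), ("u1", "tech"), ("u1", "sports"),
    ("u2", "music"),  ("u1", "sports"),
]

def mapper(record):
    user, category = record
    yield ((user, category), 1)          # emit key-value pairs, Hadoop-style

def reducer(key, values):
    return key, sum(values)              # aggregate counts per (user, category)

# Shuffle/sort phase: group mapper output by key, as the framework would.
mapped = sorted(kv for rec in clicks for kv in mapper(rec))
counts = dict(
    reducer(key, [v for _, v in group])
    for key, group in groupby(mapped, key=itemgetter(0))
)

# Pick each user's top category to drive the dynamic page content.
top_for_u1 = max((c for (u, c) in counts if u == "u1"),
                 key=lambda c: counts[("u1", c)])
print(top_for_u1)
```

    Because the mapper emits independent key-value pairs, the same logic scales out across cluster nodes, which is what lets the processing cost grow linearly with the input data.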

    Pub. Date: May 31, 2015
    Paper No: 181
  4. Abdul Ahad, Ravali, M. and Durga Bhavani, S.
    ABSTRACT:

    Big Data consists of large-volume, complex, growing data sets with numerous, independent sources. With the fast development of networking, data storage, and data-gathering capacity, Big Data is now rapidly expanding in all science and engineering domains, including the animal, genetic, and biomedical sciences. This paper details a HACE theorem that characterizes the features of the Big Data revolution, and proposes a Big Data processing model from the data mining perspective. This data-oriented model covers demand-driven aggregation of information sources, mining and analysis, user data modeling, and security and privacy problems. We examine the difficult issues both in the data-oriented model and in the Big Data revolution more broadly.

    Pub. Date: May 31, 2015
    Paper No: 196
  5. Dr. Ramesh R. Manza and Yogini B. Patil
    ABSTRACT:

    To fulfill software requirements and satisfy the customer, software must be tested from the perspectives of both users and developers. The question, however, is when to stop testing, or whether the software has been tested enough, before it is handed over to the customer. If we want to deliver a quality software product, we must calculate software reliability, as it is one measure of quality: software reliability is the probability of failure-free operation of the software. Testers usually focus on testing the functional or normal behavior of software units in the developer's environment, whereas software reliability focuses on the operational behavior of the software in its natural setting, in the presence of customers and users. The software community needs to combine the functional and operational approaches so that testing is done effectively and reliability is measured accurately. This paper highlights how software reliability can be measured using the defect density metric and statistical modeling.
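    The two measurements the abstract names can be sketched in a few lines. The figures below are made-up sample data (not from the paper), and the exponential reliability model is one common textbook choice of statistical model, assumed here for illustration.

```python
# Illustrative computation of defect density and a basic reliability estimate.
import math

defects_found = 46          # defects detected during testing (assumed sample data)
size_loc = 23_000           # program size in lines of code (assumed sample data)

kloc = size_loc / 1000
defect_density = defects_found / kloc   # defects per thousand lines of code
print(f"Defect density: {defect_density:.2f} defects/KLOC")

# One simple statistical model: assume failures arrive as a Poisson process
# with constant rate lam (failures per hour), so the probability of
# failure-free operation over mission time t is R(t) = exp(-lam * t).
lam = 0.002                 # estimated failure rate per hour (assumed)
t = 100                     # mission time in hours
reliability = math.exp(-lam * t)
print(f"R({t}h) = {reliability:.3f}")
```

    In practice the failure rate would be estimated from operational test data rather than assumed, which is exactly why the abstract argues for testing in the software's natural setting.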

    Pub. Date: May 31, 2015
    Paper No: 204
  6. Bulli Babu, R., Phani Deepthi, K., Preetham Kumar, J. and Kusuma, P.
    ABSTRACT:

    This paper investigates a framework for search-based face annotation by mining weakly labeled facial images that are freely available on the World Wide Web (WWW). One challenging problem for a search-based face annotation scheme is how to effectively perform annotation by exploiting the list of most similar facial images and their weak labels, which are often noisy and incomplete. To tackle this problem, we propose an effective unsupervised label refinement (ULR) approach for refining the labels of web facial images using machine learning techniques. We formulate the learning problem as a convex optimization and develop efficient optimization algorithms to solve the large-scale learning task. To further speed up the proposed scheme, we also propose a clustering-based approximation algorithm, which can improve scalability considerably.

    Pub. Date: May 31, 2015
    Paper No: 208
  7. Eduardo Henrique Pirolla, Alexandre Leme Godoy dos Santos, Fernanda Junqueira Cesar Pirola, Felipe Piccarone Gonçalves Ribeiro and Felipe Fregni
    ABSTRACT:

    Background: The quality of clinical research can be improved, and its results made more valuable, by exploiting the advantages of the randomized controlled trial. This method has become widely used in research given the view that it is the only valid method that can ensure sound results when comparing treatments. In some areas of knowledge, however, relying only on randomized controlled trial methods can present obstacles, and such studies must be approached with other tools to avoid doubtful bias and outcomes. Our aims are to review the real advantages of randomized controlled trials in assessing surgical interventions, to discuss the methodological challenges in the design and conduct of these surgical studies, and to propose and outline possible solutions and options for them. Discussion: In many instances, while planning a randomized controlled trial, ethical questions surrounding the trial are encountered. In most cases, the theoretical advantages of randomized controlled trials over other study designs do not translate into visible superiority, for example in experimental studies that compare estimated side effects of certain treatments. In these cases, the superiority of the randomized controlled trial as a method should not be regarded as axiomatic. Summary: In our study, we present the methodological tools available for research, aiming to teach and educate new researchers and to assist experienced researchers conducting studies in the area of surgery. This can bring greater reliability to studies in surgery.

    Pub. Date: May 31, 2015
    Paper No: 216
  8. Seelam Lakshmi Sravani
    ABSTRACT:

    The main objective of our survey is elliptic curve cryptography (ECC), which provides security for the encryption and decryption of data. We have reviewed several papers, each describing particular protocols; some of the papers relate to text-based schemes or wireless communication, or use Java as the implementation tool. Each of these methods has its own protocols. One is based on customizable cryptography and produces hardware designs for ECC. Encryption is the process of encoding messages or information in such a way that only authorized parties can read it. Decryption is the process of decoding data that has been encrypted into a secret format; it requires a secret key or password.
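    The group arithmetic underlying ECC can be illustrated with a minimal sketch. The curve, generator, and private scalars below are textbook toy values over a 17-element field chosen only so the numbers are readable; real ECC uses standardized curves over primes of roughly 256 bits (e.g. secp256r1 or Curve25519) and hardened library implementations, never hand-rolled code like this.

```python
# Toy elliptic-curve arithmetic over a small prime field, for illustration only.
# Curve: y^2 = x^3 + 2x + 2 (mod 17), a common textbook example.
P, A, B = 17, 2, 2          # field prime and curve coefficients (toy values)
O = None                    # point at infinity (the group identity)

def add(p, q):
    """Add two points on the curve using the chord-and-tangent rule."""
    if p is O: return q
    if q is O: return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P == 0:
        return O            # p + (-p) = identity
    if p == q:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P   # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def mul(k, p):
    """Scalar multiplication k*p by double-and-add."""
    result = O
    while k:
        if k & 1:
            result = add(result, p)
        p = add(p, p)
        k >>= 1
    return result

G = (5, 1)                  # generator; the group on this curve has order 19

# Diffie-Hellman-style key agreement: each side keeps its scalar secret,
# publishes its public point, and both derive the same shared point.
alice_priv, bob_priv = 7, 11
alice_pub = mul(alice_priv, G)
bob_pub = mul(bob_priv, G)
shared_a = mul(alice_priv, bob_pub)
shared_b = mul(bob_priv, alice_pub)
print(shared_a == shared_b)
```

    Security rests on the difficulty of recovering the secret scalar from a public point (the elliptic curve discrete logarithm problem), which is trivial at this toy size but infeasible at standard key sizes.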

    Pub. Date: May 31, 2015
    Paper No: 220
  9. Pednekar, S. N., Desai, S. L., Sae Pol, Kagal, A. S., Dharmashale, S. N. and Bharadwaj, R. S.
    ABSTRACT:

    Background: Extrapulmonary tuberculosis (EPTB) often poses a diagnostic dilemma, in contrast to pulmonary tuberculosis, which can be easily diagnosed by simple microscopy. The paucibacillary nature of specimens giving negative smears for acid-fast bacilli, lack of granulomas on histopathology, and failure to culture Mycobacterium tuberculosis do not exclude the diagnosis of EPTB. To overcome these limitations, novel diagnostic methods based on nucleic acid amplification, such as the Polymerase Chain Reaction (PCR), have been reported with good sensitivity and rapidity for the diagnosis of EPTB. PCR could have a significant advantage over conventional methods for early diagnosis of clinically suspected cases of EPTB. Objectives: 1) To evaluate the role of PCR using the species-specific MPB64 primer in the early diagnosis of extrapulmonary tuberculosis. 2) To compare the results of PCR versus microscopy and culture. Material and Methods: A total of 100 clinical specimens comprising pleural fluid, cerebrospinal fluid, ascitic fluid, fine needle aspiration biopsy, pus, and biopsy from clinically suspected EPTB cases were included in the present study. These specimens were processed by conventional diagnostic methods, i.e. microscopy by Ziehl-Neelsen (ZN) stain and culture on Lowenstein-Jensen (LJ) medium. PCR was performed using the species-specific MPB64 primer. Results: In the present study, tuberculous pleural effusion (39%) followed by tubercular meningitis (31%) were the commonest clinical presentations of EPTB. The overall positivity of PCR was 53% in patients with EPTB; microscopy and culture could detect only 12% of these. On histopathological examination, 100% positivity by PCR was seen in tissue samples suggestive of tuberculosis. Of the 77 EPTB patients who responded to antituberculosis treatment (ATT), 53 were PCR positive. Comparing the results of PCR vis-à-vis the conventional techniques, using response to treatment as the gold standard, the sensitivity and specificity of PCR were found to be 68.8% and 100%, respectively. Conclusion: This study shows that PCR can serve as a useful complement to conventional diagnostic methods in the rapid diagnosis of EPTB.

    Pub. Date: May 31, 2015
    Paper No: 227