IJACSA Volume 5 Issue 12

Copyright Statement: This is an open access publication licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, even commercially, as long as the original work is properly cited.

Abstract: The simulation of large-scale networks is a challenging task, especially when the network to simulate is a Dynamic Multipoint Virtual Private Network (DMVPN), which requires expert knowledge to properly configure its component technologies. Studying these network architectures in a real environment is almost impossible because it requires a very large number of devices; the task is feasible, however, in a simulation environment such as OPNET Modeler, provided that one masters both the tool and the different DMVPN architectures.
Several research studies have been conducted to automate the generation and simulation of complex networks under various simulators, but to the best of our knowledge none has dealt with the DMVPN. In this paper we present a simulation model of the Dynamic Multipoint Virtual Private Network in OPNET Modeler, together with a Web-based tool for managing simulation projects for this type of network.

Abstract: This study aims to develop a social recommendation media GIS (Geographic Information System) specially tailored to recommending tourist spots. The conclusions of this study are summarized in the following three points. (1) A social media GIS, an information system which integrates a Web-GIS, an SNS and a recommendation system into a single system, was designed, implemented and operated in the central part of Yokohama City in Kanagawa Prefecture, Japan. The social media GIS uses a design which demonstrates its usefulness in reducing the constraints of information inspection, time and space, and continuity, making it possible to redesign the system in accordance with target cases. (2) The social media GIS was operated for two months for members of the general public aged 18 and over. The total number of users was 98, and the number of pieces of information submitted was 232. (3) Web questionnaires completed by users showed the usefulness of integrating the Web-GIS, SNS and recommendation system, because the reference and recommendation functions can be expected to support tourists' excursion behavior. Since an access survey of the log data showed that about 35% of accesses came from mobile information terminals, it can be said that preparing an interface optimized for such terminals was effective.

Abstract: As environmental models (such as the Accelerated Climate Model for Energy (ACME), the Parallel Reactive Flow and Transport Model (PFLOTRAN), and the Arctic Terrestrial Simulator (ATS)) become more and more complicated, we face enormous challenges in porting those applications onto hybrid computing architectures. OpenACC has emerged as a very promising technology; we have therefore conducted a feasibility analysis of porting the Community Land Model (CLM), a terrestrial ecosystem model within the Community Earth System Model (CESM). Specifically, we used an automatic function testing platform to extract a small computing kernel out of CLM, applied this kernel within the actual CLM dataflow procedure, and investigated the data parallelization strategy and the benefit of the data movement provided by the current implementation of OpenACC. Even though it is a non-intensive kernel, on a single 16-core computing node the performance (based on the actual computation time using one GPU) of the OpenACC implementation is 2.3 times faster than that of the OpenMP implementation using a single OpenMP thread, but 2.8 times slower than the OpenMP implementation using 16 threads. On multiple nodes, the MPI_OpenACC implementation demonstrated very good scalability on up to 128 GPUs across 128 computing nodes. This study also provides useful information for looking into the potential benefits of the "deep copy" capability and the "routine" feature of the OpenACC standard. We believe that our experience with the environmental model CLM can benefit many other scientific research programs interested in porting their large-scale scientific codes onto high-end computers powered by hybrid computing architectures using OpenACC.
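The two reported ratios imply the relative speed of the two OpenMP configurations; a back-of-the-envelope check (simple arithmetic on the abstract's numbers, not the paper's measurement code):

```python
# Implied speedups from the reported OpenACC/OpenMP timings.
openacc_vs_omp1 = 2.3    # OpenACC is 2.3x faster than 1-thread OpenMP
omp16_vs_openacc = 2.8   # 16-thread OpenMP is 2.8x faster than OpenACC

# Chaining the two ratios gives the implied 16-thread vs 1-thread speedup,
# i.e. roughly 6.4x, or about 40% parallel efficiency on 16 cores.
omp16_vs_omp1 = openacc_vs_omp1 * omp16_vs_openacc
print(f"Implied 16-thread vs 1-thread OpenMP speedup: {omp16_vs_omp1:.2f}x")
```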

Abstract: The focus of this study is the application of an intelligent agent to negotiation between buyer and seller in B2C commerce using big data analytics. The developed model is used to conduct negotiations on behalf of prospective buyers and sellers, using analytics to improve negotiations so that they meet practical requirements. The objective of this study is to explore the opportunities of using big data and business analytics for negotiation, where big data analytics can be used to create new bidding opportunities. Using big data analytics, sellers may learn to predict the buyers' negotiation strategy and therefore adopt optimal tactics to pursue results that are in their best interests. An experimental design is used to collect intelligent data that can be used in conducting the negotiation process. Such an approach will improve the quality of negotiation decisions for both parties.

Abstract: Many signal subspace-based approaches have been proposed for determining the fixed Direction of Arrival (DOA) of plane waves impinging on an array of sensors. Two neural-network-based procedures for DOA estimation are presented. First, Principal Component Analysis (PCA) is employed to extract the maximum eigenvalue and its eigenvector from the signal subspace to estimate the DOA. Second, Minor Component Analysis (MCA), a statistical method of extracting the eigenvector associated with the smallest eigenvalue of the covariance matrix, is applied. In this paper, we modify an MCA learning algorithm to enhance its convergence, which is essential for moving MCA algorithms toward practical applications. The learning rate parameter is also discussed, since it has a direct effect on the convergence of the weight vector and on the error level, and an appropriate choice ensures fast convergence of the algorithm. MCA is then performed to determine the estimated DOA. Simulation results are furnished to illustrate the theoretical results achieved.
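A minimal sketch of the MCA idea underlying the second procedure: gradient descent on the Rayleigh quotient drives a unit weight vector toward the eigenvector of the covariance matrix with the smallest eigenvalue. The toy matrix, learning rate and iteration count below are illustrative assumptions, not the paper's modified algorithm.

```python
# Minor component analysis by Rayleigh-quotient gradient descent:
# w <- w - eta * (C w - (w^T C w) w), renormalized each step.
def mca_minor_eigenvector(C, eta=0.1, iters=500):
    n = len(C)
    w = [1.0] * n
    norm = sum(x * x for x in w) ** 0.5
    w = [x / norm for x in w]
    for _ in range(iters):
        Cw = [sum(C[i][j] * w[j] for j in range(n)) for i in range(n)]
        lam = sum(w[i] * Cw[i] for i in range(n))       # Rayleigh quotient
        w = [w[i] - eta * (Cw[i] - lam * w[i]) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        w = [x / norm for x in w]                       # keep unit length
    return w

C = [[3.0, 0.0], [0.0, 1.0]]        # toy covariance, eigenvalues 3 and 1
w = mca_minor_eigenvector(C)
print(w)  # converges toward the minor eigenvector [0, 1] (up to sign)
```

The learning rate eta plays exactly the role highlighted in the abstract: too large and the iteration diverges, too small and convergence is slow.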

Abstract: E-government development in most European countries was financed from Structural Funds in the 2007-2014 period. In our paper we show how Hungary has used these funds to achieve efficiency and effectiveness in its public services. The main objective of our research has been to explore the budgetary and timing characteristics of public ICT spending, and to analyze the implicit and explicit objectives of eGovernment projects in Hungary. We applied exploratory text analysis as a novel and objective way to analyze the focus of eGovernment development policy. Our main findings are:
- After text analysis of 85 Electronic Public Administration Operational Programme (EPAOP) and 65 State Reform Operational Programme (SROP) projects, we found that the keyword statistics are generally consistent with the main policy-level objectives of the Operational Programmes; however, some fields are not emphasized, such as the role of participation, social partners and local government, and the need to improve user skills through public information campaigns.
- Governmental changes are clearly reflected in the goal hierarchy: contracting in EPAOP and SROP happened in two separate waves - the significant part of the financing was committed under stable governments at the beginning and end of the planning period, with a relatively passive period during the governmental change of 2010-2011.

Abstract: The artificial immune system is a new computational intelligence technique that has been investigated over the past decade. In reviewing the literature, two observations were made that could affect the network learning process. First, most researchers do not focus on paratope-epitope and paratope-idiotope interactions within the network. Second, most researchers depict the interactions within the network with all network components present from the beginning until the end of the learning process. In this research, efforts were devoted to addressing these observations. The findings make it possible to differentiate between the interactions at a node within a network and the total interactions in the network. A small simulation problem was used to show the effect of choosing a steady number of antibodies during network interactions. Results showed that a considerable number of interactions can be saved during network learning, leading to faster convergence. In conclusion, we believe that the designed model is ready to be used in many applications; we therefore recommend using our model in applications such as controlling robots in hazardous rescue environments to save human lives.

Abstract: One of the current challenges in the use of cloud services is the design of data management systems. This is especially important for hybrid systems in which data are located in both public and private clouds. Functions for monitoring, querying, scheduling and processing must be properly implemented and form an integral part of the system. To provide these functions, we propose using object-relational mapping (ORM). The article presents an approach to designing databases for information systems hosted in a hybrid cloud infrastructure, and provides an example of the development of an ORM library.
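A minimal sketch of the object-relational mapping idea, assuming Python's built-in sqlite3 as the backing store; the Record class, its schema, and the public/private location field are hypothetical illustrations, not the article's ORM library.

```python
# Tiny ORM sketch: one class mapped to one table, with save and query
# methods hiding the SQL from the caller.
import sqlite3

class Record:
    """Maps a Python object to a row in a hypothetical 'records' table."""
    def __init__(self, name, location):
        self.name = name
        self.location = location   # e.g. 'public' or 'private' cloud

    @staticmethod
    def create_table(conn):
        conn.execute("CREATE TABLE IF NOT EXISTS records "
                     "(name TEXT, location TEXT)")

    def save(self, conn):
        conn.execute("INSERT INTO records VALUES (?, ?)",
                     (self.name, self.location))

    @staticmethod
    def find_by_location(conn, location):
        rows = conn.execute("SELECT name, location FROM records "
                            "WHERE location = ?", (location,))
        return [Record(*row) for row in rows]

conn = sqlite3.connect(":memory:")
Record.create_table(conn)
Record("billing", "private").save(conn)
Record("web-cache", "public").save(conn)
print([r.name for r in Record.find_by_location(conn, "private")])
# prints ['billing']
```

In a hybrid deployment, the same mapping layer would dispatch queries to a public or private database connection depending on where the data resides.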

Abstract: Natural Language Processing (NLP) is an effective approach for bringing improvements to educational settings. Implementing NLP involves initiating the process of learning through natural acquisition in educational systems, and it offers effective approaches to solving various problems and issues in education. NLP provides solutions in a variety of fields associated with the social and cultural context of language learning, and is an effective aid for teachers, students, authors and educators in writing, analysis, and assessment procedures. NLP is widely integrated into a large number of educational contexts such as research, science, linguistics, e-learning and evaluation systems, and contributes positive outcomes in other educational settings such as schools, higher education systems, and universities. This paper aims to address the process of natural language learning and its implications in educational settings. The study also highlights how NLP can be utilized with scientific computer programs to enhance the process of education. The study follows a qualitative approach: data were collected from secondary resources in order to identify the problems faced by teachers and students in understanding context due to language obstacles. The results show the effectiveness of linguistic tools such as grammar, syntax, and textual patterns, which are fairly productive in educational contexts for learning and assessment.

Abstract: Benefits Management provides an established approach for decision making and value extraction for IT/IS investments and can be used to examine cloud computing investments. The motivation for developing an upper ontology for Benefits Management is that current Benefits Management approaches do not provide a framework for capturing and representing semantic information. There is also a need to capture the benefits of cloud computing developments, in order to provide existing and future users of cloud computing with better investment information for decision making. This paper describes the development of an upper ontology to capture greater levels of knowledge from stakeholders and IS professionals in cloud computing procurement and implementation. Complex relationships are established between cloud computing enablers, enabling changes, business changes, benefits and investment objectives.

Abstract: Classification is one of the most frequently encountered problems in data mining. A classification problem occurs when an object needs to be assigned to one of a number of predefined classes based on observed attributes related to that object.
Neural networks have emerged as one of the tools that can handle the classification problem. Feed-forward Neural Networks (FNNs) have been widely applied in many different fields as a classification tool.
Designing an efficient FNN structure with an optimum number of hidden layers and a minimum number of neurons per layer, given a specific application or dataset, is an open research problem.
In this paper, experimental work is carried out to determine an efficient FNN structure, that is, a structure with the minimum number of hidden-layer neurons for classifying the Wisconsin Breast Cancer Dataset. We achieve this by measuring the classification performance using the Mean Square Error (MSE) while controlling the number of hidden layers and the number of neurons in each layer.
The experimental results show that the number of hidden layers has a significant effect on the classification performance, and the best average classification performance is attained when the number of layers is 5 and the number of neurons per hidden layer is small, typically 1 or 2.
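The structure search described above can be sketched as a grid over (hidden layers, neurons per layer) configurations, keeping the one with the lowest MSE. Here evaluate_structure and the MSE values it returns are hypothetical placeholders standing in for actually training an FNN on the dataset; they are not the paper's measurements.

```python
# Sketch of the structure-search protocol: enumerate candidate FNN shapes
# and select the one minimizing MSE on held-out data.
import itertools

def evaluate_structure(n_layers, n_neurons):
    # Placeholder: a real implementation would train the FNN with this
    # shape and return its mean square error.  Values are illustrative.
    placeholder_mse = {(1, 1): 0.040, (1, 2): 0.035,
                       (5, 1): 0.010, (5, 2): 0.012}
    return placeholder_mse.get((n_layers, n_neurons), 0.050)

candidates = itertools.product([1, 2, 3, 4, 5], [1, 2, 4, 8])
best = min(candidates, key=lambda c: evaluate_structure(*c))
print(best)  # the (layers, neurons) pair with the lowest placeholder MSE
```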

Abstract: Aim: This study evaluates the spatial distribution of public obstetric care in the city of Belo Horizonte. It also correlates the primary care units (PCUs) with the immediate neonatal outcomes of a maternity school of Belo Horizonte, according to pregnancy risk and obstetric outcome. Method: Descriptive geographic-spatial research. This study analyzed a cohort of 2956 newborns who received care at birth in the maternity school, Hospital das Clinicas (HC) of the Federal University of Minas Gerais (UFMG), between January 2013 and July 2014. Gestational risk, the location of the primary care unit (PCU) where prenatal care was provided, and the immediate neonatal outcome were studied. The QGIS 2.4 open source software was used to generate thematic maps and analyses. Results: Among the 2083 births analyzed, 1154 (55.4%) were classified as high maternal risk and 634 (30.4%) had a poor neonatal outcome; there is also a concentration of women living in the northwest of the city who are officially referred to the maternity school for childbirth. In cases of high-risk pregnancy and perinatal complications, referrals also occur from practically all other regions of the city. Discussion: The integration of hospital clinical and administrative data with cartographic databases in this study made clear the patterns of referral for childbirth in the maternity school for high-risk pregnancies. Despite the limitations of a descriptive study, the analysis makes clear that the choice of place of childbirth goes beyond the matters set out in the government planning of emergency obstetric referral by sanitary districts.

Abstract: Protein fold recognition plays an important role in computational protein analysis since it can determine the function of a protein whose structure is unknown. In this paper, a Classified Sequential Pattern mining technique for Protein Fold recognition (CSPF) is proposed. The CSPF technique consists of two main phases: the sequential pattern mining phase and the fold recognition phase. In the sequential pattern mining phase, the Mix & Test algorithm is developed based on Grammatical Inference and is used as the training phase. The Mix & Test algorithm minimizes I/O costs with a single database scan, discovers subsequence combinations directly from sequences in memory without searching the whole sequence file, requires no database projection, handles gaps, and works with variable-length sequences without having to align them. In addition, a parallelized version of the Mix & Test algorithm is applied to speed up its performance. In the fold recognition phase, unknown protein folds are predicted via a proposed testing function. To test the performance, 36 SCOP protein folds are used; the accuracy rate is 75.84% on training data and 59.7% on testing data.
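The gap handling mentioned above can be illustrated by a gap-tolerant subsequence check: a pattern occurs in a sequence if its symbols appear in order, with arbitrary gaps between them. This toy helper is an illustration of that notion only, not the Mix & Test algorithm.

```python
# Gap-tolerant subsequence matching: each pattern symbol must appear in the
# sequence after the previous match, with any number of symbols in between.
def occurs_with_gaps(pattern, sequence):
    it = iter(sequence)
    # 'symbol in it' advances the iterator past the match, so matches
    # are forced to occur in order.
    return all(symbol in it for symbol in pattern)

print(occurs_with_gaps("ACD", "AXCXXD"))  # True: A..C..D appear in order
print(occurs_with_gaps("ACD", "ADXXC"))   # False: no D after the matched C
```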

Abstract: This paper is concerned with video-based facial expression recognition, frequently used in HRI (Human-Robot Interaction) to enable natural interaction between human and robot. For this purpose, we design a 3D-CNN (3D Convolutional Neural Network) augmented with dimensionality reduction methods such as PCA (Principal Component Analysis) and TMPCA (Tensor-based Multilinear Principal Component Analysis) to simultaneously recognize successive frames of facial expression images obtained through a video camera. The 3D-CNN can achieve some degree of shift and deformation invariance using local receptive fields and spatial subsampling through dimensionality reduction of the redundant CNN output. Experimental results on a video-based facial expression database reveal that the presented method performs well in comparison to conventional methods such as PCA and TMPCA.

Abstract: Social media have brought new opportunities, and also new challenges, for organizations. With them came the rise of a new context of action, largely influenced by the changing habits and the behavior of the consumer. The purpose of the following research is to analyze the views and strategies embraced by Azorean organizations, as well as the perceptions arising from the use of social media.
For this study, a quantitative, descriptive type of research was chosen, using an online survey. A total of 232 valid responses yielded a range of perceptions about the use of social media. The study hypotheses were tested using the Kruskal-Wallis analysis.
The results demonstrate that the majority of organizations involved in the study already use social media and that almost all of them use Facebook. The main reasons are to reach a wider audience, to increase visibility and to communicate with customers. The most relevant difficulty felt after joining social media is the lack of resources and availability. Marketing initiatives and content creation are the most common activities. Remarkably, more than half do not have a defined strategy, nor do they use measuring instruments to assess their presence. Nevertheless, they consider that social media enhance their performance.
Social media is a widely studied topic from the consumer’s point of view, but there is still little investigation from an organizational perspective. This work sought to contribute to the knowledge about the use and involvement of organizations in social media, especially in the peripheral context.
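The Kruskal-Wallis test used above to check the hypotheses can be computed directly; a self-contained sketch with toy groups (illustrative values, not the survey responses; the tie-correction factor is omitted for brevity):

```python
# Kruskal-Wallis H statistic: rank all observations jointly, then compare
# the per-group rank sums.  H = 12/(N(N+1)) * sum(R_i^2 / n_i) - 3(N+1).
def kruskal_wallis_h(*groups):
    pooled = sorted(x for g in groups for x in g)
    n = len(pooled)
    # Assign average ranks, so tied values share the same rank.
    ranks = {}
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2.0   # mean of ranks i+1 .. j
        i = j
    h = 0.0
    for g in groups:
        r_sum = sum(ranks[x] for x in g)
        h += r_sum ** 2 / len(g)
    return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)

# Two well-separated toy groups give a large H (here 27/7, about 3.86).
print(kruskal_wallis_h([1, 2, 3], [4, 5, 6]))
```

In practice H is then compared against a chi-squared distribution with (number of groups - 1) degrees of freedom to obtain a p-value, which is what a statistics package would report.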

Abstract: Despite the importance attached to the weights or strengths on the edges of a graph, a graph is only complete if it combines both nodes and edges. As such, this paper brings to the fore the fact that the node weight of a graph is also a critical factor to consider in any graph or network evaluation, rather than the link weight alone, as is commonly done. In fact, the weights on both the nodes and the edges, together with the number of ties, contribute effectively to the measure of centrality for an entire graph or network, thereby conveying more information. Two methods which take into consideration both the link weights and the node weights of graphs (the Weighted Marking method of location prediction and the Clique/Node-Weighted centrality measures) are considered, and the results from the case studies show that the Clique/Node-Weighted centrality measures are 18% more accurate than the Weighted Marking method in predicting the Distribution Centre location in Supply Chain Management.
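One simple way to combine node weights with link weights, given purely as an illustrative assumption (the cited Clique/Node-Weighted measures are more involved), is to scale each node's total incident edge weight by its node weight:

```python
# Toy node-and-edge-weighted centrality: node weight times the sum of the
# weights of its incident edges.  Node names and weights are hypothetical.
def node_weighted_centrality(node_weights, edges):
    """node_weights: dict node -> weight; edges: dict (u, v) -> link weight."""
    strength = {v: 0.0 for v in node_weights}
    for (u, v), w in edges.items():
        strength[u] += w
        strength[v] += w
    return {v: node_weights[v] * strength[v] for v in node_weights}

node_weights = {"A": 2.0, "B": 1.0, "C": 1.5}
edges = {("A", "B"): 3.0, ("B", "C"): 1.0}
print(node_weighted_centrality(node_weights, edges))
```

Node B touches more edges than A, but A's larger node weight lifts its score above B's, which is exactly the effect the abstract argues a link-weight-only measure would miss.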

Abstract: Privacy breaches and identity theft cases are increasing at an alarming rate, and Social Networking Sites (SNs) are making it worse. Facebook (FB), Twitter and other SNs offer attackers a wide and easily accessible platform. Privacy in the Kingdom of Saudi Arabia (KSA) is extremely important due to cultural beliefs, besides the other typical reasons. In this research we comprehensively cover privacy and identity theft in SNs from many aspects, such as methods of stealing information, contributing factors, ways stolen information is used, and examples. A study of the local community was also conducted. In the survey, participants were asked about privacy on SNs, SNs' privacy policies, and whether they think the benefits of SNs outweigh their risks. A social experiment was also conducted on FB and Twitter to show how fragile the systems are and how easy it is to gain access to private profiles. The survey results are alarming: 43% of all the accounts are public, 76% of participants do not read the policies, and almost 60% believe that the benefits of SNs outweigh their risks. Consistent with this, the results of the experiment show that it is extremely easy to obtain information from private accounts on FB and Twitter.

Abstract: Architecture reconstruction is a reverse engineering process in which we move from the code to the architecture level in order to reconstruct the architecture. Software architectures are the blueprints of projects, depicting an external overview of the software system. Maintenance and testing often cause the software to deviate from its original architecture: to enhance the functionality of a system, the software may deviate from its documented specifications, and new modules may be added without modifying the documented architecture, which creates issues when reconstructing the system. The closer the software is to its architecture, the easier it is to maintain and to keep the documentation current, so the conformance of the architecture with the product is checked by applying reverse engineering methods. Another reason for reconstructing the architecture arises with legacy systems, when they need modification or when an enhanced version of the system is to be developed. This paper reviews the methods and tools involved in reconstructing architecture and, by comparing them, suggests the best method for architecture reconstruction.

Abstract: Wireless sensor networks (WSNs) are nowadays considered a viable solution for medical applications. A ZigBee network model is well suited to the battery capacity, bandwidth, and computing limitations of WSNs. This paper presents an OPNET simulation of ZigBee network performance in order to compare routing results across three different topologies (star, mesh and tree).