JAQM Volume 5, Issue 1 - March 30, 2010

Knowledge Dynamics

Contents

Semantic Web Service technology can play a vital role in today’s changing economic conditions, as it allows businesses to adapt quickly to market changes. By combining individual services into more complex systems, web service composition facilitates knowledge dynamics and knowledge sharing between business partners. The proposed framework uses a semi-automatic approach, as manual web service composition is both time-consuming and error-prone.

A collaborative system with a finite number of states is defined. A very large database is structured, and operations on large databases are identified. Repetitive procedures for collaborative system operations are derived, and the efficiency of these procedures is analyzed.

This paper aims to describe the interrelationship between the size of an organization and its adoption of information and communication technologies (ICT). We hypothesize that organizational size is interrelated with ICT usage. We analyzed data from 68 organizations, classified them into micro, small, medium-sized and large enterprises, and calculated a composite index of ICT adoption for each organization. We then analyzed the correlation between the composite index of ICT adoption and the size of the organization. Our results show that large enterprises, which have the potential to utilize ICT, have the highest values of the composite index of ICT adoption, indicating high ICT usage. Theory considered in the discussion implies that ICT diminishes the size of the organization, which is consistent with our findings, because medium-sized enterprises also maintain high values of the composite index of ICT adoption. Small organizations, at least in transitional countries, on average do not show a high level of ICT use; among the smallest, micro organizations, however, extreme examples of both high and low ICT use can be found, as indicated by high standard deviation values. This could be explained by the greater flexibility of small enterprises and their orientation toward new technologies, but also by a lack of resources or interest, and it implies that in small and micro companies ICT adoption depends more on other organizational factors than on size. We conclude that ICT has the potential to diminish company size, but that on average large and medium-sized companies remain the leaders in ICT use, despite extreme examples of good practice in small companies.

The operational environments in which information management systems operate determine the existence of complex situations. Consequently, the command and control flow can take different paths, which involve different “sets” of activities. Each of those activities is associated with a specific software application set, known as Application Software Tools (ASTs).
An operational profile represents a sequence of specific processing of distinct activities (from a functional point of view), based on specific Application Software Tools and bounded by a certain time interval. Each operational profile has an associated probability of occurrence.
Each activity is performed during a specified period of time, with specific sets of ASTs. The totality of AST specifications resulting from the set of operational profiles forms a mission-specific software application system, also known as a Mission Specific Tools Set (MSTS). Each MSTS element fulfills functions that meet the corresponding command and control activities, found in the form of lists of features in the system’s operational profile.
The aim of this paper is to present an original MSTS reliability model, which combines the modelling approach based on operational profiles with the Rome Research Laboratory software reliability modeling methodology. In this way, a dual representation of the application set’s reliability was realized, one that quantifies its level of reliability as well as the associated weight of each application. The final goal was to offer an adequate basis for the process of reliability growth.
This paper also provides a computational example of MSTS system reliability using a combat action (Time Critical Targeting) of a representative U.S. Navy C4ISR system. The case study demonstrates the validity and usefulness of the model for increasing the system’s reliability.

A similarity index is developed in this paper to measure the resemblance of the information contained in the websites of several management institutes in India. The data matrix pertaining to the information contents of the different websites is populated using indicator variables. A Pair Similarity Index (PSI), for non-mutually exclusive cases, is proposed that can measure the similarity between websites through pairs of observations. A comparison of the proposed similarity index with one such existing index is also presented.
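The abstract does not give the PSI formula. As a rough illustration of the indicator-variable setup, the sketch below computes a plain Jaccard coefficient between two hypothetical binary content vectors; this is the kind of classical index a proposed PSI would be compared against, not the authors’ formula.

```python
def similarity(a, b):
    """Jaccard-style coefficient for binary indicator vectors:
    features present in both sites / features present in either site."""
    both = sum(1 for u, v in zip(a, b) if u and v)
    either = sum(1 for u, v in zip(a, b) if u or v)
    return both / either if either else 1.0

# Hypothetical indicator rows (1 = information item present on the site)
site_x = [1, 1, 0, 1, 0, 1]
site_y = [1, 0, 0, 1, 1, 1]
print(similarity(site_x, site_y))  # → 0.6  (3 shared items / 5 present)
```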

ANalysis Of VAriance (ANOVA) is a method to decompose the total variation of the observations into sums of variations due to different factors and a residual component. When the data are nominal, the usual approach of considering the total variation in the response variable as a measure of dispersion about the mean is not well defined. Light and Margolin (1971) proposed CATegorical ANalysis Of VAriance (CATANOVA) to analyze categorical data. Onukogu (1985) extended the CATANOVA method to two-way classified nominal data. The components (sums of squares) are, however, not orthogonal. Singh (1996) developed a CATANOVA procedure that gives orthogonal sums of squares and defined test statistics and their asymptotic null distributions. In order to study which explanatory categories are influential factors for the response variable, we propose to apply Non-Symmetrical Correspondence Analysis (D'Ambra and Lauro, 1989) on significant components. Finally, we illustrate the analysis numerically, with a practical example.
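For background, the Light–Margolin variation decomposition for a nominal response can be sketched in the one-way case: total Gini variation splits into between-group and within-group sums of squares. The grouped responses below are invented for illustration.

```python
from collections import Counter

def catanova_oneway(groups):
    """One-way CATANOVA decomposition (after Light & Margolin, 1971):
    Gini-based total variation TSS split as TSS = BSS + WSS."""
    pooled = Counter()
    n, wss = 0, 0.0
    for g in groups:
        cnt = Counter(g)
        ni = len(g)
        n += ni
        pooled.update(cnt)
        # within-group variation for this group
        wss += ni / 2 - sum(c * c for c in cnt.values()) / (2 * ni)
    tss = n / 2 - sum(c * c for c in pooled.values()) / (2 * n)
    return tss, tss - wss, wss

# Illustrative categorical responses observed in two groups
tss, bss, wss = catanova_oneway([["a", "a", "b"], ["b", "b", "c"]])
```

The decomposition is exact by construction, so `tss` always equals `bss + wss`; the Light–Margolin C-statistic for testing group effects is then built from the ratio BSS/TSS.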

The paper proposes a network management system architecture based on a geographical information system that allows accurate description and inventorying of the infrastructure. The system contains several models that emulate real-life operational networks based on fiber optics, copper and WiFi technologies. The architecture is implemented in a network management system application, and a number of interesting performance and design problems encountered during the implementation are presented along with their solutions.

Coastal pollution is an issue of concern for Mauritius. Over the past two decades, agricultural activities have contributed to pesticide and fertilizer run-off in coastal waters. In recent years, major urban and industrial growth in Mauritius has also contributed to water pollution through the indirect discharge of contaminant-laden wastewater into rivers. As polluted water is hazardous to both marine life and human beings, it is of national interest to analyse the levels of water pollutants and the factors affecting these levels. This paper aims at developing a statistical model to evaluate the extent of coastal pollution in urbanized and agricultural regions of Mauritius. The study was carried out at two stations: St-Louis River (an urbanized industrial area) and Tamarin River (an agricultural area). A multifactor statistical model was formulated and analysed for the experimental data on chemical and physical variables collected at the two sites, for the estuary and downstream at each station, during 2001 and 2005, randomly spread over summer and winter seasons. The model is highly efficient in depicting the independent as well as the interactive effects of seasonality, time-interval, strategic locations and activities on the levels of different variables. A series of interesting conclusions were drawn from the analysis of the model. One major finding was that the seasonal factor and time-interval had a significant effect (p-values < 0.01) on the levels of chromium, lead and nitrates at both stations. However, the direction and magnitude differed with respect to each variable over the strategic locations. Moreover, a considerable interactive effect between various factors was detected for salinity. These conclusions, among others, raise concern.

Using time series data for the USA shadow economy (SE), we examine the relationship between the size of the unreported economy, estimated as a percentage of official GDP, and the unemployment rate (UR). Granger causality tests are conducted, with proper allowance for the non-stationarity of the data. The results indicate clear evidence of causality running from the unemployment rate to the shadow economy.
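A basic version of the Granger-causality regression test behind this abstract can be sketched as follows. The two series are simulated so that x drives y with a one-period lag (the data and all parameters are assumptions for illustration); in applied work, non-stationary series such as SE and UR would first be differenced, in line with the paper’s allowance for non-stationarity.

```python
import numpy as np

def granger_f(y, x, p=1):
    """F-statistic for H0: the p lags of x add no predictive power for y
    beyond y's own p lags (the standard Granger-causality regression test)."""
    T = len(y)
    Y = y[p:]
    # Lag matrices: column i holds the (i+1)-th lag of the series
    ylags = np.column_stack([y[p - i - 1:T - i - 1] for i in range(p)])
    xlags = np.column_stack([x[p - i - 1:T - i - 1] for i in range(p)])
    ones = np.ones((T - p, 1))
    rss = lambda A: np.sum((Y - A @ np.linalg.lstsq(A, Y, rcond=None)[0]) ** 2)
    rss_r = rss(np.hstack([ones, ylags]))         # restricted: own lags only
    rss_u = rss(np.hstack([ones, ylags, xlags]))  # unrestricted: plus x lags
    return ((rss_r - rss_u) / p) / (rss_u / (T - p - 2 * p - 1))

# Simulated system in which x Granger-causes y, but not vice versa
rng = np.random.default_rng(0)
T = 300
x = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

print(granger_f(y, x), granger_f(x, y))  # large F for x→y, small for y→x
```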

In recent years, physicists have been using tools from physics to study social phenomena, an area of study sometimes called sociophysics or econophysics. Most of this work has appeared in physics journals, and the present paper is an attempt to bring this type of work to a largely social science audience. The focus is on the application of a differential equation model, widely used in physics, to the study of the long-term trend in changes in the United States’ price level. This model is found to provide an excellent fit to the data, indicating that this trend is one of exponential growth.
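The simplest differential equation producing exponential growth is dP/dt = kP, whose solution is P(t) = P0·e^(kt); ln P is then linear in t, so the growth rate can be read off a linear regression. The sketch below fits a synthetic, noise-free price-level series (assumed numbers, not the paper’s data):

```python
import numpy as np

# dP/dt = k*P  =>  P(t) = P0 * exp(k*t), so ln(P) is linear in t and
# the growth rate k is the slope of a regression of ln(P) on t.
t = np.arange(50)
P = 10.0 * np.exp(0.03 * t)              # synthetic price-level series
k, lnP0 = np.polyfit(t, np.log(P), 1)    # slope = k, intercept = ln(P0)
print(round(k, 3), round(np.exp(lnP0), 1))  # → 0.03 10.0
```

With real, noisy data the same regression gives the least-squares estimate of the exponential trend, and the fit quality (e.g. R²) indicates how well the model describes the series.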

Recently, Romania put into practice its private pension system, which includes compulsory pensions and voluntary ones as alternatives to the statutory (public) pension scheme. According to Romanian law, private pension funds can invest a percentage ranging between 10% and 50% of total pension fund assets in securities issued on regulated and supervised markets in Romania, EU Member States, the European Economic Area, and third countries. This study should be viewed in the current global financial context, in which even the most powerful and stable financial markets are experiencing problems with the sudden drop in financial asset prices, low liquidity and a reduction in investment on the capital market. In the fundamental analysis they apply to securities, the administrators of these funds should consider the social component of their activity, since the public pension system is undercapitalized and has encountered many problems arising from the influence of economic, social, demographic and political factors.
In this article, we propose a score function for selecting listed stocks to be included in a portfolio. Using this method, we determine which of the previously analyzed companies performed best from the perspective of a risk-averse investor; the final goal is to identify the three best companies in the BET index in terms of return and risk. The weight of each indicator was derived using a Likert scale.
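The abstract does not state the score function itself. One common construction, sketched below under assumptions, is a weighted sum of normalized indicators with weights proportional to Likert-scale ratings; the indicator names, ratings and company values are all hypothetical.

```python
# Hypothetical Likert ratings (1-5) of each indicator's importance;
# weights are the ratings normalized to sum to 1.
likert = {"return": 5, "volatility": 4, "liquidity": 2}
total = sum(likert.values())
weights = {name: rating / total for name, rating in likert.items()}

# Indicator values per company, already scaled to [0, 1] with higher = better
# (so the risk indicator is inverted). All numbers are made up.
companies = {
    "A": {"return": 0.9, "volatility": 0.4, "liquidity": 0.7},
    "B": {"return": 0.6, "volatility": 0.8, "liquidity": 0.9},
    "C": {"return": 0.5, "volatility": 0.9, "liquidity": 0.3},
}

# Score = weighted sum of indicators; rank companies by score.
scores = {c: sum(weights[i] * vals[i] for i in weights)
          for c, vals in companies.items()}
best = max(scores, key=scores.get)
```

Ranking by such a score trades return against risk explicitly through the weights, which is why a risk-averse investor’s Likert ratings shift the ranking toward low-volatility stocks.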

Micro spectroscopy can be used for the early diagnosis of critical ailments. Early diagnosis is crucial for identifying the presence of a disease before it progresses beyond a critical stage. It has been shown that micro spectroscopy can be used as a non-invasive or limited-invasive approach for detecting different stages of cervical intraepithelial neoplasia. The unique spectral "fingerprint" that characterizes premalignant cells can be used to differentiate each stage from normal (healthy) cells. Diagnostic techniques that are not automatic, objective, sensitive and rapid are insufficient. We have developed SPECTRALYZER, a novel micro spectroscopic computational tool for automated complementary diagnostic analysis.

In this study, the authors propose a mathematical model for unmarried female migrant workers who have a number of close boyfriends and are therefore more vulnerable to STD and HIV transmission. The model fits the given data well, and the estimated proportion of female migrants having one close boyfriend was found to be the highest. The study is based on 362 unmarried female migrant workers under 30 years of age in urban Delhi, India, who aspired to lavish lifestyles and had a number of rich boyfriends.

In particular situations, clinical trials researchers may be interested in assessing trends at the level of individual subjects. This paper establishes a common approach and applies it in two different situations, one from nutritional medicine and one from cardiovascular medicine. The approach consists of running as many regression models as there are subjects, looking at the behavior of some parameter of interest over time. The regression parameters, particularly the slope of the regression line, convey the general sense of the trend and allow for testing its statistical significance. Extrapolation to the level of the entire sample is possible using some version of the binomial test. In both cases, significant results were obtained despite small sample sizes.
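The approach described, one regression per subject followed by a binomial (sign) test on the slopes at sample level, can be sketched as follows. The per-subject series are simulated with a small upward trend, and the number of subjects, time points and noise level are all assumptions for illustration.

```python
import math
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: 20 subjects, each measured at 6 time points,
# with a mild upward trend (slope 0.2) plus noise.
subjects = [0.2 * np.arange(6) + rng.normal(0, 0.3, 6) for _ in range(20)]

# One regression model per subject; keep the slope of each fitted line.
slopes = [np.polyfit(np.arange(6), y, 1)[0] for y in subjects]

# Sample-level binomial (sign) test: under H0 (no trend), a positive
# slope occurs with probability 1/2, so the one-sided p-value is the
# upper tail of Binomial(n, 1/2) at the observed count of positive slopes.
k, n = sum(s > 0 for s in slopes), len(slopes)
p_value = sum(math.comb(n, i) for i in range(k, n + 1)) / 2 ** n
```

Because the test only uses the sign of each slope, it stays valid even when the individual regressions are too short to be significant on their own, which is how small samples can still yield a significant sample-level result.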

Breast cancer is the most common form of cancer that affects women. It is a life-threatening disease and the most common malignancy in women throughout the world. In this study an effort has been made to determine the most likely risk factors of breast cancer and to select a parsimonious model of the incidence of breast cancer in women patients aged 50 years and above in the population of the North West Frontier Province (NWFP), Pakistan. The data were collected from a total of 331 women patients arriving at the Institute of Radiotherapy and Nuclear Medicine, Peshawar, NWFP, Pakistan.
A logistic regression model was estimated for breast cancer patients through a backward elimination procedure. Brown tests were applied to provide an initial model for the backward elimination procedure. The logistic regression model selected through backward elimination contains the factors Menopausal status (M), Reproductive status (R), and the joint effect of Diet and family History (D*H). We conclude that menopausal status, reproductive status and the joint effect of diet and family history were the important risk factors for breast cancer.
Separate models were then fitted for married and unmarried breast cancer patients. The best-selected model for married females includes the factors Feeding (F), R, M and (D*H), whereas the best-selected model for unmarried females has only one main factor, Menopausal status. We conclude that breast feeding, reproductive status, menopausal status and the joint effect of diet and family history were the important risk factors of breast cancer in married women, while menopausal status was the important risk factor in unmarried women.

Meta-analysis is the combination of results from various independent studies. In a meta-analysis combining survival data from different clinical trials, an important issue is the possible heterogeneity between trials. Such inter-trial variation can be explained not only by heterogeneity of treatment effects across trials but also by heterogeneity of their baseline risk. In addition, one might examine the relationship between the magnitude of the treatment effect and the underlying risk of the patients in the different trials. Moreover, the need for medical research and clinical practice to be based on the totality of relevant and sound evidence has been increasingly recognized. In this paper, we review advances in meta-analysis using clinical-trial TB data. The paper examines the reported results of sixteen randomized clinical trials conducted at a particular centre over consecutive periods. Each analysis pools the results from the relevant trials in order to evaluate the efficacy of a certain treatment between cases and controls. There is a need for empirical work comparing the random effects model with the fixed effects model in the calculation of a pooled relative risk in meta-analyses within systematic reviews of randomized controlled clinical trials. We review heterogeneity and random effects analyses and assess bias within and across studies. We compare the two approaches with regard to statistical significance, summary relative risk, and confidence intervals.
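As background for the fixed-versus-random-effects comparison, the following sketch pools log relative risks by inverse-variance weighting (fixed effect) and by the DerSimonian–Laird random-effects method. The 2×2 trial counts are invented for illustration and are not the paper’s TB data.

```python
import math

# Hypothetical trials: (events_treated, n_treated, events_control, n_control)
trials = [(12, 100, 20, 100), (8, 80, 15, 80), (45, 150, 33, 150)]

logs, weights = [], []
for a, n1, c, n2 in trials:
    log_rr = math.log((a / n1) / (c / n2))
    var = 1 / a - 1 / n1 + 1 / c - 1 / n2   # variance of the log relative risk
    logs.append(log_rr)
    weights.append(1 / var)

# Fixed effect: inverse-variance weighted mean of the log relative risks.
fixed = sum(w * l for w, l in zip(weights, logs)) / sum(weights)

# DerSimonian-Laird: estimate between-trial variance tau^2 from Cochran's Q,
# then re-weight with the inflated variances.
q = sum(w * (l - fixed) ** 2 for w, l in zip(weights, logs))
df = len(trials) - 1
c_const = sum(weights) - sum(w * w for w in weights) / sum(weights)
tau2 = max(0.0, (q - df) / c_const)
rand_w = [1 / (1 / w + tau2) for w in weights]
rand_mean = sum(w * l for w, l in zip(rand_w, logs)) / sum(rand_w)

pooled_rr_fixed = math.exp(fixed)
pooled_rr_random = math.exp(rand_mean)
```

When the trials disagree (as in this made-up example, where one trial favors the control arm), tau² is positive and the random-effects estimate gives relatively more weight to the smaller trials, so the two pooled relative risks diverge; with homogeneous trials tau² is zero and the two methods coincide.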