Cloud usage has already become part of users’ everyday habits. Combined with the wide adoption of mobile phones, mobile clouds appear to be the next step in mobile business. In our paper we analyze mobile cloud service requests with respect to their relevance for m-commerce and the economy at large. We describe potential architecture solutions which include the use of UICCs (Universal Integrated Circuit Cards). Finally, we illustrate the information flow of such a system with the example of mobile e-tickets and assess such architectures with respect to security, privacy and trust.

Cloud service providers (CSPs) and cloud customers (CCs) are exposed not only to existing security risks but also to new risks introduced by clouds, such as multi-tenancy, virtualization and data outsourcing. Several international and industry standards target information security; we give an overview of these standards and evaluate how completely they cover cloud computing security challenges. As a result, we propose an extension to the ISO 27001:2005 standard that adds a control objective on virtualization applicable to cloud systems. We also define a new quantitative metric and evaluate the importance of existing ISO 27001:2005 control objectives depending on whether customer services are hosted on-premises or in the cloud. Our conclusion is that obtaining the ISO 27001:2005 certificate is not sufficient for CSP and CC information security systems, especially given the business continuity risks that cloud computing introduces, and we propose new solutions that mitigate those risks.

The efficiency of Intrusion Detection Systems (IDSs) depends on their configuration. This configuration is currently vendor-specific, so operations can become complex when multiple systems are used. I briefly discuss why current management protocols are not adequate for managing Intrusion Detection Systems and what is needed to manage them adequately.
Based on the functional intrusion detection model of the IETF, the integration approach is outlined, and the requirements for the Intrusion Detection Parameterization Exchange Format and its structure are briefly described. Analogous to the Intrusion Detection Message Exchange Format, a format for standardized parameterization was designed.
The separation of baseline configuration from parameterization for individual integration is illustrated using Snort. The usability and functionality of this approach were demonstrated by integrating the network-based IDS Snort and the host-based IDS Samhain under one parameterization web frontend.
This approach provides administrators with a consistent administration frontend for all integrated IDSs. The security level is approved by one central administration entity for the complete IDS solution, independent of the IDS manufacturers. Updates and parameter modifications can be made from this central point. It is no longer imperative to allow analyzers to connect to the Internet or to the central operations LAN to update themselves or to send notifications.
Figure 1. Attack Sophistication vs. Intruder Technical Knowledge [3]
The IDS manager is independent of the rest of the IDS. IDSs from different vendors and with different analysis levels can be managed with one administration interface.

One of the primary components of computer systems security is the preservation of data integrity. In addition to violation prevention, this also includes methods used to detect such violations after they occur. The most common methods for preserving the integrity of binary data are based on various hash functions. However, an inherent downside of using hash functions is that a single hash can only verify the integrity of a single data string as a whole, making it impossible to locate the exact position within the string where a change occurred. An alternative method entails splitting the data string into blocks, each protected by its own hash. While this enables more precise localization of changes, storing a potentially large number of hashes imposes significant space overhead for large data sets.
In this article, we present a space-efficient method for the detection and localization of unwanted changes in large data sets. Our method reuses the idea of splitting data into blocks and hashing each block separately, but with certain added properties: logarithmic rather than linear growth of the memory needed to store hashes, ease of parallelization, applicability to distributed data, limited self-verification of hashes, and efficient recalculation of hashes for dynamically changing data.
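One way to achieve logarithmic hash storage with change localization, sketched below, is to hash overlapping groups of blocks selected by the binary digits of their indices. This is an illustrative construction under the assumption of a single corrupted block, not necessarily the scheme from the article; the helper names `build_digests` and `locate_change` are ours.

```python
import hashlib

def _h(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def build_digests(blocks):
    """Store 2*ceil(log2 n) digests instead of n: for every bit
    position, one digest over blocks whose index has that bit set
    and one over blocks whose index has it clear."""
    bits = max(1, (len(blocks) - 1).bit_length())
    digests = []
    for b in range(bits):
        ones = b"".join(blk for i, blk in enumerate(blocks) if i >> b & 1)
        zeros = b"".join(blk for i, blk in enumerate(blocks) if not i >> b & 1)
        digests.append((_h(ones), _h(zeros)))
    return digests

def locate_change(blocks, digests):
    """Recompute the digests; if a single block was corrupted, its
    index can be read off in binary from which 'ones' digests changed."""
    fresh = build_digests(blocks)
    if fresh == digests:
        return None  # no change detected
    index = 0
    for b, ((ones, _), (f_ones, _)) in enumerate(zip(digests, fresh)):
        if ones != f_ones:
            index |= 1 << b
    return index
```

Note the trade-off: this grouping pinpoints a single changed block; localizing several simultaneous changes requires stronger groupings (for example, tree-structured hashes).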

Due to the rising number of Internet applications and their users, the number of developers creating and refining them is also increasing. Oversights made while creating an application can cause countless problems and financial losses. In order to minimize vulnerabilities, it is necessary to take application design into consideration, anticipate all potential problems, and eliminate them before launching the application. The most frequent type of attack is XSS, or Cross-Site Scripting. Various browser flaws are presented; most of them have since been corrected, but they nevertheless serve as good examples of certain types of attacks.

Software systems continuously grow in size and code complexity, the latter most evident in greater component interconnectedness. This leaves more room for bugs, which introduce risks such as exposure to security threats. The most effective test selection approaches in combinatorial testing extend experimental design techniques to software testing. Covering array test sets are compact while at the same time maintaining complete combinatorial coverage up to the desired strength. Smaller test sets with a customizable level of assurance can drive testing costs down substantially. We present a survey of research into factors affecting combinatorial test suites and identify possible directions for future research.
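The compactness of covering arrays can be illustrated with a small sketch: a classic 9-row orthogonal array gives full strength-2 (pairwise) coverage of four ternary parameters, whereas exhaustive testing would need 3^4 = 81 rows. The function and variable names below are ours, not taken from the surveyed work.

```python
from itertools import combinations, product

def pairwise_covered(tests, n_params, values):
    """Check strength-2 combinatorial coverage: every value pair for
    every pair of parameter positions must appear in some test row."""
    for p1, p2 in combinations(range(n_params), 2):
        needed = set(product(values, values))
        seen = {(t[p1], t[p2]) for t in tests}
        if needed - seen:
            return False
    return True

# Orthogonal array OA(9, 4, 3, 2): rows (a, b, a+b mod 3, a+2b mod 3).
oa = [
    (0, 0, 0, 0), (0, 1, 1, 2), (0, 2, 2, 1),
    (1, 0, 1, 1), (1, 1, 2, 0), (1, 2, 0, 2),
    (2, 0, 2, 2), (2, 1, 0, 1), (2, 2, 1, 0),
]
```

Since each value pair appears exactly once per column pair, removing any single row already breaks pairwise coverage, which shows this 9-row set is minimal.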

This paper describes challenges faced when building a practical implementation of a smart card authentication protocol. First, a brief description of the protocol is given, followed by a list of parameters and cryptographic protocols that have been selected for the test implementation. Because smart cards based on Java Card technology are considered to be a well-established and recognized standard, they are chosen as a testing platform. However, like any other smart cards, they are constrained in terms of data storage capacity and available computing power, which can create difficulties when building the practical system. For this reason, an analysis of the finished implementation is given, which includes numerical performance data on storage requirements and the time required for the protocol completion in relation to chosen input parameters. This is followed by a conclusion and several recommendations regarding possible protocol improvements.

XML Signature is a form of digital signature designed for use in XML transactions. The XML Digital Signature standard defines a schema used for storing the result of a digital signature operation applied to (in most cases) XML data. Like non-XML digital signatures, XML signatures add authentication, data integrity, and support for non-repudiation to the data that is the object of the XML digital signing process. However, unlike non-XML digital signature standards, XML Signature has been designed to both account for and take advantage of the Internet and XML.
A fundamental feature of XML Signature is the ability to sign only specific portions of the XML content rather than the complete document. This is relevant when a single XML document has a long history in which different components are authored at different times by different parties, each signing only those elements relevant to itself. This flexibility is also critical in situations where it is important to ensure the integrity of certain portions of an XML document while leaving open the possibility for other portions to change. Since data security, in the form of data verification and authorization, represents an important part of the information system security paradigm, this article addresses the questions and possibilities of using XMLDigSig in everyday information system security procedures.
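The fragment-signing idea can be illustrated by digesting a single element, the way an XMLDSig <Reference> pointing at a fragment does. This sketch covers only the canonicalization-and-digest step, not the full signature with <SignedInfo> and key material; the `digest_element` helper and the toy document are our illustration.

```python
import hashlib
import xml.etree.ElementTree as ET

DOC = """<order>
  <items><item sku="A1" qty="2"/></items>
  <shipping>initial address</shipping>
</order>"""

def digest_element(doc: str, tag: str) -> str:
    """Canonicalize and digest one element only, analogous to a single
    XMLDSig Reference: other parts of the document can change without
    invalidating this digest."""
    elem = ET.fromstring(doc).find(tag)
    elem.tail = None  # drop whitespace that follows the element
    c14n = ET.canonicalize(ET.tostring(elem, encoding="unicode"))
    return hashlib.sha256(c14n.encode()).hexdigest()
```

For example, editing the <shipping> element leaves the digest of <items> unchanged, which is exactly the property the abstract describes for multi-author documents.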

Security on the Internet is a serious problem without a satisfactory solution. One problem lies at the level of Internet service providers and autonomous systems. This space is highly distributed and without central control, driven primarily by economic factors. Many solutions have been proposed, with moderate success, concentrating mainly on the Internet routing protocol BGP. We approached this problem with the observation that there is a certain similarity between the Internet’s organization at the level of autonomous systems and peer-to-peer networks, and thus a certain similarity with respect to security issues. In peer-to-peer networks, reputation mechanisms are the primary means of protection. We propose that similar reputation mechanisms be applied to autonomous systems. Many factors could be used to calculate reputation per autonomous system, such as spam, worms and DoS attacks. In this paper we concentrate only on DNS traffic and propose a reputation calculation based on it. Our results show that it is possible to make judgments about entities on the Internet based on the errors found in their traffic.
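A toy version of such a score might count, per autonomous system, the share of non-error DNS answers, smoothed toward a neutral prior so that sparsely observed ASes start near 0.5. The formula and the AS numbers (taken from the documentation range) are our illustration, not the paper’s actual metric.

```python
from collections import Counter

def dns_reputation(responses, alpha=5.0):
    """Reputation per AS from (asn, rcode) pairs, where rcode 0 means
    a well-formed NOERROR answer. The score is the smoothed fraction
    of clean responses: (ok + alpha/2) / (total + alpha)."""
    ok, total = Counter(), Counter()
    for asn, rcode in responses:
        total[asn] += 1
        ok[asn] += (rcode == 0)
    return {asn: (ok[asn] + alpha * 0.5) / (total[asn] + alpha)
            for asn in total}
```

An AS answering mostly correctly thus scores near 1, while one emitting mostly erroneous DNS traffic sinks toward 0, which is the kind of judgment the abstract describes.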

In 1991 scientists demonstrated practical quantum cryptography (QC) for the first time and gave a proof of its unconditional security against the two most obvious attack strategies. In the next decade several groups of scientists produced more stringent security proofs, and it was soon widely accepted that the quantum cryptographic protocol BB84 and its variants are the only known protocols that allow unconditionally secure key growth. Moreover, QC offered the unique possibility that any eavesdropping activity could be detected through an elevated bit error rate. In 1994 the Swiss spin-off IdQuantique presented the world's first commercial quantum key distribution system, Clavis, soon followed by the QPN system of the American start-up MagiQ. During the next 8 years nobody even suspected that there might be something wrong with these systems. Then, in early 2011, a group of Norwegian scientists experimenting with attacks on photon detectors constructed tailored attacks on both commercial systems which were so successful that the plaintext was recovered in full (100%), and moreover the legitimate parties could not detect any suspicious activity! This raised the questions: where does this leave quantum cryptography, is it secure or not? What about the security proofs, were they wrong? Our conclusion is that the security proofs were correct, except that they proved the security of something else, not of the machines that were cracked. At the present state of the art it is not clear how, or even whether, it is possible to build a provably secure QC machine.

Bearing in mind that data constitute the key asset of every company, it is necessary to define the owners of that asset. This paper proposes a model for the clear determination of data ownership in information systems. The model is based on enterprise architecture units called domains. The applied domain model divides the complete business of the telecommunication company into 12 domains, where each domain represents a “natural” unit with clear responsibilities and competencies, structured according to business aspects.

New ICT services are launched daily under competitive pressure. Because of that, the security dimension of ICT services is often overlooked. To ensure that security is an integral part of such services, an organization needs to establish a security infrastructure to support it. Such an infrastructure should consist of security documentation (policies, requirements, procedures …), established processes and appropriate organizational measures. A model and the related process for ICT service development or modification, with security as an integral part from the initial phase, are described in this article.

The final goal of a certified ISO 27001:2005 Information Security Management System (ISMS) implementation, in an organization or company of any kind or size, is a fully operational Plan-Do-Check-Act (PDCA) life cycle of constant system improvement. One of the most important issues in making it operational and meaningful is measuring the compliance and performance of system functionalities, using the ISO 27004:2009 standard to adopt metrics and ways of measuring the performance and compatibility of critical activities. That is not an easy task because of the large quantity of data that needs to be processed and presented in a useful way, and available tools do not offer complete solutions. This paper therefore presents a holistic approach to data collection and data mining for KPI visualization in practice.

Individuals often hesitate to use services offered via the Internet due to suspicions regarding the offered level of (1) protection of their privacy and (2) security of performing online transactions. Privacy is mostly concerned with identifiable user data and users’ rights to control their data. Security, on the other hand, provides the physical, logical, and procedural safeguards needed to keep the data private. Privacy cannot be achieved without sound security practice, nor will the use of security mechanisms alone guarantee the protection of privacy. Despite being closely linked in practice, privacy and security are perceived as separate issues by online users. Therefore, this article discusses the relationship between various privacy factors (factors that influence users’ privacy concerns) and the perception of security protection during users’ online activities. The role that perceived privacy and perceived security play in e-service users’ evaluation of a service is investigated.

19:15 - 19:30

G. Božić (Franck d.d., Zagreb, Croatia): The role of a stress model in the construction of information security culture

Summary – A better understanding of the processes which occur in the implementation of information security is a precondition for its more efficient implementation. This work addresses the areas which focus on the human factor, called “informal methods”. Studies carried out so far show the need to develop an information security culture as a long-term solution to the problem. The problem can also be seen as the result of a process involving intense interaction between people and their environment. Part of the results of the research carried out so far is analyzed from the point of view of another scientific field, namely psychology, which also studies these processes. Using the stress model, we try to better understand the process which occurs when the user is expected to change his behavior. The attitude the user adopts in such a situation influences the development of the information security culture. For this to be successful, the user should see the demands as a challenge rather than a threat. We can influence this if we know which personal characteristics of the user can assist us; one of these is self-efficacy. Increasing users’ belief in their own abilities is one of the factors which need to be taken into account in the process of training and awareness raising, and methods for building such characteristics are of particular interest.

The users of an ICT system can significantly affect the overall security level of the system, but the problem is that most security solutions do not consider the user as a potentially critical security component of the system.
In this work, an assessment method is proposed for evaluating users’ awareness of security issues. To collect data on ICT system users’ awareness, a special questionnaire was developed, based on a previously defined ontology of the domain of e-mail users’ behavior.
The cluster analysis method was applied to group users into categories according to their level of awareness of security issues. Cluster analysis yielded five clusters of users, to which chi-square analysis was applied in order to detect potential relationships between the level of awareness and gender, age, professional qualification and technical background knowledge. The variables used to predict group membership were identified by applying discriminant analysis.
The evaluation and categorization of users’ awareness should help in developing new concepts for security solutions that take the user into consideration as a component of the ICT system.
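The chi-square step can be sketched on a toy contingency table of awareness category versus a demographic attribute. The hand-rolled `chi_square` helper below is our illustration of the statistic that a statistics package would compute; for a 2x2 table (one degree of freedom), values above 3.84 indicate dependence at the 0.05 significance level.

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table,
    e.g. awareness cluster (rows) versus gender (columns): sum over
    cells of (observed - expected)^2 / expected."""
    rows = [sum(r) for r in table]            # row totals
    cols = [sum(c) for c in zip(*table)]      # column totals
    n = sum(rows)                             # grand total
    return sum(
        (table[i][j] - rows[i] * cols[j] / n) ** 2 / (rows[i] * cols[j] / n)
        for i in range(len(rows)) for j in range(len(cols))
    )
```

A perfectly balanced table scores 0 (no association), while a strongly skewed one, say 30/10 versus 10/30, scores 20, well above the 3.84 threshold.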

M. Bača, P. Koruga (Fakultet organizacije i informatike, Varaždin, Croatia): Building a Framework for the Development of Biometric Forensics

The application of biometric tools is today a major challenge both for developers of biometric systems and for users of those systems. A specific application can be seen in systems requiring a certain accuracy in person authentication. A large number of systems are built in such a way that the algorithms used are not verified as standardized, causing uncertainty in their results and in the application of those tools in processes which need to confirm the obtained results. This paper describes the basic model necessary to build a framework for the construction of a biometric system applicable to person authentication in forensics.