8.1.3.2 Authorization or access control

Copyright © 2010 Open Geospatial Consortium, Inc.

Authorization can be imperative, that is, coded in the service that controls the resource, or declarative, that is, defined in policies (e.g. XACML policies). In the Event Service scenario, supporting mission-critical application domains like Aviation, authorization can govern access to particular data publications (events) and can restrict the registration of a publication of restricted data to the authority that owns the data. In other words, only authorized entities shall be able to invoke certain operations of the Event Service.

For example, X.509 Attribute Certificates and XACML policies allow setting user access privileges in a multi-vendor, multi-application environment. SAML and its profiles allow entities to make assertions regarding the attributes and entitlements of a subject to other entities in a distributed environment. GeoXACML, the OGC extension of XACML, allows the definition of spatially constrained access control policies that can be used, for example, to prevent organizations that are not authorities in a certain area or region from accessing resources in that area. Authorization policies may also be based on the duration or time of certain operation invocations. Authorization mechanisms may be applied at either end of the communication and, if desirable, at intermediate points (e.g. a firewall).
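The spatially constrained, attribute-based style of policy described above can be sketched as follows. This is an illustrative toy in the spirit of XACML/GeoXACML, not an implementation of either standard; all names (the policy structure, roles, and the sample area of responsibility) are assumptions invented for the example.

```python
# Toy sketch of declarative, attribute-based authorization in the spirit of
# XACML/GeoXACML. Policies, roles and coordinates below are illustrative
# assumptions, not part of any OGC interface.

def point_in_bbox(lon, lat, bbox):
    """Return True if (lon, lat) falls inside bbox = (min_lon, min_lat, max_lon, max_lat)."""
    min_lon, min_lat, max_lon, max_lat = bbox
    return min_lon <= lon <= max_lon and min_lat <= lat <= max_lat

def is_authorized(subject, action, resource, policies):
    """Evaluate simple deny-unless-permit policies over subject and resource attributes."""
    for policy in policies:
        if (policy["action"] == action
                and policy["role"] in subject["roles"]
                and point_in_bbox(resource["lon"], resource["lat"], policy["area"])):
            return True
    return False  # default deny: no matching permit rule

# A policy permitting the "aviation-authority" role to publish events
# only inside a (hypothetical) geographic area of responsibility.
policies = [{"action": "publish", "role": "aviation-authority",
             "area": (-10.0, 45.0, 5.0, 55.0)}]

publisher = {"roles": ["aviation-authority"]}
event_inside = {"lon": 2.35, "lat": 48.85}   # inside the area
event_outside = {"lon": 13.4, "lat": 52.5}   # outside the area

print(is_authorized(publisher, "publish", event_inside, policies))   # True
print(is_authorized(publisher, "publish", event_outside, policies))  # False
```

A real deployment would express such rules in XACML/GeoXACML policy documents and delegate evaluation to a policy decision point rather than hard-coding them in the service.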

8.1.3.3 Non-repudiation

The non-repudiation measure provides means for preventing an individual or entity from successfully denying having performed a particular action related to data (e.g. events) by making available proof of various actions: proof of publishing an event, proof of delivering an event to interested parties by the Event Service, proof of event origin, proof of event ownership, and so on. It also ensures the availability of evidence that can be presented to a third party. Sometimes a notarization mechanism (service) operated by a third party is necessary to provide a non-repudiation service. A trusted third party might provide notarization by recording data flows or operation invocations between entities (e.g. relayed messages) in order to provide proof and resolve potential disputes. The protocol between the entities and the notarization service must be well defined and agreed upon by the entities involved in the communication.

For example, in the brokered Event Service case, data publishers would ask for and obtain non-repudiation services from the Event Service, transmit the data, and receive notice that the data has been received and acknowledged by the Event Service. The non-repudiation services assure both the Event Service and the data publisher that the data has been successfully transmitted. On the subscriber side, the Event Service would ask for and obtain a non-repudiation service from a subscriber service, transmit the data, and receive notice that the data has been received and acknowledged by the subscriber service (event consumer). Usually, the non-repudiation measure is implemented using some combination of digital signature mechanisms, data integrity mechanisms, secure event logging and/or a notarization service.
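The notarization idea above can be illustrated with a minimal sketch: a trusted third party records a digest and timestamp for each relayed message and issues a receipt that either party can later check in a dispute. The class and method names are invented for the example, and a hash-plus-log alone does not provide full non-repudiation; a real service would also apply digital signatures (e.g. XML-DSig) so the records themselves cannot be repudiated.

```python
# Toy notarization log: records a digest and timestamp per relayed message.
# Illustrative only; real non-repudiation additionally requires digital
# signatures over the records (e.g. XML-DSig), which are omitted here.
import hashlib
import time

class NotaryLog:
    def __init__(self):
        self.records = []

    def notarize(self, sender, receiver, message: bytes) -> dict:
        """Record a relayed message and return a receipt to both parties."""
        record = {"sender": sender, "receiver": receiver,
                  "digest": hashlib.sha256(message).hexdigest(),
                  "timestamp": time.time()}
        self.records.append(record)
        return record

    def verify(self, message: bytes, record: dict) -> bool:
        """Check that a disputed message matches a previously notarized record."""
        return hashlib.sha256(message).hexdigest() == record["digest"]

notary = NotaryLog()
receipt = notary.notarize("publisher", "event-service", b"runway 27 closed")
print(notary.verify(b"runway 27 closed", receipt))   # True
print(notary.verify(b"runway 27 open", receipt))     # False
```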

8.1.3.4 Data confidentiality

The data confidentiality measure protects data from unauthorized viewing: it ensures that the data content cannot be understood by unauthorized entities. Encryption and access control policies are among the methods often used to provide data confidentiality. The encryption mechanism can be applied to a whole message, to any part of a message exchanged between entities, or to stored data. An encryption scheme usually involves a key (e.g. a predefined key, or a key generated during a session handshake) and an actual algorithm. Some of the well-known and popular algorithms are:

- AES (Advanced Encryption Standard). Key sizes: 128, 192 and 256 bits
- DES (Data Encryption Standard). Key size: 56 bits
- 3DES (Triple Data Encryption Standard). Key sizes: 56, 112 (2 x 56) and 168 (3 x 56) bits
- Blowfish. Key sizes: 8-448 bits

Key management can be provided by agreeing on pre-shared keys or by using well-defined standards for managing cryptographic keys (e.g. PKI standards).
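The "key plus algorithm" pattern can be shown with a deliberately simple sketch: a toy stream cipher that derives a keystream from a pre-shared key and a per-message nonce, then XORs it with the data. This is not secure and is not one of the algorithms listed above; it only illustrates how a shared key and a nonce drive symmetric encryption and decryption. Production systems should use a vetted cipher such as AES.

```python
# Toy symmetric encryption sketch (NOT secure; illustrative only).
# Real systems should use a standard cipher such as AES.
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a deterministic keystream by hashing key || nonce || counter blocks."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt: XOR with the keystream is its own inverse."""
    ks = keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"pre-shared secret key"  # in practice: session handshake or PKI-managed
nonce = b"unique-per-message"   # must never repeat for the same key

ciphertext = xor_crypt(key, nonce, b"flight plan update")
print(xor_crypt(key, nonce, ciphertext))  # b'flight plan update'
```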

8.1.3.5 Data integrity

The data integrity measure ensures the accuracy of data. In addition, it ensures that unauthorized modification, deletion, creation and replication can be detected. For example, checking data integrity requires the sending entity (e.g. the Event Service) to add to the data some digest information (e.g. a hash code, CRC, ...) that is a function of the data itself. This information may itself be encrypted. The receiving entity (e.g. a subscriber) generates the corresponding information and compares it with the received information to determine whether the data has been modified. XML digital signature technologies are among the most common forms of implementing this measure in an SOA environment. Using message sequence numbers and timestamps may additionally provide detection of message duplication. Some of the most popular algorithms for generating digest information are:

- MD5
- SHA (e.g. SHA-1, SHA-2, ...)

Key management can be provided by agreeing on pre-shared keys or by using well-defined standards for managing cryptographic keys (e.g. PKI standards).
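The sender/receiver digest check described above, combined with sequence numbers and timestamps, can be sketched as follows. The envelope structure and function names are assumptions invented for this illustration; a real SOA deployment would use XML digital signatures rather than a bare hash.

```python
# Sketch of the integrity check described above: the sender attaches a digest
# (here SHA-256), a sequence number and a timestamp; the receiver recomputes
# and compares. Envelope layout and names are illustrative assumptions.
import hashlib
import time

def make_envelope(payload: bytes, seq: int) -> dict:
    """Sender side: attach digest, sequence number and timestamp to the payload."""
    return {"payload": payload, "seq": seq, "ts": time.time(),
            "digest": hashlib.sha256(payload).hexdigest()}

def check_envelope(env: dict, expected_seq: int, max_age_s: float = 60.0) -> str:
    """Receiver side: detect modification, duplication/reordering and staleness."""
    if hashlib.sha256(env["payload"]).hexdigest() != env["digest"]:
        return "modified"
    if env["seq"] != expected_seq:
        return "duplicate-or-out-of-order"
    if time.time() - env["ts"] > max_age_s:
        return "stale"
    return "ok"

env = make_envelope(b"notam update", seq=1)
print(check_envelope(env, expected_seq=1))   # ok
env["payload"] = b"tampered"
print(check_envelope(env, expected_seq=1))   # modified
```

Note that a plain digest only detects accidental or naive modification; an attacker who can alter the payload can also recompute the digest, which is why the digest information is encrypted or signed in practice.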