13th SDL Forum
Papers
(update: 02 Oct 2007)


Each paper is listed below with its title, its authors (with affiliation and country), and its abstract.

Experiences in Deploying Model-Driven Engineering
Thomas Weigert et al. (Motorola, US)
In this paper, we describe how Motorola has deployed model-driven engineering in product development, in particular for the development of highly reliable telecommunications systems, and outline the benefits obtained. Model-driven engineering has dramatically increased both the quality and the reliability of software developed in our organization, as well as the productivity of our software engineers. Our experience demonstrates that model-driven engineering significantly improves the development process for telecommunications systems.

SDL Design and Performance Evaluation of a Mobility Management Technique for 3GPP LTE systems
Tae-Hyong Kim (Kumoh National Institute of Technology, KR)
Using a common model for both functional verification and performance evaluation of a network protocol can considerably reduce protocol development time and cost. Although several research efforts have pursued this goal, their results are not yet widely used, especially in industry. This paper presents a case study in SDL design and performance evaluation of a wireless and mobile technology. In order to evaluate our mobility management technique for 3GPP LTE systems, we designed a simple 3GPP LTE system and its mobility performance system with pure SDL and the Tau performance library. The paper describes our experience in pure SDL-based performance evaluation with Tau and discusses SDL design and simulation issues for more efficient performance evaluation with SDL.

OpenComRTOS: An ultra-small network centric embedded RTOS designed using formal modeling
Eric Verhulst (Open License Society, BE)
OpenComRTOS is one of the few Real-Time Operating Systems (RTOS) for embedded systems that was developed using formal modeling techniques. The goal was to obtain a proven, trustworthy component with a clean, high-performance architecture usable on a wide range of networked embedded systems. The result is a scalable communication system with real-time capabilities. Besides a rigorous formal verification of the kernel algorithms, the resulting architecture has several properties that enhance the safety and real-time properties of the RTOS. The code size in particular is very small, typically about ten times smaller than that of an equivalent single-processor RTOS.

Automated Generation of Micro Protocol Descriptions from SDL Design Specifications
Ingmar Fliege (University of Kaiserslautern, DE); Reinhard Gotzhein (University of Kaiserslautern, DE)
A micro protocol is a ready-to-use, self-contained, distributed component that supports structuring of complex communication systems, and reuse of well proven elementary communication solutions. Micro protocol designs can be formally specified with SDL. For documentation purposes and effective reuse, these SDL designs are augmented by further description elements, for instance, typical scenarios capturing the micro protocol service and the interaction of micro protocol entities. In this paper, we show how these additional description elements can be generated from an augmented micro protocol design specification. We have devised a tool that creates a PDF file, containing the complete micro protocol description with graphical elements and a link to the SDL design specification. Our approach enhances the maintenance of micro protocol libraries, and supports the consistency of micro protocol description elements and SDL designs.

Model Driven development and code generation: an automotive case study
Michele Banci (ISTI-CNR, IT)
Describing an application as a simple composition of services allows advanced features that exploit different platforms to be conceived and formalized at a high level of abstraction. Several languages and formalisms, including UML diagrams, have been proposed for this purpose. Starting from such an abstract description, much work is still needed to derive a working application, through a model-driven development process that introduces and formalizes many details. In this paper we report an experience in deriving an executable formal model from a high-level specification, originally given following a mainly architectural UML approach. The development process is illustrated on an automotive case study. A state-of-the-art code generation tool is then applied to produce a prototype implementation of the analyzed system.

Using probabilist models for studying realistic systems: a case study of Pastry
Benoît Parreaux, Guillaume Châtelet, Yves-Marie Quemener (France Télécom R&D, FR)
Telecommunication services will in the future be built upon peer-to-peer protocols. This implies the need for strong guarantees of the dependability of those protocols. One building block for such protocols is the distributed hash table (DHT for short), and Pastry is a protocol implementing distributed hash tables. We have designed a probabilistic model of Pastry that enabled us to simulate it. In particular, we have studied the performance of the protocol with respect to the number of nodes. For this study we used the probabilistic model-checking tools of the RNTL project Averroès. This is a significant application of academic tools to industrial concerns.
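
As background to this abstract (the figure below is the textbook Pastry bound, not the paper's probabilistic model): Pastry is designed so that a lookup reaches the node responsible for a key in at most about ⌈log_{2^b} N⌉ routing hops, where N is the number of nodes and b is a configuration parameter (commonly 4). A minimal Python sketch of that bound gives a feel for how hop counts scale with the number of nodes, the quantity the authors study.

```python
import math

def pastry_hop_bound(n_nodes: int, b: int = 4) -> int:
    """Textbook upper bound on Pastry routing hops: ceil(log_{2^b} N)."""
    return math.ceil(math.log(n_nodes, 2 ** b))

if __name__ == "__main__":
    for n in (100, 1_000, 10_000, 100_000):
        print(f"{n:>7} nodes -> at most {pastry_hop_bound(n)} hops")
```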

A Model-based Standard for SDL
Andreas Prinz (Agder University College, NO); Markus Scheidgen (Humboldt University Berlin, DE); Merete Tveit (Agder University College, NO)
It is still the case that language descriptions capture a lot of semantic information in plain (English) text. These imprecise descriptions are hardly usable for tools to generate language environments from the language standard. SDL has already managed to define syntax and semantics in a quite formal way. Currently, this formality is usually connected to using grammars in one way or another. Meta-models, on the other hand, have proven to be a good way of expressing complex facts and relations. Moreover, tools and technologies are available that realise all language aspects based on completely formal and still easily understandable descriptions. This paper reports on an experiment to define a complete meta-model-based language definition of (a subset of) SDL and to immediately obtain tool support for the language. The experiment covers the language aspects of concrete syntax representation, static semantic constraints, and language behaviour. It turns out that this is almost possible.

Translatable Finite State Time Machine
Krzysztof Sacha (Warsaw University of Technology, PL)
The paper describes the syntax, behavior and formal semantics of a new class of timed automata, tailored for modeling the behavior of real-time systems. A formal method for the automatic generation of programs is developed around this model. The method starts from modeling the desired behavior of the system under design by means of a UML-based state machine with the ability to measure time, and ends up with a complete program written in one of the IEC 1131 languages. The translation process is performed automatically, and the semantics of the resulting program is isomorphic to the semantics of the model.

TTCN-3 Quality Engineering: Using Learning Techniques to Evaluate Metric Sets
Edith Werner, Jens Grabowski, Helmut Neukirchen, Benjamin Zeiss (University of Göttingen, DE)
Software metrics are an essential means to assess software quality. For the assessment of software quality, typically sets of complementing metrics are used, since individual metrics cover only isolated quality aspects rather than a quality characteristic as a whole. The choice of the metrics within such metric sets, however, is non-trivial. Metrics may intuitively appear to be complementary, but they are often in fact non-orthogonal, i.e. the information they provide may overlap to some extent. In the past, such redundant metrics have been identified, for example, by statistical correlation methods. This paper presents a novel approach, based on machine learning, to minimise sets of metrics by identifying and removing metrics which have little effect on the overall quality assessment. To demonstrate the application of this approach, results from an experiment are provided. In this experiment, a set of metrics used to assess the analysability of test suites specified in TTCN-3 is investigated.
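
For readers unfamiliar with the statistical-correlation baseline mentioned in this abstract, the sketch below shows one generic way to prune redundant metrics by dropping one metric of every highly correlated pair. It is not the authors' learning-based approach, and the metric names and sample data are hypothetical.

```python
import numpy as np

def prune_correlated_metrics(samples: np.ndarray, names: list[str],
                             threshold: float = 0.95) -> list[str]:
    """Greedy pruning: keep a metric only if it is not highly correlated
    with any metric already kept. `samples` has one row per measured
    test suite and one column per metric."""
    corr = np.abs(np.corrcoef(samples, rowvar=False))
    kept: list[int] = []
    for i in range(len(names)):
        if all(corr[i, j] < threshold for j in kept):
            kept.append(i)
    return [names[i] for i in kept]

# Hypothetical data: three metrics measured over five test suites,
# where "statements" is almost perfectly correlated with "lines".
values = np.array([[100,  98, 2],
                   [200, 205, 9],
                   [150, 149, 3],
                   [300, 310, 4],
                   [250, 248, 8]], dtype=float)
print(prune_correlated_metrics(values, ["lines", "statements", "nesting"]))
```

In this made-up example, "statements" is pruned as redundant with "lines", while "nesting" is kept.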

Synthesizing Components with Sessions from Collaboration-Oriented Service Specifications
Frank Alexander Kraemer, Rolv Braek, Peter Herrmann (Norwegian University of Science and Technology, NO)
A fundamental problem in the area of service engineering is the so-called cross-cutting nature of services, i.e., that service behavior results from a collaboration of partial component behaviors. We present an approach for model-based service engineering in which system component models are derived automatically from collaboration models. These are specifications of sub-services incorporating both the local behavior of the components and the necessary inter-component communication. The collaborations are expressed by UML collaborations and activities in a compact and self-contained way. The UML activities are also well suited to express service compositions precisely, so that components may be derived automatically by means of a model transformation. In this paper, we focus on the important issue of how to coordinate and compose collaborations that are executed in several sessions at the same time. We introduce an extension to activities for session selection. Moreover, we explain how this composition is mapped onto the components and how it can be translated into executable code.

Consistency of UML/SPT Models
Abdelouahed Gherbi (Concordia University, CA); Ferhat Khendek (Concordia University, CA)
UML supports a multi-view modeling approach for overcoming software complexity. It consists of several diagrams, which allow for considering software systems from different perspectives: structure, behavior and deployment. However, this multi-view approach faces the challenging issue of consistency. Moreover, when UML is used for real-time systems, through its specialized profiles such as UML/SPT for instance, the consistency issue becomes more complex. New aspects, relevant for real-time systems design, should be taken into consideration. These include concurrency, time constraints and schedulability. In this paper, we present a consistency framework for the UML profile for real-time systems. This framework addresses incrementally the various aspects of consistency for a UML/SPT model, including syntactic, semantic, concurrency-related and time consistency. In this framework, we introduce an approach for checking time consistency between statecharts and sequence diagrams using schedulability analysis while taking into account deployment constraints.

Experiences in using the SOMT method to support the design and implementation of a network simulator
Manuel Rodríguez (University of Valladolid, ES); José María Parra (INSA, Ingeniería de Software Avanzado, S.A., ES)
This paper deals with the analysis and design of a computer network simulator for educational purposes. The SOMT method, based on object-oriented analysis and SDL, has been used for that purpose. The simulator developed allows protocol descriptions that are independent of the event management mechanisms, as well as run-time configuration of the network topology without modifying the SDL description of the system.

Testing UML2.0 Models using TTCN-3 and the UML2.0 Testing Profile
Clive Jervis (Motorola, US); Paul Baker (Motorola Labs, UK)
This paper describes a toolset for functional testing of UML2.0 models with TTCN-3 test suites, and its application within Motorola. The toolset incorporates support for part of the UML2.0 Testing Profile, from which TTCN-3 can be generated. The toolset has been developed within Motorola for models developed using Telelogic Tau G2 and test suites using Telelogic Tester; the models are subsequently used for application code generation. The basic integration of Telelogic Tau and Tester, called cosim, has novel features, such as the ability to service operations declared as external to the model within TTCN-3, and to control model timer operations within TTCN-3. Translating UML2.0 data structures, such as classes, signal definitions, port definitions, and constants, into TTCN-3 is done by a tool called UMB. The paper deals with the complexities of mapping Tau UML2.0 types and structuring into TTCN-3. To provide more rigorous test specification, a tool supporting part of the UML2.0 Testing Profile has been developed, which enables the consistency of test specifications to be checked automatically and executable TTCN-3 test suites to be generated for cosim. The toolset is being used by several different product groups within Motorola, and the paper reports some experience and findings, including areas where TTCN-3 can be extended.

Enhanced Use Case Map Traversal Semantics
Jason Kealey (University of Ottawa, CA); Daniel Amyot (University of Ottawa, CA)
The Use Case Map (UCM) notation enables the use of graphical scenarios to model grey-box views of a system's operational requirements and behaviour. The scenario traversal mechanism is the most popular UCM analysis technique, but its implementation in UCMNav is limited and hard to use, and the high coupling of its features makes it difficult to maintain and evolve. This paper introduces major analysis enhancements to the recent jUCMNav Eclipse plug-in by providing an extensible scenario traversal semantics accompanied by improved model transformations to Message Sequence Charts. Furthermore, this paper identifies a set of semantic variation points for which the behaviour is unclear in UCMs, laying the groundwork for notational clarifications and user-defined semantic profiles.

Formal Verification of Use Case Maps with Real Time Extensions
Jameleddine Hassine (Concordia University, CA); Juergen Rilling (Concordia University, CA); Rachida Dssouli (Concordia University, CA)
Scenario-driven requirement specifications are widely used to capture and represent functional requirements. More recently, the Use Case Maps (UCM) language, being standardized by the ITU-T as part of the User Requirements Notation (URN), has gained popularity within the software requirements community. UCM models focus on the description of functional and behavioral requirements, as well as high-level designs, at the early stages of system development processes. However, timing issues are often overlooked during initial system design, treated as unrelated behavioral issues, and therefore described in separate models. We believe that timing aspects must be integrated into the system model during the early development stages. In this paper, we present a novel approach to describing timing constraints in UCM specifications. We describe a formal operational semantics of Timed UCM in terms of Timed Automata (TA) that can be analyzed and verified with the UPPAAL model checker. Our approach is illustrated with a case study of the IP Multicast Routing Protocol.

Specifying Input Port Bounds in SDL
Reinhard Gotzhein (University of Kaiserslautern, DE); Rüdiger Grammes (University of Kaiserslautern, DE); Thomas Kuhn (University of Kaiserslautern, DE)
According to the SDL semantics, input ports "may retain any number of input signals", and therefore may grow without upper bound. While this is a convenient property at the design level, it may lead to illegal behaviour on concrete hardware platforms, especially in the context of distributed embedded systems with severe storage constraints. In this paper, we present a straightforward extension of SDL in order to specify input port bounds formally. In our solution, bounds are associated with signals and input ports. We define both the concrete and abstract grammar and the formal dynamic semantics of the proposed SDL extension. We have implemented the extension in Cmicro, and illustrate the solution with examples from the Assisted Bicycle Trainer, a wireless sensor network.
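
The general idea of a bounded input port can be pictured with the small Python sketch below; it only illustrates the concept of no longer retaining signals once a bound is reached, and is neither the SDL syntax nor the Cmicro implementation described in the paper.

```python
from collections import deque

class BoundedInputPort:
    """Toy model of an input port with an upper bound on retained signals.
    When the bound is reached, new signals are discarded and counted
    instead of letting the queue grow without limit."""

    def __init__(self, bound: int):
        self.bound = bound
        self._queue: deque = deque()
        self.discarded = 0

    def put(self, signal) -> bool:
        """Enqueue a signal; return False (and count it) if the port is full."""
        if len(self._queue) >= self.bound:
            self.discarded += 1
            return False
        self._queue.append(signal)
        return True

    def get(self):
        """Dequeue the next signal, or None if the port is empty."""
        return self._queue.popleft() if self._queue else None
```

What to do with a signal arriving at a full port (discard it, discard the oldest retained signal, or flag an error) is exactly the kind of policy a formal bound specification has to make explicit.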

Using TTCN for Radio Conformance Test Systems
Javier Poncela González, Carlos Valero Roldán, Unai Fernández Plazaola, Juan Gómez Salvador (University of Málaga, ES)
While protocol conformance testing methodology is a well-formalized field, radio testing methodology still relies on natural-language specifications. This paper proposes an improvement to the quality of radio test specifications through the use of the formal notation TTCN. This approach, together with the fact that protocol and radio conformance testing share most of the underlying concepts, enables the use of a generic architecture for implementations of both types of testers, resulting in reduced development effort. The architecture has been validated by implementing radio test cases for the UMTS technology.
 
