Program
Program overview – Thursday Dec. 16th and Friday Dec. 17th
This schedule is in the CET time zone (Brussels, Paris).
- Paper 3 – Flexible Enterprise Optimization with Constraint Programming
Authors: Sytze P.E. Andringa and Neil Yorke-Smith
Abstract: Simulation–optimization is often used in enterprise decision-making processes, both operational and tactical. This paper shows how an intuitive mapping from descriptive problem to optimization model can be realized with Constraint Programming (CP). It shows how a CP model can be constructed given a simulation model and a set of business goals. The approach is to train a neural network (NN) on simulation model inputs and outputs, and embed the NN into the CP model together with a set of soft constraints that represent business goals. The paper studies this novel simulation–optimization approach through a set of experiments, finding that it remains flexible when multiple objectives change simultaneously, allows an intuitive mapping from business goals expressed in natural language to a formal model suitable for state-of-the-art optimization solvers, and is realizable for diverse managerial problems.
- Paper 4 – OntoGrapher: a Web-based Tool for Ontological Conceptual Modeling
Authors: Alice Binder and Petr Kremen
Abstract: Conceptual models have traditionally been tools for sharing an understanding of a system’s structure and behaviour with others. Although this is still the main use case, some works have already shown the usefulness of their machine-readable form to derive data schemata, system documentation, as well as semantic vocabularies of the given system or domain. In this work we present OntoGrapher, a visual web-based tool for conceptual modeling based on the OntoUML conceptual modeling language. The tool accepts and produces machine-readable outputs in the form of OWL ontologies and SKOS thesauri. We show its main features and benefits, as well as use cases in which the tool has been successfully applied, and include a hands-on demo of the tool. Finally, we perform user testing to evaluate the tool. A roadmap for the tool’s future development is laid out based on the testing results and already planned features.
- Paper 5 – Applying Normalized Systems Theory in a data integration solution, a single case study
Authors: Hans Nouwens, Edzo A. Botjes and Christiaan Balke
Abstract: When we were asked to describe an architecture for a data integration solution, one of the requirements was to make the design and solution able to adapt to a changing environment, i.e. to be evolvable. Attention to potential change drivers and possible combinatorial effects turned out to be one of the main drivers behind the architecture and solution designs. This single case study describes this architecture, the architecture principles used, and a summary of the resulting solution design, which is loosely based on the generic Extract-Transform-Load pattern with the addition of Normalized Systems Theory. The architecture (model and principles), governance, solution design and realisation are evaluated.
- Paper 6 – Social-Collaborative Inductive Reference Model Mining in a Knowledge-Based Organization
Author: Andreas Sonntag
Abstract: Reference model mining is supposed to support the efficient execution of process instances. Social network analysis has great potential for reference model mining as it reveals social and functional relations that are critical for the efficiency of collaborative business processes. This study demonstrates and evaluates an approach for applying social network analysis to a human aspect of reference model mining. The approach is based on a dynamic performer network explaining the evolution of social collaboration efficiency in a knowledge-based organization. For this purpose, agent-based simulation is applied to a longitudinal dataset of researcher collaboration in an internationally renowned ‘center of excellence’ for industry-near research in the field of artificial intelligence. The resulting performer network is a reference model for efficient researcher collaboration, capable of being reused for the organization’s future publication process. Our evaluation presents implications for modelling an efficient process collaboration of knowledge workers.
- Paper 7 – Use of EA Models in Organizational AI Solution Development
Authors: Kurt Sandkuhl and Jack Daniel Rittelmeyer
Abstract: Digital transformation in combination with service ecosystems exploits and incorporates innovative technologies such as artificial intelligence, the Internet of Things or data analytics. For most enterprises, new digital business models and participation in service ecosystems result in severe changes to the enterprise architecture (EA) and the need for methodical support to systematically perform this change process. The focus of this paper is on the implementation of AI applications in organizations. Based on the analysis of industrial case studies, our observation is that (a) different kinds of AI applications require different prerequisites in an organizational IT landscape, some of which can be extracted from an EA model, and (b) some enterprises intend to use AI but are not prepared for it. The main contributions of our work are (a) an enhanced and updated literature analysis on EA use for AI introduction, (b) an analysis of differences in requirements from different AI applications, and (c) an improved AI context analysis method prepared for these differences.
- Paper 8 – Modeling Patterns for Payments and Linked Obligation Settlements
Authors: Glenda Amaral, Tiago Prince Sales and Giancarlo Guizzardi
Abstract: Recently, digital innovation has revolutionized the world of payments and settlement services. Innovative technologies, such as the tokenization of assets, as well as new forms of digital payments, have challenged not only the current business models but also the existing models of regulation. In this scenario, semantic transparency is fundamental not only to adapt regulation frameworks, but also to support information integration and semantic interoperability. In this paper, we deal with these issues by proposing an ontology-based approach for the modeling of payments and linked obligation settlements that reuses reference ontologies to create ontology-based modeling patterns, which are then applied to model the domain-related concepts.
- Paper 9 – Towards an Ontology Network in Finance and Economics
Authors: Glenda Amaral, Tiago Prince Sales and Giancarlo Guizzardi
Abstract: Finance and economics are wide domains, where ontologies are useful instruments for dealing with semantic interoperability and information integration problems, as well as improving communication and problem solving among people. In particular, reference ontologies have been widely recognized as a key enabling technology for representing a model of consensus within a community to support communication, meaning negotiation, consensus establishment, as well as semantic interoperability and information integration. In domains like economics and finance, which are too large and complex to be represented as a single, large and monolithic ontology, it is necessary to create an ontological framework, built incrementally and in an integrated way, as a network. Therefore, in this paper we introduce OntoFINE, an Ontology Network in Finance and Economics that organizes and integrates knowledge in the realm of finance and economics, serving as a basis for several applications. We discuss the development of OntoFINE and present some of its applications.
- Paper 10 – Business Driven Microservice Design – An Enterprise Ontology Based Approach to API Specifications
Authors: Marien Krouwel and Martin Op T Land
Abstract: As technology is evolving rapidly and market demand is changing quicker than ever, many are trying to implement service orientation and adopt market standards to improve adaptivity. A microservice architecture makes applications easier to scale and faster to develop, enabling innovation and accelerating time-to-market for new features. The question then arises how to design a manageable and stable set of microservices that is sufficient for the business. In this paper we systematically deduce an algorithm to derive, from an ontological model, a set of microservices expressed according to the OpenAPI standard that is stable by nature, based on units of similar size, and sufficient for the business. This algorithm has been evaluated with the Social Housing case at OrgY by creating a SwaggerHub implementation. Further research should clarify the role of implementation choices in the algorithm.
- Paper 11 – Towards the X-theory: an evaluation of the perceived Quality and Functionality of DEMO’s Process Model
Authors: Dulce Pacheco, David Aveiro, Duarte Pinto and Bernardo Gouveia
Abstract: The Design and Engineering Methodology for Organizations (DEMO) comprises a set of models and diagrams to represent an organization. A proposal for a new Process Diagram and a Transaction Description Table fuses the contents of DEMO’s Process and Cooperation models and claims to have achieved a more agile and comprehensive solution to depict the essence of organizational reality. We designed and conducted a pilot study to evaluate the perceived Quality and Functionality of the former and new diagrams. Our study was designed to collect feedback both from a group of professionals experienced in the modelled processes (N = 8) and a group experienced in the modelling language DEMO (N = 14). Subjects attended a presentation about the old and the new diagrams and then filled out a questionnaire. Our study supports the claims that the new way to represent the Process Model is more accessible and easier to grasp, not only by the professionals working with those processes but also by students with knowledge of DEMO. These findings lay the groundwork and first steps toward the development of the X-theory, which aims for more effective representations of DEMO models with a sound theoretical and empirical basis.
- Paper 12 – Evaluation of the perceived quality and functionality of Fact Model diagrams in DEMO
Authors: Dulce Pacheco, David Aveiro, Bernardo Gouveia and Duarte Pinto
Abstract: DEMO’s Way of Modelling comprises a set of models and diagrams to represent an organization. They are correlated with each other, representing the organizational reality in a coherent and platform-independent way; however, previous authors argue that the syntax and semantics of DEMO’s Fact Model are too complex and cluttered, making it difficult for laymen to interpret. Therefore, a novel version of DEMO’s Fact Model was introduced. Authors of this newer version claim that the synthesization and expressive power of the improved Fact Model make it possible to overcome the complexity and intricacies of the processes and to create representations that are easily understood and productively discussed by the full range of stakeholders regardless of their technical prowess, experience, or background. To evaluate the Quality and Functionality, as well as compare the subjects’ preferences for the former Fact Model diagrams and the newer representations, we designed a pilot study (N = 22) where subjects from different backgrounds and with different expertise evaluated the Quality and Functionality of these diagrams. The subjects attended a presentation about the old and the new Fact Model diagrams and assessed their Empirical Quality, Social Pragmatic Quality and Functionality. Our study supports previous claims that, when comparing the former and the new representations of the Fact Model, the newer version was evaluated as having a higher level of Quality and Functionality and was also the preferred version of the vast majority of the subjects in the pilot study.
- Doctoral Consortium Session 1 – PhD Research Design on “How can organisational learning be leveraged to optimise the resilience of an organisation?”
Candidate: Edzo A. Botjes and Tim Huygh
Abstract: The current VUCA world demands that organisations be resilient and sometimes even antifragile. The domain that focuses on staying relevant is risk management. Information security is a sub-domain of risk management where the threats and the responses to them are very well documented. Within the sub-domain of information security there is an ongoing rat race between the people who want to exploit a threat and the people who react to the threat, for example by mitigating it. In this research we want to look into the role of the learning organisation in the resilient behaviour of the organisation. Why the learning organisation? Because some scholars argue that human resilience is the key to organisational resilience.
- Doctoral Consortium Session 2 – A first step towards a new TAO roadmap, a guide for sensemaking, architecturing, designing, and changing enterprises
Candidate: Hans Nouwens
Abstract: This paper is a request for comments on a research setup and a first conceptualisation and expression of a new TAO roadmap: an addition to the arrangement of Teleology – Affordance – Ontology, to guide sensemaking, architecturing, designing, and changing of enterprises. Where existing theory depicts a linear subject – affordance – object relation, in practice the processes and use of language in architecturing and designing are much more interwoven. By clarifying these we strive to improve the quality of the processes, positively contributing to enterprise cohesion and improving enterprise viability. Using the design science research methodology, we argue the initial relevance and rigor, propose a first artefact (a conceptualisation with several visualisations), and present initial evaluations.
Previous Session
Program overview – Friday Nov. 12th
This schedule is in the GMT time zone (London, Lisbon).
- Paper 1 – Adapting and evaluating the story-card-method
Author: Marne De Vries
Abstract: Domain modelling languages (DMLs) grow and change over time. These languages are artefacts that are developed within communities via multiple participants. Methods associated with the emerging DMLs also need to be supported and adapted, informed by practice. This study refers to a DML called DEMO (Design and Engineering Methodology for Organizations), of which the language specification evolved from the DEMO Specification Language (DEMOSL) version 3 to version 4. We adapt a method, called the story-card-method (SCM), to accommodate DEMOSL 4. Also, the previous DEMOSL 3-based SCM implied physical interaction between participants, using sticky notes to create a shared understanding, whereas the adapted SCM has to facilitate a digital way of collaboration due to COVID-19 restrictions. We revisit participant feedback from the initial version of the SCM and demonstrate how we applied design science research to design an adapted SCM as the main contribution of this article. In addition, we evaluate whether the adapted SCM is useful in providing ample guidance in compiling a Coordination Structure Diagram (CSD) in a collaborative way, and we evaluate the quality of the CSDs.
- Paper 2 – A Reference Motivation Layer for Smart Health – an Enterprise Architecture Approach
Authors: Alberto Silva, André Vasconcelos and Helena Alves
Abstract: The concept of smart health has emerged with the aim of improving citizens’ quality of life and providing better healthcare services. As the cost of medical services increases and the population ages, along with time and space constraints, existing healthcare systems are facing great challenges. The implementation of smart health solutions imposes a set of requirements, best practices, concerns and motivations. We conducted a systematic literature review (SLR) with the purpose of identifying the key motivation elements that should be present in smart health solutions. Based on the conclusions of this SLR, we propose an enterprise architecture for smart health solutions that can be used as a reference model and a set of guidelines for city authorities and other decision makers to follow.