Document Type

Journal Article

Abstract

In any modelling activity, a framework for determining the maturity of a developed model before its use is highly advantageous. Such a framework would save modellers considerable time in many areas of information systems, and it would also lower the risk of users relying on an incomplete or inaccurate model. In this paper, we develop a framework that uses internal inconsistencies as a quantitative indicator for estimating the completeness and correctness of a model as it is cooperatively evolved. Whilst internal inconsistencies stem from a poor fit between different parts of a model, we argue that they are also correlated with how well the evolved model fits the "world". This argument underpins our framework for evaluating integrated models. The contributions of this paper are threefold: firstly, it presents a theoretically grounded framework for integrating models. We extend an existing incremental modelling framework, NRDR, which represents multiple hierarchical restricted domains (MHRD), with automatic concept integration so that NRDR can deal with multiple experts. Secondly, we couple this integration framework with a theoretically grounded monitoring process to assess the quality of the cooperatively developed model. Thirdly, we report an initial empirical study of our evaluation and integration framework in a computer hardware administration domain: we capture and integrate computer hardware models from several experts and use our evaluation framework to assess the resultant cooperative model.

RIS ID

15625
