A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2017; you can also visit the original URL.
Introduction
[chapter]
2008
Lecture Notes in Computer Science
The Web has now been in existence for quite some time and is pervasive in its influence on all aspects of society and commerce. It has also produced a major shift in our thinking on the nature and scope of information processing. However, in its technological nature and its supporting theoretical foundations it was relatively rudimentary, being largely suitable for information dissemination. It is rapidly moving away from this, towards application deployment and knowledge deployment that require complex interactions and properly structured underlying semantics. There has been a sudden upsurge of research activity in the problems associated with adding semantics to the Web. This work on semantics will involve data, knowledge, and process semantics.
doi:10.1007/978-3-540-89784-2_1
fatcat:ewc2brfrzndjha7dgcfruj2h3a
A Markup Language for ORM Business Rules
2002
<dc:title>ORM-ML example</dc:title> <dc:creator>Jan Demey</dc:creator> <dc:description>A complete example of an ORM-ML file</dc:description> <dc:contributor>Mustafa Jarrar</dc:contributor> <dc:contributor>Robert Meersman</dc:contributor> </ORMMeta> ...
dblp:conf/rml/DemeyJM02
fatcat:vm4q4fbddvddrlbw62kns2bk54
Data modelling versus ontology engineering
2002
SIGMOD record
Ontologies in current computer science parlance are computer-based resources that represent agreed domain semantics. Unlike data models, the fundamental asset of ontologies is their relative independence of particular applications, i.e. an ontology consists of relatively generic knowledge that can be reused by different kinds of applications/tasks. The first part of this paper concerns some aspects that help to understand the differences and similarities between ontologies and data models. In the second part we present an ontology engineering framework that supports and favours the genericity of an ontology. We introduce the DOGMA ontology engineering approach that separates "atomic" conceptual relations from "predicative" domain rules. A DOGMA ontology consists of an ontology base that holds sets of intuitive context-specific conceptual relations and a layer of "relatively generic" ontological commitments that hold the domain rules. This constitutes what we shall call the double articulation of a DOGMA ontology.
doi:10.1145/637411.637413
fatcat:qbweqk5c25gqtidwxnkoqrf7i4
Ontology Engineering – The DOGMA Approach
[chapter]
2008
Lecture Notes in Computer Science
This chapter presents a methodological framework for ontology engineering (called DOGMA), which aims to guide ontology builders towards building ontologies that are both highly reusable and usable, and easier to build and maintain. We survey the main foundational challenges in ontology engineering and analyse to what extent one can build an ontology independently of the application requirements at hand. We discuss ontology reusability versus ontology usability and present the DOGMA approach, its philosophy and formalization, which prescribe that an ontology be built as a separate domain axiomatization and application axiomatizations. While a domain axiomatization focuses on the characterization of the intended meaning (i.e. intended models) of a vocabulary at the domain level, application axiomatizations focus on the usability of this vocabulary according to certain application/usability perspectives and specify the legal models (a subset of the intended models) of the application(s)' interest. We show how specification languages (such as ORM, UML, EER, and OWL) can be effectively (re)used in ontology engineering.
doi:10.1007/978-3-540-89784-2_2
fatcat:hiaishqfqrerleqmxs4f54rxo4
5. The uniqueness rule in ORM is equivalent to a 0:1 cardinality restriction; it can be verbalized as "each book must have at most one ISBN".
6. The mandatory rule in ORM is equivalent to a 1:m cardinality restriction; it can be verbalized as "each book must have at least one ISBN".
7. Also called "ontology models".
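The footnoted equivalences above (ORM uniqueness as an at-most-one restriction, ORM mandatory as an at-least-one restriction) can be sketched as checks over a set of binary facts. This is an illustrative sketch only: the fact tuples and function names below are assumptions, not part of the chapter.

```python
from collections import defaultdict

# Binary facts: (subject, predicate, object), e.g. a Book has an ISBN.
facts = [
    ("Book1", "has_isbn", "978-0-13-468599-1"),
    ("Book2", "has_isbn", "978-1-49-197936-6"),
]

def violates_uniqueness(facts, predicate):
    """ORM uniqueness ~ 0:1 cardinality: at most one object per subject."""
    count = defaultdict(int)
    for subj, pred, _ in facts:
        if pred == predicate:
            count[subj] += 1
    return [subj for subj, n in count.items() if n > 1]

def violates_mandatory(facts, predicate, subjects):
    """ORM mandatory ~ 1:m cardinality: at least one object per subject."""
    covered = {subj for subj, pred, _ in facts if pred == predicate}
    return [subj for subj in subjects if subj not in covered]

print(violates_uniqueness(facts, "has_isbn"))                              # []
print(violates_mandatory(facts, "has_isbn", ["Book1", "Book2", "Book3"]))  # ['Book3']
```

Each verbalized rule ("each book must have at most/at least one ISBN") maps directly onto one of the two checks.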
Optimal Use of Information in Certain Iterative Processes
[chapter]
1976
Analytic Computational Complexity
Architecting Ontology for Scalability and Versatility
[chapter]
2005
Lecture Notes in Computer Science
This paper discusses methodological strategies for architecting ontologies. The development context is an EC IST project aimed at the use of ontology to help detect and prevent financial fraud. Ontology engineering in this context faces challenges of the scalability of the ontology, its applicability in multiple applications, and its semantic consistency and comprehensiveness, among others. The paper discusses architecting strategies, layering and modularizing with design patterns, used in ... topical ontologies of fraud and VAT.

Conceptual vs. Formal Modeling. Knowledge engineering often faces a contention of purposes: either capturing domain expertise as it is, with its understatements, ambiguity, multiple perspectives, fuzziness and inconsistency, or representing it as mathematical objects and processes that are consistent, unambiguous, and rigorous for dedicated use in a given problem-solving paradigm. We refer to the former as 'conceptual modeling', in recognition of human cognitive potential, and to the latter as 'formal' for its axiomatic emphasis and computational requirements.
doi:10.1007/11575801_42
fatcat:kuc6qqfjevezdjih5opoutnx5m
Ontology-Based Customer Complaint Management
[chapter]
2003
Lecture Notes in Computer Science
..., Verlinden R. and Meersman R.: Ontology-based Consumer Complaint Management. ...
doi:10.1007/978-3-540-39962-9_63
fatcat:xsw3gt5xrngntgz7fgehvdjs5e
Versioning of Technical Documents — Design and Implementation
[chapter]
1992
Integrated Management of Technical Documentation
Assisting Ontology Integration with Existing Thesauri
[chapter]
2004
Lecture Notes in Computer Science
keywords: ontology alignment and merging
number: STAR-2004-19
date: 2/09/2004
corresponding author: Peter Spyns
status: final
reference: Meersman R., Tari Z. et al. ...
Fig. 3. The merge operation
Fig. 4. The merge result of the two example ontologies Ωs and Ωt
Fig. 5. Mediator approach for data integration
Jan De Bo, Peter Spyns & Robert ...
doi:10.1007/978-3-540-30468-5_51
fatcat:zncjv37nnfetvg7qgsbpojlpwa
Formal Ontology Engineering in the DOGMA Approach
[chapter]
2002
Lecture Notes in Computer Science
This paper presents a specifically database-inspired approach (called DOGMA) for engineering formal ontologies, implemented as shared resources used to express agreed formal semantics for a real-world domain. We address several related key issues, such as knowledge reusability and shareability, scalability of the ontology engineering process and methodology, efficient and effective ontology storage and management, and the coexistence of heterogeneous rule systems that surround an ontology mediating between it and application agents. Ontologies should represent a domain's semantics independently of "language", while any process that creates elements of such an ontology must be entirely rooted in some (natural) language, and any use of it will necessarily be through a (in general an agent's computer) language. To achieve the claims stated, we explicitly decompose ontological resources into ontology bases, in the form of simple binary facts called lexons, and into so-called ontological commitments, in the form of description rules and constraints. Ontology bases, in a logic sense, become "representationless" mathematical objects which constitute the range of a classical interpretation mapping from a first-order language, assumed to lexically represent the commitment or binding of an application or task to such an ontology base. Implementations of ontologies become database-like on-line resources in the model-theoretic sense. The resulting architecture allows us to materialize the (crucial) notion of commitment as a separate layer of (software agent) services, mediating between the ontology base and those application instances that commit to the ontology. We claim it also leads to methodological approaches that naturally extend key aspects of database modeling theory and practice. We discuss examples of the prototype DOGMA implementation of the ontology base server and commitment server.
doi:10.1007/3-540-36124-3_78
fatcat:rfr722ilr5ckjkmdnmi3hqmg2m
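The decomposition described in the abstract above — an ontology base of simple binary facts (lexons) plus a separate layer of ontological commitments — can be sketched in a few lines. The five-field lexon shape follows the DOGMA literature (context, term, role, co-role, term); the example lexons and the particular commitment rule are illustrative assumptions, not taken from the paper.

```python
from typing import NamedTuple

class Lexon(NamedTuple):
    """A lexon: a context-specific binary fact type."""
    context: str
    term1: str
    role: str
    co_role: str
    term2: str

# The ontology base: a plain set of lexons, free of application rules.
ontology_base = {
    Lexon("Bibliography", "Book", "has", "is_of", "ISBN"),
    Lexon("Bibliography", "Book", "written_by", "writes", "Author"),
    Lexon("Commerce", "Order", "contains", "is_in", "Item"),
}

def commitment(base):
    """A commitment selects the lexons an application binds to and layers
    its own constraints on top (here: only bibliographic Book facts)."""
    return {lx for lx in base if lx.context == "Bibliography" and lx.term1 == "Book"}

committed = commitment(ontology_base)
print(len(committed))  # 2
```

Keeping the base and the commitment as separate objects mirrors the paper's point that many applications can commit, each in its own way, to one shared ontology base.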
Towards Community-Based Evolution of Knowledge-Intensive Systems
[chapter]
2007
Lecture Notes in Computer Science
This article addresses the need for a research effort and framework that studies and embraces the novel, difficult but crucial issues of the adaptation of knowledge resources to their respective user communities, and vice versa, as a fundamental property of knowledge-intensive internet systems. Through a deep understanding of real-time, community-driven evolution of so-called ontologies, a knowledge-intensive system can be made operationally relevant and sustainable over longer periods of time. To bootstrap our framework, we adopt and extend the DOGMA ontology framework, and its community-grounded ontology engineering methodology DOGMA-MESS, with an ontology that models community concepts such as business rules, norms, policies, and goals as first-class citizens of the ontology evolution process. In doing so, ontology evolution can be tailored to the needs of a particular community. Finally, we illustrate with an example from an actual real-world problem setting, viz. the interorganisational exchange of HR-related knowledge.
doi:10.1007/978-3-540-76848-7_65
fatcat:r6kapqjd3zcltctfp3uy6t6tfu
On Using Conceptual Data Modeling for Ontology Engineering
[chapter]
2003
Lecture Notes in Computer Science
This paper tackles two main disparities between conceptual data schemes and ontologies, which should be taken into account when (re)using conceptual data modeling techniques for building ontologies. Firstly, conceptual schemes are intended to be used during design phases and not at the runtime of applications, while ontologies are typically used and accessed at runtime. To handle this first difference, we define a conceptual markup language (ORM-ML) that allows us to represent ORM conceptual diagrams in an open, textual syntax, so that ORM schemes can be shared, exchanged, and processed at the run-time of autonomous applications. Secondly, unlike ontologies, which are supposed to hold application-independent domain knowledge, conceptual schemes were developed only for use by an enterprise's application(s), i.e. "in-house" usage. Hence, we present an ontology engineering framework that enables reusing conceptual modeling approaches in modeling and representing ontologies. In this approach we prevent application-specific knowledge from entering or being mixed with domain knowledge. To end, we present DogmaModeler: an ontology-engineering tool that implements the ideas presented in the paper.
doi:10.1007/978-3-540-39733-5_8
fatcat:orurbeymejcbhjjcimjnau5gvy
1. Conceptual relations can be unary relations (usually called "concepts", or object types as they are called in ORM), or n-ary relations, which are also called fact types in ORM.
2. Rules are, formally, well-formed formulae defined in an ontology (or a conceptual schema) in order to specify and constrain the intended models that can hold. In conceptual data modeling they are commonly called "constraints". Notice that rules can be used, e.g., to enforce integrity, derivation and inference, taxonomy, etc.
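Because ORM-ML is an open, textual (XML) syntax, an autonomous application can load and process a schema at run time with an ordinary XML parser, which is the point of the first contribution above. The element and attribute names in this sketch are hypothetical stand-ins, not the actual ORM-ML vocabulary.

```python
import xml.etree.ElementTree as ET

# A minimal ORM-ML-like document; tag names are illustrative, not the real schema.
orm_ml = """
<ORMSchema>
  <ObjectType name="Book"/>
  <ObjectType name="ISBN"/>
  <FactType id="f1">
    <Role player="Book" name="has"/>
    <Role player="ISBN" name="is_of"/>
  </FactType>
</ORMSchema>
"""

root = ET.fromstring(orm_ml)
object_types = [ot.get("name") for ot in root.findall("ObjectType")]
fact_roles = [(r.get("player"), r.get("name")) for r in root.findall(".//Role")]

print(object_types)  # ['Book', 'ISBN']
print(fact_roles)    # [('Book', 'has'), ('ISBN', 'is_of')]
```

The run-time consumer needs no knowledge of the design tool that produced the schema — only of the shared textual syntax, which is what makes exchange between autonomous applications possible.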
Semantically Unlocking Database Content Through Ontology-Based Mediation
[chapter]
2005
Lecture Notes in Computer Science
Meersman at the Database Management Research Lab (Brussels) of Control Data. ...
doi:10.1007/978-3-540-31839-2_9
fatcat:jq5wmcknxjea5aw3hv3tzx5jeq
A survey of techniques in applied computational complexity
1975
Journal of Computational and Applied Mathematics
An attempt is made to introduce the non-expert reader to the many aspects of a relatively new and varied field which seems to be at the same time analysis, algebra and computer science. Computational complexity can be roughly described as the theory of optimizing finite and infinite algorithms for use on digital computers. Even for "simple" problems like finding a zero of a real function, or even evaluating a polynomial, surprisingly deep techniques are necessary. A representative sample of the presently existing bibliography on the subject is included at the end.
doi:10.1016/0771-050x(75)90005-4
fatcat:dyx3k2snbvcdpot33glnotccu4