We begin at the beginning, with an outline of Aristotle’s views on ontology and with a discussion of the influence of these views on Linnaeus. We move from there to consider the data standardization initiatives launched in the 19th century, and then turn to investigate how the idea of computational ontologies developed in the AI and knowledge representation communities in the closing decades of the 20th century. We show how aspects of this idea, particularly those relating to the use of the term 'concept' in ontology development, influenced SNOMED CT and other medical terminologies. Against this background we then show how the Foundational Model of Anatomy, the Gene Ontology, Basic Formal Ontology and other OBO Foundry ontologies came into existence and discuss their role in the development of contemporary biomedical informatics.
Biomedical ontologies exist to serve integration of clinical and experimental data, and it is critical to their success that they be put to widespread use in the annotation of data. How, then, can ontologies achieve the sort of user-friendliness, reliability, cost-effectiveness, and breadth of coverage that is necessary to ensure extensive usage? Methods: Our focus here is on two different sets of answers to these questions that have been proposed, on the one hand in medicine, by the SNOMED CT community, and on the other hand in biology, by the OBO Foundry. We address more specifically the issue of how adherence to certain development principles can advance the usability and effectiveness of an ontology or terminology resource, for example by allowing more accurate maintenance, more reliable application, and more efficient interoperation with other ontologies and information resources. Results: SNOMED CT and the OBO Foundry differ considerably in their general approach. Nevertheless, a general trend towards more formal rigor and cross-domain interoperability can be seen in both, and we argue that this trend should be accepted by all similar initiatives in the future. Conclusions: Future efforts in ontology development have to address the need for harmonization and integration of ontologies across disciplinary borders, and for this, coherent formalization of ontologies is a prerequisite.
To enhance the treatment of relations in biomedical ontologies we advance a methodology for providing consistent and unambiguous formal definitions of the relational expressions used in such ontologies in a way designed to assist developers and users in avoiding errors in coding and annotation. The resulting Relation Ontology can promote interoperability of ontologies and support new types of automated reasoning about the spatial and temporal dimensions of biological and medical phenomena.
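As an illustration of the style of definition involved (a sketch in the all-some pattern used by the Relation Ontology, not quoted from it), a class-level part_of relation can be defined in terms of its instance-level counterpart together with time-indexed instantiation:
\[ C\ \mathit{part\_of}\ C_1 \;=_{\mathrm{def}}\; \forall c, t\, [\, \mathrm{inst}(c, C, t) \rightarrow \exists c_1\, (\mathrm{inst}(c_1, C_1, t) \wedge \mathit{part\_of}(c, c_1, t)) \,] \]
That is, every instance of C is, at any time at which it instantiates C, part of some instance of C_1; definitions of this form are what allow reasoners to move unambiguously between class-level and instance-level assertions.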
While representation learning techniques have shown great promise in application to a number of different NLP tasks, they have had little impact on the problem of ontology matching. Unlike past work that has focused on feature engineering, we present a novel representation learning approach that is tailored to the ontology matching task. Our approach is based on embedding ontological terms in a high-dimensional Euclidean space. This embedding is derived on the basis of a novel phrase retrofitting strategy through which semantic similarity information becomes inscribed onto fields of pre-trained word vectors. The resulting framework also incorporates a novel outlier detection mechanism based on a denoising autoencoder that is shown to improve performance. An ontology matching system derived using the proposed framework achieved an F-score of 94% on an alignment scenario involving the Adult Mouse Anatomical Dictionary and the Foundational Model of Anatomy ontology (FMA) as targets. This compares favorably with the best performing systems on the Ontology Alignment Evaluation Initiative anatomy challenge. We performed additional experiments on aligning FMA to NCI Thesaurus and to SNOMED CT based on a reference alignment extracted from the UMLS Metathesaurus. Our system obtained overall F-scores of 93.2% and 89.2% for these experiments, thus achieving state-of-the-art results.
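The following minimal sketch illustrates only the general idea of embedding-based term matching; it is not the authors' system (their phrase retrofitting and denoising-autoencoder components are omitted), and the toy vectors, the embed_phrase and match helpers, and the 0.8 threshold are assumptions made for the example.

    # Illustrative sketch: align ontology term labels by cosine similarity of
    # averaged word vectors (toy data; a real system would load pre-trained vectors).
    import numpy as np

    word_vectors = {
        "ascending": np.array([0.9, 0.1, 0.0]),
        "aorta":     np.array([0.1, 0.9, 0.2]),
        "cardiac":   np.array([0.2, 0.1, 0.9]),
        "valve":     np.array([0.0, 0.3, 0.8]),
    }

    def embed_phrase(phrase):
        # Average the vectors of the words in a term label (a crude stand-in
        # for the retrofitted phrase vectors described in the abstract).
        vecs = [word_vectors[w] for w in phrase.lower().split() if w in word_vectors]
        return np.mean(vecs, axis=0) if vecs else np.zeros(3)

    def cosine(u, v):
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

    def match(source_terms, target_terms, threshold=0.8):
        # Propose an alignment for each source label: its nearest target label,
        # kept only if the similarity clears the threshold.
        alignments = []
        for s in source_terms:
            scored = [(cosine(embed_phrase(s), embed_phrase(t)), t) for t in target_terms]
            best_score, best_term = max(scored)
            if best_score >= threshold:
                alignments.append((s, best_term, round(best_score, 3)))
        return alignments

    print(match(["Ascending aorta"], ["Aorta ascending", "Cardiac valve"]))
    # -> [('Ascending aorta', 'Aorta ascending', 1.0)]

Systems of the kind the abstract describes replace the averaging step with learned phrase vectors and add outlier filtering, but the matching loop has this overall shape.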
The National Center for Biomedical Ontology is a consortium that comprises leading informaticians, biologists, clinicians, and ontologists, funded by the National Institutes of Health (NIH) Roadmap, to develop innovative technology and methods that allow scientists to record, manage, and disseminate biomedical information and knowledge in machine-processable form. The goals of the Center are (1) to help unify the divergent and isolated efforts in ontology development by promoting high-quality open-source, standards-based tools to create, manage, and use ontologies, (2) to create new software tools so that scientists can use ontologies to annotate and analyze biomedical data, (3) to provide a national resource for the ongoing evaluation, integration, and evolution of biomedical ontologies and associated tools and theories in the context of driving biomedical projects (DBPs), and (4) to disseminate the tools and resources of the Center and to identify, evaluate, and communicate best practices of ontology development to the biomedical community. Through the research activities within the Center, collaborations with the DBPs, and interactions with the biomedical community, our goal is to help scientists to work more effectively in the e-science paradigm, enhancing experiment design, experiment execution, data analysis, information synthesis, hypothesis generation and testing, and understanding of human disease.
The National Center for Biomedical Ontology is now in its seventh year. The goals of this National Center for Biomedical Computing are to: create and maintain a repository of biomedical ontologies and terminologies; build tools and web services to enable the use of ontologies and terminologies in clinical and translational research; educate its trainees and the scientific community broadly about biomedical ontology and ontology-based technology and best practices; and collaborate with a variety of groups who develop and use ontologies and terminologies in biomedicine. The centerpiece of the National Center for Biomedical Ontology is a web-based resource known as BioPortal. BioPortal makes available for research in computationally useful forms more than 270 of the world's biomedical ontologies and terminologies, and supports a wide range of web services that enable investigators to use the ontologies to annotate and retrieve data, to generate value sets and special-purpose lexicons, and to perform advanced analytics on a wide range of biomedical data.
Knowledge-making practices in biology are being strongly affected by the availability of data on an unprecedented scale, the insistence on systemic approaches and growing reliance on bioinformatics and digital infrastructures. What role does theory play within data-intensive science, and what does that tell us about scientific theories in general? To answer these questions, I focus on Open Biomedical Ontologies, digital classification tools that have become crucial to sharing results across research contexts in the biological and biomedical sciences, and argue that they constitute an example of classificatory theory. This form of theorizing emerges from classification practices in conjunction with experimental know-how and expresses the knowledge underpinning the analysis and interpretation of data disseminated online.
An accurate classification of bacteria is essential for the proper identification of patient infections and subsequent treatment decisions. Multi-Locus Sequence Typing (MLST) is a genetic technique for bacterial classification. MLST classifications are used to cluster bacteria into clonal complexes. Importantly, clonal complexes can serve as a biological species concept for bacteria, facilitating an otherwise difficult taxonomic classification. In this paper, we argue for the inclusion of terms relating to clonal complexes in biomedical ontologies.
The meeting focused on uses of ontologies, with particular attention to spatial ontologies, in addressing the ever-increasing needs faced by biology and medicine to cope with ever-expanding quantities of data. To provide effective solutions, computers need to integrate data deriving from myriad heterogeneous sources by bringing the data together within a single framework. The meeting brought together leaders in the field of what are called "top-level ontologies" to address this issue, and to establish strategies among leaders in the field of biomedical ontology for the creation of interoperable biomedical ontologies which will serve the goal of useful data integration.
Background: In order to improve ontology quality, tool- and language-related tutorials are not sufficient. Care must be taken to provide optimized curricula for teaching the representational language in the context of a semantically rich upper-level ontology. The constraints provided by rigid top- and upper-level models assure that the ontologies built are not only logically consistent but also adequately represent the domain of discourse and align to explicitly outlined ontological principles. Finally, such a curriculum must take into account the pre-existing skills and knowledge of the target audience. Objective: To develop a well-structured curriculum aligned to the particular requirements of life science professionals, in order to enable them to create logically sound, domain-adequate and predictable ontologies using the Web Ontology Language (OWL) in Protégé. Methods: Content selection for the curriculum was based on the literature, pre-existing tutorials, and a guideline for good ontology development (i.e., ontology design enhancing domain adequacy, sustainability and interoperability) that drew on the authors' previous experiences with large ontology development projects. Learning objectives were formulated according to a needs assessment of the targeted learners, who were students trained in the life sciences with basic knowledge and practical skills in computer science. As instructional format we chose an approach with a high amount of practical exercises. The curriculum was first implemented with 24 students and 7 lecturers/tutors over 5 full days. The curriculum was evaluated by gathering the participants' feedback via a questionnaire. Results: Curricular development produced 16 modules of approximately 2 hours each, which covered basic principles of Applied Ontology, description logic syntax and semantics, as well as best design practices outlined in ontology design patterns and variants of the BioTop upper ontology. An opinion survey based on questionnaires indicated that the participants benefited from the teaching strategies applied, as they reported good knowledge gain and acknowledged the relevance of the modules. The difficulty was rated slightly lower. Conclusion: The development of teaching material for principled ontology design and best practices is of crucial importance in order to enhance the quality of biomedical ontologies. Here, we present a curriculum for a week-long workshop, leveraging current educational principles and focusing on interactive hands-on exercises, group interactions, and problem-oriented learning. Whereas the evaluation clearly showed the success of this approach, in particular regarding students' satisfaction, the objective measurement of traceable effects on the quality of the generated ontology, although of much higher interest, has just started.
We present a novel methodology for calculating the improvements obtained in successive versions of biomedical ontologies. The theory takes into account changes both in reality itself and in our understanding of this reality. The successful application of the theory rests on the willingness of ontology authors to document changes they make by following a number of simple rules. The theory provides a pathway by which ontology authoring can become a science rather than an art, following principles analogous to those that have fostered the growth of modern evidence-based medicine. Although in this paper we focus on ontologies, the methodology can be generalized to other sorts of terminology-based artifacts, including Electronic Patient Records.
The Foundational Model of Anatomy (FMA) symbolically represents the structural organization of the human body from the macromolecular to the macroscopic levels, with the goal of providing a robust and consistent scheme for classifying anatomical entities that is designed to serve as a reference ontology in biomedical informatics. Here we articulate the need for formally clarifying the is-a and part-of relations in the FMA and similar ontology and terminology systems. We diagnose certain characteristic errors in the treatment of these relations and show how these errors can be avoided through adoption of the formalism we describe. We then illustrate how a consistently applied formal treatment of taxonomy and partonomy can support the alignment of ontologies.
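An example of the kind of error at issue (a standard illustration in this literature, offered here as a sketch rather than as the paper's own case): under the all-some reading, the class-level part_of and has_part relations are not inter-derivable from one another, so one direction can hold while the other fails:
\[ \mathit{Testis}\ \mathit{part\_of}\ \mathit{Human}\ \text{(every testis is part of some human)}, \quad \text{but not}\quad \mathit{Human}\ \mathit{has\_part}\ \mathit{Testis}\ \text{(not every human has a testis as part)}. \]
Treating the two readings as interchangeable, like treating is-a and part-of as interchangeable, yields hierarchies that license invalid inferences.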
The central hypothesis of the collaboration between Language and Computing (L&C) and the Institute for Formal Ontology and Medical Information Science (IFOMIS) is that the methodology and conceptual rigor of a philosophically inspired formal ontology greatly benefits application ontologies.[1] To this end LinKBase®, L&C’s ontology, which is designed to integrate and reason across various external databases simultaneously, has been submitted to the conceptual demands of IFOMIS’s Basic Formal Ontology (BFO).[2] With this project we aim to move beyond the level of controlled vocabularies to yield an ontology with the ability to support reasoning applications. Our general procedure has been the implementation of a meta-ontological definition space in which the definitions of all the concepts and relations in LinKBase® are standardized in a framework of first-order logic. In this paper we describe how this standardization has already led to an improvement in the LinKBase® structure that allows for a greater degree of internal coherence than ever before possible. We then show the use of this philosophical standardization for the purpose of mapping external databases to one another, using LinKBase® as translation hub, with a greater degree of success than possible hitherto. We demonstrate how this offers a genuine advance over other application ontologies that have not submitted themselves to the demands of philosophical scrutiny.
The integration of biomedical terminologies is indispensable to the process of information integration. When terminologies are linked merely through the alignment of their leaf terms, however, differences in context and ontological structure are ignored. Making use of the SNAP and SPAN ontologies, we show how three reference domain ontologies can be integrated at a higher level, through what we shall call the OBR framework (for: Ontology of Biomedical Reality). OBR is designed to facilitate inference across the boundaries of domain ontologies in anatomy, physiology and pathology.
PURPOSE—A substantial fraction of the observations made by clinicians and entered into patient records are expressed by means of negation or by using terms which contain negative qualifiers (as in “absence of pulse” or “surgical procedure not performed”). This seems at first sight to present problems for ontologies, terminologies and data repositories that adhere to a realist view and thus reject any reference to putative non-existing entities. Basic Formal Ontology (BFO) and Referent Tracking (RT) are examples of such paradigms. The purpose of the research here described was to test a proposal to capture negative findings in electronic health record systems based on BFO and RT. METHODS—We analysed a series of negative findings encountered in 748 sentences taken from 41 patient charts. We classified the phenomena described in terms of the various top-level categories and relations defined in BFO, taking into account the role of negation in the corresponding descriptions. We also studied terms from SNOMED-CT containing one or other form of negation. We then explored ways to represent the described phenomena by means of the types of representational units available to realist ontologies such as BFO. RESULTS—We introduced a new family of ‘lacks’ relations into the OBO Relation Ontology. The relation lacks_part, for example, defined in terms of the positive relation part_of, holds between a particular p and a universal U when p has no instance of U as part. Since p and U both exist, assertions involving ‘lacks_part’ and its cognates meet the requirements of positivity. CONCLUSION—By expanding the OBO Relation Ontology, we were able to accommodate nearly all occurrences of negative findings in the sample studied.
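The lacks_part relation described here can be written out in a first-order sketch that follows the abstract's own wording (the time index t is an added assumption, reflecting the time-indexing such relations standardly carry in this framework):
\[ \mathit{lacks\_part}(p, U, t) \;=_{\mathrm{def}}\; \neg \exists q\, [\, \mathrm{inst}(q, U, t) \wedge \mathit{part\_of}(q, p, t) \,] \]
Since the definition quantifies only over the existing particular p and the existing universal U, an assertion of the form lacks_part(p, Appendix, t), with Appendix as an arbitrary illustrative universal, commits to no putative non-existent entity, which is what makes it acceptable on the realist view described above.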
Biomedical research is increasingly a matter of the navigation through large computerized information resources deriving from functional genomics or from the biochemistry of disease pathways. To make such navigation possible, controlled vocabularies are needed in terms of which data from different sources can be unified. One of the most influential developments in this regard is the so-called Gene Ontology, which consists of controlled vocabularies of terms used by biologists to describe cellular constituents, biological processes and molecular functions, organized into hierarchies via the relation of class subsumption. Here we seek to provide a rigorous account of the logic of classification that underlies GO and similar biomedical ontologies. Drawing on Aristotle, we develop a system of axioms and definitions for the treatment of biological classes and instances.
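One representative axiom of the kind developed (a sketch in the instance-based style this literature uses, not quoted from the paper) defines class subsumption by quantification over instances and times:
\[ A\ \mathit{is\_a}\ B \;=_{\mathrm{def}}\; \forall x, t\, [\, \mathrm{inst}(x, A, t) \rightarrow \mathrm{inst}(x, B, t) \,] \]
On this reading, placing a term beneath a parent in the GO hierarchy is a claim about every instance of the child class, which is what licenses automated reasoning over annotations made with the ontology.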
The Ontology for Biomedical Investigations (OBI) is an ontology that provides terms with precisely defined meanings to describe all aspects of how investigations in the biological and medical domains are conducted. OBI re-uses ontologies that provide a representation of biomedical knowledge from the Open Biological and Biomedical Ontologies (OBO) project and adds the ability to describe how this knowledge was derived. We here describe the state of OBI and several applications that are using it, such as adding semantic expressivity to existing databases, building data entry forms, and enabling interoperability between knowledge resources. OBI covers all phases of the investigation process, such as planning, execution and reporting. It represents information and material entities that participate in these processes, as well as roles and functions. Prior to OBI, it was not possible to use a single internally consistent resource that could be applied to multiple types of experiments for these applications. OBI has made this possible by creating terms for entities involved in biological and medical investigations and by importing parts of other biomedical ontologies such as GO, Chemical Entities of Biological Interest (ChEBI) and Phenotype Attribute and Trait Ontology (PATO) without altering their meaning. OBI is being used in a wide range of projects covering genomics, multi-omics, immunology, and catalogs of services. OBI has also spawned other ontologies (Information Artifact Ontology) and methods for importing parts of ontologies (Minimum information to reference an external ontology term (MIREOT)). The OBI project is an open cross-disciplinary collaborative effort, encompassing multiple research communities from around the globe. To date, OBI has created 2366 classes and 40 relations along with textual and formal definitions. The OBI Consortium maintains a web resource providing details on the people, policies, and issues being addressed in association with OBI.
In previous work on biomedical ontologies we showed how the provision of formal definitions for relations such as is_a and part_of can support new types of automated reasoning about biomedical phenomena. We here extend this approach to the transformation_of relation characteristic of pathologies.
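Roughly, and as a sketch in the style of the Relation Ontology's time-indexed definitions rather than a quotation, transformation_of can be rendered: C transformation_of C_1 holds when every instance of C was at some earlier time an instance of C_1, and nothing instantiates both classes at the same time:
\[ C\ \mathit{transformation\_of}\ C_1 \;=_{\mathrm{def}}\; \forall c, t\, [\, \mathrm{inst}(c, C, t) \rightarrow \exists t_1\, ( t_1 < t \wedge \mathrm{inst}(c, C_1, t_1) ) \,] \;\wedge\; \neg \exists c, t\, [\, \mathrm{inst}(c, C, t) \wedge \mathrm{inst}(c, C_1, t) \,] \]
The point for pathology is that the same continuant persists while the class it instantiates changes, as when an adenoma becomes a carcinoma.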
BACKGROUND: In biomedical ontologies, mereological relations have always been subject to special interest due to their high relevance in structural descriptions of anatomical entities, cells, and biomolecules. This paper investigates two important subrelations of has_proper_part, viz. the relation has_grain, which relates a collective entity to its multiply occurring uniform parts (e.g., water molecules in a portion of water), and the relation has_component, which relates a compound to its constituents (e.g., molecules to the atoms they consist of). METHOD: We distinguish between four kinds of complex entities and characterize them in first-order logic. We then discuss whether similar characterizations could be given in description logics, and finally apply the results to mixtures. RESULTS: At first sight, collectives and compounds seem to be disjoint categories. Their disjointness, however, relies on agreement about what are uniform entities, and thus on the granularity of description. For instance, the distinction between isomeric subtypes of a molecule can be important in one use case but might be neglected in another one. We demonstrate that, as implemented in the BioTop domain upper-level ontology, equivalence or subsumption between different descriptions of the same or similar entities cannot be achieved. Using OWL-DL, we propose a new design pattern that avoids primitive subrelations at the expense of more complex descriptions and thus supports the needed inferences.
Ontology is one strategy for promoting interoperability of heterogeneous data through consistent tagging. An ontology is a controlled structured vocabulary consisting of general terms (such as “cell” or “image” or “tissue” or “microscope”) that form the basis for such tagging. These terms are designed to represent the types of entities in the domain of reality that the ontology has been devised to capture; the terms are provided with logical definitions, thereby also supporting reasoning over the tagged data. Aim: This paper provides a survey of the biomedical imaging ontologies that have been developed thus far. It outlines the challenges faced in particular by ontologies in the fields of histopathological imaging and image analysis, and suggests a strategy for addressing these challenges in the example domain of quantitative histopathology imaging. The ultimate goal is to support the multiscale understanding of disease that comes from using interoperable ontologies to integrate imaging data with clinical and genomics data.
Current approaches to formal representation in biomedicine are characterized by their focus on either the static or the dynamic aspects of biological reality. We here outline a theory that combines both perspectives and at the same time tackles the by no means trivial issue of their coherent integration. Our position is that a good ontology must be capable of accounting for reality both synchronically (as it exists at a time) and diachronically (as it unfolds through time), but that these are two quite different tasks, whose simultaneous realization is by no means trivial. The paper is structured as follows. We begin by laying out the methodological and philosophical background of our approach. We then summarize the structure and elements of the Basic Formal Ontology on which it rests, in particular the SNAP ontology of objects and the SPAN ontology of processes. Finally, we apply the general framework to the specific domain of biomedicine.
In the management of biomedical data, vocabularies such as ontologies and terminologies (O/Ts) are used for (i) domain knowledge representation and (ii) interoperability. The knowledge representation role supports the automated reasoning on, and analysis of, data annotated with O/Ts. At an interoperability level, the use of a communal vocabulary standard for a particular domain is essential for large data repositories and information management systems to communicate consistently with one another. Consequently, the interoperability benefit of selecting a particular O/T as a standard for data exchange purposes is often seen by the end-user as a function of the number of applications using that vocabulary (and, by extension, the size of the user base). Furthermore, the adoption of an O/T as an interoperability standard requires confidence in its stability and guaranteed continuity as a resource.
Successful biomedical data mining and information extraction require a complete picture of biological phenomena such as genes, biological processes, and diseases, as these exist on different levels of granularity. To realize this goal, several freely available heterogeneous databases as well as proprietary structured datasets have to be integrated into a single global customizable scheme. We will present a tool to integrate different biological data sources by mapping them to a proprietary biomedical ontology that has been developed for the purposes of making computers understand medical natural language.
The automatic integration of rapidly expanding information resources in the life sciences is one of the most challenging goals facing biomedical research today. Controlled vocabularies, terminologies, and coding systems play an important role in realizing this goal, by making it possible to draw together information from heterogeneous sources – for example pertaining to genes and proteins, drugs and diseases – secure in the knowledge that the same terms will also represent the same entities on all occasions of use. In the naming of genes, proteins, and other molecular structures, considerable efforts are under way to reduce the effects of the different naming conventions which have been spawned by different groups of researchers. Electronic patient records, too, increasingly involve the use of standardized terminologies, and tremendous efforts are currently being devoted to the creation of terminology resources that can meet the needs of a future era of personalized medicine, in which genomic and clinical data can be aligned in such a way that the corresponding information systems become interoperable.
Objectives: Medical decision support and other intelligent applications in the life sciences depend on increasing amounts of digital information. Knowledge bases as well as formal ontologies are being used to organize biomedical knowledge and data. However, these two kinds of artefacts are not always clearly distinguished. Whereas the popular RDF(S) standard provides an intuitive triple-based representation, it is semantically weak. Description logic-based ontology languages like OWL-DL carry a clear-cut semantics, but they are computationally expensive, and they are often misinterpreted to encode all kinds of statements, including those which are not ontological. Method: We distinguish four kinds of statements needed to comprehensively represent domain knowledge: universal statements, terminological statements, statements about particulars and contingent statements. We argue that the task of formal ontologies is solely to represent universal statements, while the non-ontological kinds of statements can nevertheless be connected with ontological representations. To illustrate these four types of representations, we use a running example from parasitology. Results: We finally formulate recommendations for semantically adequate ontologies that can efficiently be used as a stable framework for more context-dependent biomedical knowledge representation and reasoning applications like clinical decision support systems.
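A rough illustration of the fourfold distinction (a hypothetical example, not the paper's own parasitology case): a universal statement quantifies over all instances of a class, e.g. \( \forall x\, [\, \mathrm{inst}(x, \mathit{MalariaInfection}) \rightarrow \exists y\, (\mathrm{inst}(y, \mathit{Plasmodium}) \wedge \mathit{caused\_by}(x, y)) \,] \); a terminological statement is about terms rather than things, e.g. "'paludism' is a synonym of 'malaria'"; a statement about particulars concerns named instances, e.g. \( \mathrm{inst}(\mathit{infection17}, \mathit{MalariaInfection}) \); and a contingent statement (on one natural reading) holds of some but not necessarily all instances, e.g. "some Anopheles mosquitoes carry Plasmodium". On the view summarized above, only statements of the first kind belong in the formal ontology proper.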
The goal of the OBO (Open Biomedical Ontologies) Foundry initiative is to create and maintain an evolving collection of non-overlapping interoperable ontologies that will offer unambiguous representations of the types of entities in biological and biomedical reality. These ontologies are designed to serve non-redundant annotation of data and scientific text. To achieve these ends, the Foundry imposes strict requirements upon the ontologies eligible for inclusion. While these requirements are not met by most existing biomedical terminologies, the latter may nonetheless support the Foundry’s goal of consistent and non-redundant annotation if appropriate mappings of data annotated with their aid can be achieved. To construct such mappings in reliable fashion, however, it is necessary to analyze terminological resources from an ontologically realistic perspective in such a way as to identify the exact import of the ‘concepts’ and associated terms which they contain. We propose a framework for such analysis that is designed to maximize the degree to which legacy terminologies and the data coded with their aid can be successfully used for information-driven clinical and translational research.
Software application ontologies have the potential to become the keystone in state-of-the-art information management techniques. It is expected that these ontologies will support the sort of reasoning power required to navigate large and complex terminologies correctly and efficiently. Yet, there is one problem in particular that continues to stand in our way. As these terminological structures increase in size and complexity, and the drive to integrate them inevitably swells, it is clear that the level of consistency required for such navigation will become correspondingly difficult to maintain. While descriptive semantic representations are certainly a necessary component to any adequate ontology-based system, so long as ontology engineers rely solely on semantic information, without a sound ontological theory informing their modeling decisions, this goal will surely remain out of reach. In this paper we describe how Language and Computing nv (L&C), along with The Institute for Formal Ontology and Medical Information Sciences (IFOMIS), are working towards developing and implementing just such a theory, combining the open software architecture of L&C’s LinkSuite™ with the philosophical rigor of IFOMIS’s Basic Formal Ontology. In this way we aim to move beyond the more or less simple controlled vocabularies that have dominated the industry to date.
The central hypothesis of the collaboration between Language and Computing (L&C) and the Institute for Formal Ontology and Medical Information Science (IFOMIS) is that the methodology and conceptual rigor of a philosophically inspired formal ontology greatly benefits application ontologies. To this end LinKBase®, L&C’s ontology, which is designed to integrate and reason across various external databases simultaneously, has been submitted to the conceptual demands of IFOMIS’s Basic Formal Ontology (BFO). With this project we aim to move beyond the level of controlled vocabularies to yield an ontology with the ability to support reasoning applications. Our general procedure has been the implementation of a meta-ontological definition space in which the definitions of all the concepts and relations in LinKBase® are standardized in a framework of first-order logic. In this paper we describe how this standardization has already led to an improvement in the LinKBase® structure that allows for a greater degree of internal coherence than ever before possible. We then show the use of this philosophical standardization for the purpose of mapping external databases to one another, using LinKBase® as translation hub, with a greater degree of success than possible hitherto. We demonstrate how this offers a genuine advance over other application ontologies that have not submitted themselves to the demands of philosophical scrutiny. LinKBase® is one of the world’s largest applications-oriented medical domain ontologies, and BFO is one of the world’s first philosophically driven reference ontologies. The collaboration of the two thus initiates a new phase in the quest to solve the so-called “Tower of Babel” problem.
The value of any kind of data is greatly enhanced when it exists in a form that allows it to be integrated with other data. One approach to integration is through the annotation of multiple bodies of data using common controlled vocabularies or ‘ontologies’. Unfortunately, the very success of this approach has led to a proliferation of ontologies which itself creates obstacles to integration. The Open Biomedical Ontologies (OBO) consortium has set in train a strategy to overcome this problem. Existing OBO ontologies, including the Gene Ontology, are undergoing a process of coordinated reform, and new ontologies are being created on the basis of an evolving set of shared principles governing ontology development. The result is an expanding family of ontologies designed to be interoperable, logically well-formed, and to incorporate accurate representations of biological reality. We describe the OBO Foundry initiative, and provide guidelines for those who might wish to become involved.
Biomedical ontologies are considered a serious innovation for biomedical research and clinical practice. They promise to integrate information coming from different biological databases, thus creating a common ground for the representation of knowledge in all the life sciences. Such a tool has potentially many implications for both basic biomedical research and clinical practice. Here I discuss how this tool has been generated and conceived. Through the analysis of some empirical cases I try to elaborate how biomedical ontologies constitute a novelty also from an epistemological point of view.
The central hypothesis of the collaboration between Language and Computing (L&C) and the Institute for Formal Ontology and Medical Information Science (IFOMIS) is that the methodology and conceptual rigor of a philosophically inspired formal ontology will greatly benefit software application ontologies. To this end LinKBase®, L&C’s ontology, which is designed to integrate and reason across various external databases simultaneously, has been submitted to the conceptual demands of IFOMIS’s Basic Formal Ontology (BFO). With this, we aim to move beyond the level of controlled vocabularies to yield an ontology with the ability to support reasoning applications.
Biomedical terminologies are focused on what is general, Electronic Health Records (EHRs) on what is particular, and it is commonly assumed that the step from the one to the other is unproblematic. We argue that this is not so, and that, if the EHR of the future is to fulfill its promise, then the foundations of both EHR architectures and biomedical terminologies need to be reconceived. We accordingly describe a new framework for the treatment of both generals and particulars in biomedical information systems that is designed: 1) to provide new opportunities for the sharing and management of data within and between healthcare institutions, 2) to facilitate interoperability among different terminology and record systems, and thereby 3) to allow new kinds of reasoning with biomedical data.
As biological and biomedical research increasingly references the environmental context of the biological entities under study, the need for formalisation and standardisation of environment descriptors is growing. The Environment Ontology (ENVO) is a community-led, open project which seeks to provide an ontology for specifying a wide range of environments relevant to multiple life science disciplines and, through an open participation model, to accommodate the terminological requirements of all those needing to annotate data using ontology classes. This paper summarises ENVO’s motivation, content, structure, adoption, and governance approach.