I'm passionate about ontologies. There's nothing I like better than opening up a simple text editor (TextEdit or Notepad, take your pick) and typing RDF/XML syntax for OWL. Here I talk about my experiences in ontology and ontology-based application development. WARNING: Contents may be unfit for non-supergeek consumption.
Sunday, December 4, 2011
Filling a gap in my academic experience
Looking through my CV, I see one glaring gap: in all the years since I finished my PhD, I have never authored, or even been involved in preparing, a research proposal. Until now. The very good news is that I am lead author on a proposal I am putting together with plenty of help from my colleagues Jaehwan John Lee from the ECE Department (where I'm research faculty) and Jake Chen in the School of Informatics at IUPUI. It's not really Semantic Web-related; the project is about using parallel computing architectures and compatible software libraries to accelerate bioinformatics applications. Given my background in software development and bioinformatics, I will be a core contributor. And I am very excited. I also believe parallel computational strategies may help mitigate the tractability and scalability issues of, for example, OWL DL ontologies, so there may be something I learn here that I can use in future research work. It never hurts to know! Wish me luck!
Saturday, September 17, 2011
How to get exactly the definition I need and nothing more... Is MIREOT the answer?
I am back to writing small-scale ontologies for my research prototypes. In the interest of reusability, I would like to extend definitions of my atomic concepts from concepts in high-level ontologies such as the Basic Formal Ontology (BFO), the Information Artifact Ontology (IAO), and the General Formal Ontology (GFO). I would also like to reuse relation definitions from the OBO Foundry's Relation Ontology and Michel Dumontier's extensions. However, a simple import of any of these ontologies in my ontology development tool (TopBraid) is enough to bring the whole application to its knees. Let us say I want to define a concept Fusion as a subconcept of the high-level GFO concept Chronoid. Fusion is the root concept of my ontology, and I only need to extend one concept from GFO. Instead, I am stuck importing the ENTIRE GFO ontology, lock, stock, and barrel. The only alternative is to create "concept silos," isolated from the rest of the Semantic Web. Rather ironic. Wouldn't it be nice to use just what I want and leave everything else out?
That was the idea behind MIREOT (Minimum Information to Reference an External Ontology Term), a nice little workaround developed by the good folks in the OBI (Ontology for Biomedical Investigations) workgroup. MIREOT can actually be used as a verb, as in, "I MIREOTed the BiologicalProcess concept from the Gene Ontology." Compare that with importing all of GO! MIREOT has been around for a few years now; however, there seems to be no MIREOT plugin available for popular ontology development tools like Protege and TopBraid. I'm not sure about the Sentient Knowledge Explorer from IO Informatics, a very nice-looking tool I was introduced to during my postdoctoral stint with Mark Wilkinson at UBC. That may be something worth looking into.
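To make this concrete, here is a minimal sketch, using Apache Jena, of what a MIREOT-style stub for the Chronoid example above might look like. The URIs are illustrative and should be checked against the published GFO release, and a real MIREOT record carries a few more annotations than this; the point is that only the external term's URI, a label, and its place in the hierarchy are copied into the local ontology, with no owl:imports of the full GFO file.

import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.Resource;
import org.apache.jena.vocabulary.OWL;
import org.apache.jena.vocabulary.RDF;
import org.apache.jena.vocabulary.RDFS;

public class MireotStub {
    public static void main(String[] args) {
        // Illustrative namespaces; verify the GFO URI against the published ontology.
        String GFO = "http://www.onto-med.de/ontologies/gfo.owl#";
        String MY  = "http://example.org/my-ontology#";

        Model m = ModelFactory.createDefaultModel();
        m.setNsPrefix("gfo", GFO);
        m.setNsPrefix("my", MY);

        // Minimum information about the external term: URI, label, and a pointer
        // back to the ontology that defines it. Nothing else from GFO is copied.
        Resource chronoid = m.createResource(GFO + "Chronoid")
                .addProperty(RDF.type, OWL.Class)
                .addProperty(RDFS.label, "Chronoid")
                .addProperty(RDFS.isDefinedBy,
                             m.createResource("http://www.onto-med.de/ontologies/gfo.owl"));

        // My root concept simply extends the stub.
        m.createResource(MY + "Fusion")
                .addProperty(RDF.type, OWL.Class)
                .addProperty(RDFS.subClassOf, chronoid);

        m.write(System.out, "TURTLE");
    }
}

Loading the resulting file into TopBraid gives Fusion its place under Chronoid without dragging the rest of GFO along.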
Another idea. Why do we need to import these huge ontologies in the first place? Given network connection speeds today, wouldn't it be even easier for an agent to check a URI to confirm that a referenced concept exists remotely, and then create a subconcept (or even a subrelation) link from the referencing ontology to the referenced concept? This would work just the same as a hyperlink in Web 1.0. Thoughts?
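For what it's worth, here is roughly what that agent-side check could look like, again with Jena and illustrative URIs: dereference the concept's URI, confirm the concept is actually described in the RDF served there, and only then add the cross-ontology link.

import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.Resource;
import org.apache.jena.riot.RDFDataMgr;
import org.apache.jena.vocabulary.RDFS;

public class RemoteReferenceCheck {
    public static void main(String[] args) {
        // Illustrative URIs; they may not resolve exactly like this in practice.
        String chronoidUri = "http://www.onto-med.de/ontologies/gfo.owl#Chronoid";
        String fusionUri   = "http://example.org/my-ontology#Fusion";

        Model local = ModelFactory.createDefaultModel();

        // Fetch whatever RDF the remote URI serves; Jena handles content negotiation.
        Model remote = RDFDataMgr.loadModel(chronoidUri);
        Resource chronoid = remote.createResource(chronoidUri);

        if (remote.containsResource(chronoid)) {
            // The concept exists remotely: link to it, hyperlink-style,
            // without importing the whole GFO file.
            local.createResource(fusionUri).addProperty(RDFS.subClassOf, chronoid);
            local.write(System.out, "TURTLE");
        } else {
            System.err.println("Referenced concept not found at " + chronoidUri);
        }
    }
}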
Saturday, August 13, 2011
Improving my TopSPIN forehand: Working with TBC and SPIN
In my new position as visiting research faculty at IUPUI, I am working on synthesizing sensor-based systems from infrared sensors and allied processing algorithms for visualization and classification, driven by mission requirements. For me, this marks a return to where I started off during my PhD years at the University of Memphis. Back then, I developed the very first version of the OntoSensor ontology for the Center for Advanced Sensors. Since then, OntoSensor has grown into a definitive resource for sensor descriptions. It gives me great satisfaction to learn this work has 72 citations. OK, quite a few of them are from the same group under Dr. David Russomanno, who is now Dean of the Purdue School of Engineering and Technology at IUPUI. But still...
This project follows up on some good work by Joseph Qualls. In his PhD thesis work, he used an ontological framework to describe mission tasks and objectives, and combined it with the OntoSensor ontology and a simple ontology describing algorithms, to synthesize novel systems that bring together sensors and algorithms to satisfy a mission objective. Examples of these simple objectives are "identify human carrying a backpack" and "generate a silhouette profile of target at coordinates." Algorithms implemented in Java are coupled to output data through SPARQL CONSTRUCT queries. The project has been implemented using TopBraid Composer (TBC), the licensed ontology development software developed at TopQuadrant.
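As a rough illustration of that coupling (with Apache Jena and made-up vocabulary, not the actual OntoSensor or mission ontology terms), a CONSTRUCT query can assert a direct link between an algorithm and any sensor whose output it can consume:

import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.vocabulary.RDF;

public class ConstructCoupling {
    public static void main(String[] args) {
        // Hypothetical namespace standing in for the project vocabulary.
        String EX = "http://example.org/mission#";
        Model model = ModelFactory.createDefaultModel();

        // One infrared sensor producing thermal images...
        model.createResource(EX + "irSensor1")
             .addProperty(RDF.type, model.createResource(EX + "InfraredSensor"))
             .addProperty(model.createProperty(EX + "producesDataType"),
                          model.createResource(EX + "ThermalImage"));
        // ...and one algorithm that accepts thermal images as input.
        model.createResource(EX + "silhouetteExtractor")
             .addProperty(RDF.type, model.createResource(EX + "Algorithm"))
             .addProperty(model.createProperty(EX + "acceptsDataType"),
                          model.createResource(EX + "ThermalImage"));

        // CONSTRUCT a direct "canProcessOutputOf" link wherever the data types match.
        String query =
            "PREFIX ex: <" + EX + ">\n" +
            "CONSTRUCT { ?alg ex:canProcessOutputOf ?sensor }\n" +
            "WHERE { ?sensor a ex:InfraredSensor ; ex:producesDataType ?dt .\n" +
            "        ?alg    a ex:Algorithm      ; ex:acceptsDataType  ?dt . }";

        try (QueryExecution qe = QueryExecutionFactory.create(query, model)) {
            Model inferred = qe.execConstruct();
            inferred.write(System.out, "TURTLE");
        }
    }
}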
After all these years working with open-source ontology development tools like Protege and Swoop, it is something new for me to work with a licensed product like TBC. TBC is built upon the Eclipse platform and brings together a lot of the current state-of-the-art Web 3.0 technologies, such as OWLIM, Jena, and Pellet.
To this, TBC adds its own in-house inferencing framework, SPIN (the SPARQL Inferencing Notation). SPIN modules and templates attempt to circumvent the reliance of full-blown OWL 2 ontologies on restrictions, which cause well-documented tractability and scalability issues. SPIN is actually mapped to a fragment (or profile) of OWL 2 called OWL 2 RL. SPIN allows constraints and inferencing rules to be added to simple RDF classes in much the same way as OWL, while dispensing with complications such as cardinality restrictions. SPIN modules also allow Java functions to be invoked from a SPARQL query using the LET and BIND constructs. A Java function can be wrapped in a SPIN module, allocated a URI that can be shared on the Web, saved, and subsequently invoked as part of a SPARQL CONSTRUCT or even a SELECT query. For more, consult Holger Knublauch's blog. Whether SPIN rules and templates can be used outside the TBC environment is an open question.
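SPIN's Java-function mechanism itself is TopBraid-specific, but the underlying idea, wrapping a Java method behind a URI so that a SPARQL query can BIND its result to a variable, can be sketched with Jena ARQ's custom function registry. The function URI and class below are hypothetical, and this is plain ARQ rather than SPIN proper:

import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.ResultSetFormatter;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.sparql.expr.NodeValue;
import org.apache.jena.sparql.function.FunctionBase1;
import org.apache.jena.sparql.function.FunctionRegistry;

public class CustomFunctionDemo {
    /** A trivial Java function exposed to SPARQL: squares a numeric argument. */
    public static class Square extends FunctionBase1 {
        @Override
        public NodeValue exec(NodeValue v) {
            double d = v.getDouble();
            return NodeValue.makeDouble(d * d);
        }
    }

    public static void main(String[] args) {
        // Register the function under a shareable URI (hypothetical namespace).
        String fnUri = "http://example.org/fn#square";
        FunctionRegistry.get().put(fnUri, Square.class);

        Model model = ModelFactory.createDefaultModel();
        String query =
            "PREFIX myfn: <http://example.org/fn#>\n" +
            "SELECT ?result WHERE { BIND(myfn:square(7) AS ?result) }";

        try (QueryExecution qe = QueryExecutionFactory.create(query, model)) {
            ResultSetFormatter.out(qe.execSelect());  // prints the squared value
        }
    }
}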
To avoid this potential pitfall, I am also looking into encapsulating each sensor and algorithm as a Semantic Web service that can be dynamically configured and invoked in the same fashion as in biological workflow engines such as Taverna. Talk about learning from past experiences!