
Ontology and the Power of Understanding

March 30, 2015


The term “semantic web” has been bandied about for a few years now. It’s supposed to be the next big thing in web search and promises to make searches much more relevant and meaningful. Amit Singhal (@theamitsinghal), a Google Fellow and Senior Vice President, explained to Lance Ulanoff (@LanceUlanoff), Chief Correspondent and Editor-at-Large of Mashable, why semantics is so important to web searches. Concerning their conversation, Ulanoff writes, “Search, Singhal explained, started as a content-based, keyword index task that changed little in the latter half of the 20th century, until the arrival of the World Wide Web, that is. Suddenly search had a new friend: links. Google, Amit said, was the first to use links as ‘recommendation surrogates.’ In those early days, Google based its results on content links and the authority of those links. Over time, Google added a host of signals about content, keywords and you to build an even better query result.”[1] Ulanoff continues, “Eventually Google transitioned from examining keywords to meaning.” Singhal told him that the reason for the switch was obvious. “We realized that the words ‘New’ and ‘York’ appearing next to each other suddenly changed the meaning of both those words.” As a result, “Google developed statistical heuristics that recognized that those two words appearing together is a new kind of word. However, Google really did not yet understand that New York is a city, with a population and particular location.” In order to take that next step, Google needed “a system [that could] transform words that appear on a page into entities that mean something and have related attributes. It’s what the human brain does naturally.”

 

Google calls its system a Knowledge Graph, which technically is a simple and limited ontology. Let me explain. A knowledge graph is an interrelation of specific data and can be built dynamically from data. For example, a search to determine whether someone has a spouse could be extended to many family relations by writing additional extractors and inference rules. Such a search would generate a knowledge graph of people and how they are related. In contrast, an ontology also models the concepts behind the knowledge and the interrelationships of those concepts. With an ontology, one can not only ask interesting questions about the data but also answer questions that could not be answered simply by looking at the data. For example, one can ask an ontology, “What is implied by people being married?” (e.g., they were engaged, had a wedding, commonly raise children, live together, own communal property, can get divorced, share assets, can inherit property by default, have tax benefits, and so on). With a knowledge graph, questions are limited to direct implications of the data within the graph (e.g., does a specific person have children?).
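The distinction can be sketched in a few lines of code. This is a minimal illustration, not Google’s actual Knowledge Graph; all names, facts, and rules below are hypothetical. The knowledge graph can only answer questions about the triples it holds, while the ontology fragment can answer concept-level questions about what a relationship implies in general.

```python
# Illustrative sketch: a knowledge graph stores concrete facts,
# while an ontology adds concept-level rules that license answers
# not explicitly present in the data. All names are hypothetical.

# Knowledge graph: concrete (subject, relation, object) triples.
facts = {
    ("Alice", "spouse", "Bob"),
    ("Alice", "child", "Carol"),
}

# Ontology fragment: what the *concept* "spouse" implies in general.
concept_rules = {
    "spouse": ["had_a_wedding", "may_share_assets", "can_divorce"],
}

def direct_answer(subject, relation):
    """A knowledge graph answers only from the data itself."""
    return [o for (s, r, o) in facts if s == subject and r == relation]

def implied_by(relation):
    """An ontology can say what a relationship implies, with no data at all."""
    return concept_rules.get(relation, [])

print(direct_answer("Alice", "spouse"))  # ['Bob'] -- data-level question
print(implied_by("spouse"))              # concept-level question
```

Note that `implied_by` needs no facts about any particular person, which is exactly what separates an ontology from a bare knowledge graph.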

 

Pete Ianace (@IanacePete), Executive Vice President at Datafloq, explains, “An ontology formally represents knowledge as a hierarchy of concepts within a domain, using a shared vocabulary to denote the types, properties and interrelationships of those concepts. Ontologies are the structural frameworks for organizing information and are used in artificial intelligence, the Semantic Web, systems engineering, software engineering, biomedical informatics, library science, enterprise bookmarking, and information architecture as a form of knowledge representation about the world or some part of it. The creation of domain ontologies is also fundamental to the definition and use of an enterprise architecture framework.”[2] In other words, the use of an ontology is critical if one wants to develop true cognitive computing. At Enterra Solutions®, our Cognitive Reasoning Platform™ (CRP) uses the world’s largest common sense ontology, which can be adapted to provide industry-specific solutions.

 

To understand why an ontology is necessary for better understanding, consider a simple word like “tank.” By itself, you have no idea whether it means an armored military vehicle, a container for liquid, or even whether it’s a noun. It could be a verb (an enterprise can tank, i.e., fail) or an adjective (e.g., a tank top). Even if you determine that the tank under consideration is a container that holds liquid, you don’t know what kind of liquid (e.g., it could be a gas tank or a water tank). By understanding relationships, an ontology can help a computer understand the meaning of “tank” so that improper insights or incorrect conclusions are not generated. Jim Benedetto, co-founder and CTO of Gravity, writes, “The simplest way to imagine an ontology is as a graph that shows how things are connected to each other.”[3] Benedetto uses the example of a professional athlete, but I prefer using someone like Bill Gates to demonstrate how a simple ontology can relate things.
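A toy disambiguator makes the point concrete. The senses and context cues below are invented for illustration; a real ontology would encode far richer relationships, but the principle is the same: pick the sense whose related concepts best match the surrounding words.

```python
# Illustrative sketch of using relationship knowledge to disambiguate
# a word like "tank". Senses and context cues are hypothetical.

senses = {
    "tank/vehicle":   {"is_a": "armored military vehicle",
                       "cues": {"army", "battle", "turret"}},
    "tank/container": {"is_a": "container for liquid",
                       "cues": {"water", "gas", "fuel", "fill"}},
    "tank/verb":      {"is_a": "to fail",
                       "cues": {"stocks", "sales", "company"}},
}

def disambiguate(context_words):
    """Pick the sense whose related concepts overlap the context most."""
    best = max(senses.items(),
               key=lambda kv: len(kv[1]["cues"] & context_words))
    # If nothing overlaps, admit we don't know rather than guess.
    return best[0] if best[1]["cues"] & context_words else None

print(disambiguate({"fill", "the", "water"}))  # tank/container
print(disambiguate({"army", "turret"}))        # tank/vehicle
```

With no overlapping cues the sketch returns `None`, which mirrors the point above: without relationship knowledge, “tank” alone tells you nothing.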

[Figure: a simple ontology graph relating Bill Gates to associated entities]

As Benedetto points out, a simple graph like this only shows a small subset of the myriad things to which someone like Bill Gates is actually connected. He adds that a good ontology will “have millions of entities and abstract concepts all interconnected with hundreds of millions of edges. Topics run the gambit from every person of note throughout history to every song ever recorded to diseases of every flavor.” Ianace notes that in a business setting an ontology is important because “it eliminates the need to integrate systems and applications when looking for critical data or trends.”

 

Benjamin Recchie, Communications Manager at the University of Chicago Research Computing Center, notes, “Better understanding of natural language morphology can lead to better designed human-machine interfaces and a better way to search large databases.”[4] Words do morph, and an ontology must be updated constantly to keep pace. Singhal told Ulanoff that Google’s goal is to develop a computer as capable as the one used on the Starship Enterprise in the Star Trek television series. He noted that you could ask the Star Trek computer “virtually any question and get an intelligent answer.” Ulanoff, a self-professed robot geek, pictures ontologies being used to inform robots how to better interface with humans. He writes:

“Future robots with access to Google’s entity-based search engine might be able to understand that the ‘tiny baby’ they’re caring for (What? You wouldn’t leave your baby with a robot?) is small, fragile and always hungry. The robot might even know how to feed the baby because it would know the entity ‘always hungry’ has to be cross-referenced with the fact that it’s a ‘baby,’ which is also an entity in the knowledge graph, and includes attributes like ‘no solids.'”

It’s not just robots, however, that need access to that kind of information. A cognitive computing system that uses a good ontology could recommend recipes that account for more than taste preferences. Ulanoff’s example is that a baby can’t eat solid food, but some people have dietary restrictions based on religion, allergies, or personal preference (e.g., vegetarians). All of those factors can be accounted for in a cognitive computing system that uses an ontology. In fact, there are few challenges to which a cognitive computing system using an ontology can’t be applied. Such systems also have a better chance of working through the logical conundrums that arise when rules clash. Humans face such challenges frequently. And like humans, cognitive computing systems learn as they ingest information and as their associated ontologies grow. Singhal told Ulanoff, “The beauty of the human mind is that it can build things and decide things in ways we didn’t think were possible.” He hopes that Google’s Knowledge Graph will also become “a tool to aid the creation of more knowledge. It’s an endless quantitative cycle of creativity.” To be more useful, however, the Knowledge Graph must evolve into a much richer ontology.
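The cross-referencing Ulanoff describes, where the entity “baby” carries attributes like “no solids” that constrain a recommendation, can be sketched as a simple attribute check. All entities, attributes, and recipes below are hypothetical placeholders for what a real ontology would encode.

```python
# Illustrative sketch: cross-referencing entity attributes (as in the
# baby / "no solids" example) to filter recommendations.
# All entities, attributes, and recipes are hypothetical.

# What an ontology might assert about a type of person.
entity_attributes = {
    "baby":       {"no_solids"},
    "vegetarian": {"no_meat"},
}

# Properties of each candidate recipe.
recipes = {
    "steak":           {"solids", "meat"},
    "vegetable_puree": set(),
}

def suitable(person_type, recipe):
    """A recipe is unsuitable if it has a property the entity forbids."""
    forbidden = {attr.removeprefix("no_")
                 for attr in entity_attributes.get(person_type, set())}
    return not (recipes[recipe] & forbidden)

print(suitable("baby", "steak"))            # False -- babies: no solids
print(suitable("baby", "vegetable_puree"))  # True
```

The same check handles the religious, allergy, and vegetarian cases mentioned above: each is just another set of attributes attached to an entity in the graph. (`str.removeprefix` requires Python 3.9+.)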

 

The beauty of a cognitive computing system is that it can discover new (and often surprising) relationships among databases. One key to ensuring that those relationships are real and not coincidental is through the use of a good ontology. Ianace puts it this way, “Ontology uses a unique combination of an inherently agile, graph-based semantic model and semantic search to reduce the timescale and cost of complex data integration challenges. Ontology is rethinking data acquisition, data correlation and data migration projects in a post-Google world.”

 

Footnotes
[1] Lance Ulanoff, “Google Knowledge Graph Could Change Search Forever,” Mashable, 13 February 2012.
[2] Pete Ianace, “The Role Ontology plays in Big Data,” Datafloq, 21 January 2015.
[3] Jim Benedetto, “What is an Ontology?”
[4] Benjamin Recchie, “Billions of Words: Visualizing Natural Language,” Scientific Computing, 27 February 2015.
