Better Practices for Collaborative Knowledge Graph Modeling

As the scale and scope of the modern data ecosystem grow, we at EK see an increasing number of enterprises recognizing the potential for ontologies to make their data findable and usable throughout the organization. An ontology can provide robust support for standardized data definitions, enterprise-wide alignment among data producers and consumers, and interoperability among diverse data sources and systems, but adopting one usually requires a shift away from legacy data models that were not meeting these needs.

An ontology or knowledge graph cannot be modeled in isolation: to make sure the ontology meets business needs and technical requirements, knowledge engineers must partner with and consult stakeholders, domain experts, technical teams, data producers, and data consumers. In addition, a larger project might require a team of knowledge engineers working together to model the domain in question, which carries with it the risk of duplicative effort, divergent modeling approaches, and unclear direction.

These risks can threaten near-term progress goals, long-term sustainability, and the ability of the ontology or knowledge graph to power next-generation data usage across the organization. At EK we have drawn on our extensive experience designing ontologies and implementing knowledge graph solutions to develop a framework of better practices for collaborative modeling among knowledge engineers to help mitigate these risks and drive the immediate and ongoing success of collaborative modeling projects.

Establish Roles and Responsibilities

Dividing labor effectively among a team of modelers has two parts. First, ensuring individual focus and eliminating duplicated effort requires clear, team-wide consensus and alignment on roles and responsibilities. Second, facilitating in-flight work and productive review requires a process that records explicitly who was, is, and will be responsible for each piece of the modeling effort.

In order to facilitate focused modeling and reduce duplication of effort, establish clear roles and responsibilities among members of the team. Dividing the work so that each team member is able to take the lead on a conceptual area of the model allows that team member to become a localized subject-matter expert and facilitates cohesive modeling. While the optimal approach to this division will depend on the specifics of the data and domain, dividing the work by subject area, data element, or data source is often an effective strategy.

Then, make it easy to track who on the team is responsible for which parts of the model. The mechanism can be as simple as a column in a shared spreadsheet or as structured as a dedicated Jira board, as long as it is accessible and comprehensible to everyone on the team. This transparency helps prevent duplication of effort and allows feedback to be targeted appropriately. You can amplify these benefits by also implementing a version control system for your ontology.
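As a minimal sketch of what such a responsibility register might look like in code, the snippet below models assignments of subject areas to team members and answers the question "what is this person responsible for?" The subject areas, names, and status values are hypothetical placeholders, and in practice this information would live in your spreadsheet or ticketing tool rather than a script.

```python
# A minimal, hypothetical responsibility register for a modeling team.
from dataclasses import dataclass

@dataclass
class Assignment:
    subject_area: str
    owner: str
    status: str  # e.g. "planned", "in progress", "in review", "done"

register = [
    Assignment("Customer", "A. Modeler", "in review"),
    Assignment("Product", "B. Modeler", "in progress"),
    Assignment("Transaction", "A. Modeler", "planned"),
]

def areas_owned_by(owner: str) -> list[str]:
    """Return the subject areas a given team member is responsible for."""
    return [a.subject_area for a in register if a.owner == owner]

print(areas_owned_by("A. Modeler"))  # ['Customer', 'Transaction']
```

Whatever tool holds this register, the key design point is the same: each area has exactly one accountable owner, and anyone on the team can look up ownership without asking around.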


Routine, Frequent Peer Review

When you’re working on a team, make sure you actually collaborate! Division of labor has many benefits, but it is essential that team members do not become siloed from each other. Instituting a process of one-on-one peer reviews and having a regular forum for group feedback and suggestions help to foster collaboration, prevent internal siloing, and create near-term benefits and long-term value.

One-on-one peer review, in addition to offering the quick wins of finding typos or other human errors while they’re easily fixed, can strengthen your modeling effort by providing an opportunity for the ontologists to share conceptual approaches. This promotes alignment among the team and reduces the modeling of similar concepts in divergent ways.

Group feedback sessions are a powerful tool for achieving team-wide alignment, providing a forum in which to explore different approaches and resolve them into a unified conceptualization of the model. By surfacing errors and inconsistencies early, iterative review and refinement helps reduce the need for later remodeling and revision.
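Some of this review work can even be assisted with lightweight tooling. The sketch below, using Python's standard-library `difflib`, flags pairs of class labels from different modelers that look suspiciously similar and may represent the same concept modeled twice. The labels, modeler names, and the 0.8 similarity threshold are all hypothetical; a check like this supplements, rather than replaces, human peer review.

```python
# Hypothetical pre-review check: flag near-duplicate class labels
# contributed by different modelers, which may signal that the same
# concept has been modeled in divergent ways.
from difflib import SequenceMatcher
from itertools import combinations

labels_by_modeler = {
    "A. Modeler": ["Customer Account", "Purchase Order"],
    "B. Modeler": ["Customer Acct", "Invoice"],
}

def similar_pairs(threshold: float = 0.8):
    """Yield label pairs from different modelers whose similarity
    ratio meets or exceeds the threshold."""
    flat = [
        (who, label)
        for who, labels in labels_by_modeler.items()
        for label in labels
    ]
    for (who_a, a), (who_b, b) in combinations(flat, 2):
        if who_a == who_b:
            continue  # only compare labels across modelers
        if SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold:
            yield (a, b)

print(list(similar_pairs()))  # [('Customer Account', 'Customer Acct')]
```

Flagged pairs become agenda items for the next one-on-one or group review session, where the team can decide whether to merge the concepts or deliberately keep them distinct.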

Measuring Success and Progress

As you embark on any major project, you should always ask yourself what constitutes success and what measures you will use to evaluate whether you achieved it, and an ontology modeling project is no exception. In addition, consider how you will measure and report progress throughout the project, and what information you need to collect as part of your workflow to support this.

Being able to provide concrete measurements to demonstrate the progression of in-flight work will increase accountability, facilitate organizational buy-in, and improve internal morale when the project encounters stumbling blocks. When the project is completed, being able to demonstrate and quantify the project’s success and impact will prove the ontology’s value and help secure organizational resources for maintaining and enhancing the ontology, knowledge graph, or semantic systems that you have established.

There are many ways to measure the success of an ontology design project, and the particulars of the project will inform which will be most effective and persuasive. Useful success metrics include the number of concepts modeled in the ontology; the number of legacy data fields, or the percentage of a legacy data model, mapped to the ontology; and the number or percentage of high-impact data uses that can be supported and enhanced through your new graph systems.
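One of those metrics, the share of a legacy data model mapped to the ontology, is straightforward to compute once the mapping is recorded. The sketch below illustrates the arithmetic; the legacy field names and ontology property names are hypothetical placeholders, and `None` marks a field that has not yet been mapped.

```python
# Hypothetical progress metric: percentage of legacy data fields
# mapped to ontology properties.
legacy_fields = ["cust_id", "cust_name", "acct_no", "txn_amt", "txn_date"]

# Mapping from legacy field to ontology property; None = not yet mapped.
mapping = {
    "cust_id": "ex:customerId",
    "cust_name": "ex:customerName",
    "acct_no": "ex:accountNumber",
    "txn_amt": None,
    "txn_date": None,
}

mapped = [f for f in legacy_fields if mapping.get(f)]
coverage = 100 * len(mapped) / len(legacy_fields)
print(f"{coverage:.0f}% of legacy fields mapped")  # prints: 60% of legacy fields mapped
```

Reporting this number at a regular cadence gives stakeholders a concrete, trendable view of progress long before the full model is complete.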

Moving Forward

When the scope and resources of a project allow it, a team of knowledge engineers modeling collaboratively can tackle bigger projects and achieve more sophisticated results than a single modeler. Using a collaborative approach, we have built ontologies that support standardization and integration of distributed financial data, enable 360-degree customer and enterprise views, power automated content recommendation, and drive enterprise-wide semantic search.

At the same time, this kind of collaboration carries the risk of duplicative effort, divergent modeling approaches, and unclear direction. Leveraging this framework of better practices for knowledge engineer collaboration by establishing roles and responsibilities, engaging in frequent and routine peer review, and measuring success and progress throughout the project can help you mitigate these risks and unlock the long-term value of an enterprise ontology and knowledge graph.

Whether you are just beginning your ontology or knowledge graph journey and aren't sure where to start, or already have a project in flight with mature roadmaps and goals, EK is here to help with our deep experience in scoping, planning, and implementing enterprise ontologies and knowledge graphs. Contact us here to get started.

Garrett Morton is a Senior Ontology Analyst who has worked with stakeholders at corporate, non-profit, and academic organizations to design user-centered information solutions to business problems. A strong believer that the value of information is predicated on its findability and usability, he is passionate about using ontologies, knowledge graphs, and semantic technologies to empower people to get the most out of data.