Meeting Date
Attendees
- Drummond Reed
- Rieks Joosten
- Daniel Hardman
- Dan Gisolfi
- Foteinos Mergroupis-Anagnou (GRNET)
- Scott Perry
- line.kofoed
- Scott Whitmire
- RJ Reiser
Main Goal of this Meeting:
...
| Time | Item | Lead | Notes |
| --- | --- | --- | --- |
| 5 mins | Start recording | Chairs | |
| 5 mins | Update on the status of the bounty request | | |
| 10 mins | Lessons learned from development of the Good Health Pass Glossary | | |
| 30 mins | Discussion of integrating the ToIP Term tool with a Confluence wiki instance | Chairs | |
| 5 mins | Review of Decisions and Action Items and planning for next meeting | Chairs | |
Recording
- link to the file
Presentation(s)
- link to the file
...
- New members
- Update on the status of the bounty request
- Lessons learned from development of the Good Health Pass Glossary - Drummond Reed
- This has been a fantastic exercise in applying the designs we have been developing in the CTWG.
- The first lesson is that avoiding the convention of "First Letter Caps" for terms has been very helpful because now proper nouns can be fully distinguished.
- The second lesson is that the scoping of terms—both to identify their origin (and licensing if applicable) and to clarify their context—has been extremely helpful.
- This leads to the question: with the ToIP Term tool spec, how will a scope owner (e.g., the Good Health Pass WG) be able to adopt a subset of the terms that already exist in the ToIP terminology corpus, including them in its own glossary document WITHOUT redefining them in its own scope? Could it be as simple as an "include" list in the definition of the scope, where the "include" list is the list of terms from other scopes that the scope owner wants to include in its glossary? (A rough sketch of this idea appears after this list.)
- Daniel Hardman suspected that would be the case, but that is still to be confirmed.
- Rieks Joosten said that the architecture of the tool allows for this to be implemented (it may need an extension of the tool).
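To make the "include" idea above concrete, here is a minimal Python sketch; the Scope structure, corpus layout, and build_glossary helper are illustrative assumptions, not the actual ToIP Term tool design:

```python
# A rough, hypothetical sketch -- NOT the actual ToIP Term tool data model.
from dataclasses import dataclass, field

@dataclass
class Scope:
    """A terminology scope that owns some terms and may adopt others."""
    name: str
    terms: dict = field(default_factory=dict)    # term -> definition owned by this scope
    include: list = field(default_factory=list)  # (other scope name, term) pairs to adopt

def build_glossary(scope, corpus):
    """Merge a scope's own terms with terms adopted (not redefined) from other scopes."""
    glossary = dict(scope.terms)
    for other_name, term in scope.include:
        # An adopted term keeps the defining scope's text; the including
        # scope displays it in its glossary but does not redefine it.
        glossary.setdefault(term, corpus[other_name].terms[term])
    return glossary

# Example: a Good Health Pass scope adopting two terms defined in the ToIP scope.
toip = Scope("toip", terms={"verifiable credential": "definition text …",
                            "holder": "definition text …"})
ghp = Scope("good-health-pass",
            terms={"health pass": "definition text …"},
            include=[("toip", "verifiable credential"), ("toip", "holder")])

print(build_glossary(ghp, {"toip": toip}))
```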
- Discussion of integrating the ToIP Term tool with a Confluence wiki instance
- Drummond Reed shared the insight from the active work on the Good Health Pass Glossary that a very natural model for creating and maintaining a glossary/terminology corpus is a wiki. He then brought this up with Daniel in a discussion last week.
- Daniel Hardman talked about how our ToIP Term tool could be integrated with a wiki: the wiki could serve both as a front-end for adding or editing terms in the overall terminology corpus and as an ongoing display of the "state of the corpus" and an easy tool for navigating and exploring it.
- Rieks Joosten was somewhat ambivalent about the idea.
- He liked the ease of use and the ability to easily navigate across the totality of the corpus - it would certainly solve problems that people in other WGs/TFs have (e.g., the WPTF).
- However, he feels it is very important that those responsible for a scope be able to monitor and approve proposed changes to the terms in that scope.
- Drummond Reed agreed that the ability to control how terms in a specific scope are modified is important, but was confident that with Confluence there would be good options for doing that.
- line.kofoed wanted to understand more about how multiple scopes/glossaries would work together within such a wiki.
- RJ Reiser liked the idea as being easy to use.
- Scott Perry liked the idea as long as the wiki was used in a consistent way across all the scopes.
- ACTION: Drummond Reed to talk to LF about the Confluence licensing, explore how challenging it would be to write an import script (a rough sketch appears below), and determine whether other LF groups have needed this.
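As a rough illustration of the import script mentioned in the action item, here is a minimal Python sketch that pushes corpus terms into Confluence pages; the site URL, credentials, space key, and one-page-per-term layout are all assumptions, though the content endpoint used is part of the standard Confluence REST API:

```python
# A rough, hypothetical sketch of a corpus -> Confluence import script.
import requests

CONFLUENCE = "https://example.atlassian.net/wiki"  # assumed Confluence site URL
AUTH = ("import-bot@example.org", "API_TOKEN")     # assumed API credentials
SPACE_KEY = "CTWG"                                 # assumed wiki space key

def create_term_page(term, definition):
    """Create one wiki page per term so the wiki mirrors the corpus."""
    payload = {
        "type": "page",
        "title": term,
        "space": {"key": SPACE_KEY},
        "body": {
            "storage": {
                "value": f"<p>{definition}</p>",
                "representation": "storage",
            }
        },
    }
    resp = requests.post(f"{CONFLUENCE}/rest/api/content", json=payload, auth=AUTH)
    resp.raise_for_status()

# Example: push a small corpus into the wiki, one page per term.
corpus = {"term": "definition text …"}
for term, definition in corpus.items():
    create_term_page(term, definition)
```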
- Review of Decisions and Action Items and planning for next meeting
- In two weeks we hope to have an answer on the bounty request and also answers to our questions about using Confluence as a wiki.
Decisions
- None
Action Items
- ACTION: Drummond Reed to continue following up with LF on the bounty.
- ACTION: Drummond Reed to talk to LF about the Confluence licensing, explore how challenging it would be to write an import script, and determine whether other LF groups have needed this.