tl;dr : Model Context Protocol expressed in RDF because WWW.
This is a first pass, with most of the heavy lifting done by Claude, so while it looks plausible, it might not be. I've not had a play with it yet.
There would be more, but: Message limit reached for Claude 3.5 Sonnet until 9 PM.
Get your Linked Data on!
The Model Context Protocol (MCP) Ontology provides a formal vocabulary for describing interactions between AI language models and external context providers. It defines core concepts for resources, tools, prompts and messaging that enable structured communication between AI systems and domain-specific data sources.
Preferred prefix: mcp
This ontology models the key components of the Model Context Protocol ecosystem: servers that expose context, the resources and tools they provide, and the prompts and messages exchanged with the model.
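To give a feel for the vocabulary, here's a sketch of some instance data in Turtle. The `ex:` instances are hypothetical; only the `mcp:` terms come from the ontology (the same ones used in the queries below):

```turtle
@prefix mcp: <http://purl.org/stuff/mcp/> .
@prefix ex:  <http://example.org/> .

# Hypothetical server exposing one resource and one tool
ex:docs-server a mcp:Server ;
    mcp:providesResource ex:readme ;
    mcp:providesTool ex:analyzer .

ex:readme mcp:name "README" ;
    mcp:mimeType "text/markdown" .

ex:analyzer mcp:description "Simple text analysis tool" .
```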
The MCP ontology supports Retrieval-Augmented Generation (RAG) through structured access to context resources. Here are key integration patterns:
PREFIX mcp: <http://purl.org/stuff/mcp/>

SELECT ?resource ?name ?type WHERE {
  ?server a mcp:Server ;
          mcp:providesResource ?resource .
  ?resource mcp:name ?name ;
            mcp:mimeType ?type .
}
PREFIX mcp: <http://purl.org/stuff/mcp/>

SELECT ?tool ?desc WHERE {
  ?server mcp:providesTool ?tool .
  ?tool mcp:description ?desc .
  FILTER(CONTAINS(LCASE(?desc), "text analysis"))
}
Combining vector similarity with graph traversal:
from rdflib import Graph, Namespace, URIRef
from sentence_transformers import SentenceTransformer
import numpy as np

MCP = Namespace("http://purl.org/stuff/mcp/")

g = Graph()
g.parse("mcp-store.ttl")

encoder = SentenceTransformer('all-MiniLM-L6-v2')

# Index resources by their mcp:text values
resources = []
for s, p, o in g.triples((None, MCP.text, None)):
    resources.append({
        'id': str(s),
        'text': str(o),
        'embedding': encoder.encode(str(o))
    })

def find_similar(q_embedding, resources, k=5):
    # Assumed helper: rank indexed resources by cosine similarity
    def cosine(a, b):
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    ranked = sorted(resources,
                    key=lambda r: cosine(q_embedding, r['embedding']),
                    reverse=True)
    return ranked[:k]

def build_prompt(question, context):
    # Assumed helper: splice retrieved context into a prompt
    snippets = "\n\n".join(c['text'] for c in context)
    return f"Context:\n{snippets}\n\nQuestion: {question}"

# Query with RAG
def query_with_context(question):
    q_embedding = encoder.encode(question)
    relevant = find_similar(q_embedding, resources)
    context = []
    for r in relevant:
        # Get metadata; declare the mcp: prefix and bind ?res
        # rather than interpolating the IRI into the query string
        meta = g.query("""
            PREFIX mcp: <http://purl.org/stuff/mcp/>
            SELECT ?name ?type WHERE {
                ?res mcp:name ?name ;
                     mcp:mimeType ?type .
            }""", initBindings={'res': URIRef(r['id'])})
        context.append({
            'text': r['text'],
            'metadata': list(meta)[0]
        })
    return build_prompt(question, context)
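The retrieval step above boils down to cosine similarity between embeddings. Here's a minimal, dependency-free sketch of that idea using toy vectors in place of real sentence embeddings:

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def find_similar(q_embedding, resources, k=2):
    # Rank resource dicts (shaped like the indexing loop above) by similarity
    ranked = sorted(resources,
                    key=lambda r: cosine(q_embedding, r['embedding']),
                    reverse=True)
    return ranked[:k]

# Toy "resources" with 3-d embeddings
resources = [
    {'id': 'a', 'embedding': [1.0, 0.0, 0.0]},
    {'id': 'b', 'embedding': [0.9, 0.1, 0.0]},
    {'id': 'c', 'embedding': [0.0, 1.0, 0.0]},
]
top = find_similar([1.0, 0.0, 0.0], resources)
print([r['id'] for r in top])  # → ['a', 'b']
```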