Ask HN: Has AI breathed new life into Semantic (web) Technologies?
4 rottyguy 8 5/7/2025, 11:04:37 AM
The Knowledge Graph Conference is currently happening in NYC and there's a bit of talk around KG assisting AI with various things like truth grounding (RAG) and Agentic development. Curious if anyone is seeing more discussions of Semantic Technologies in their orbit these days?
So I would say the opposite is true. AI tools are removing the need for special declarative wrappers around a lot of text. For example, there's no need to surround a headline with <H1> when you can ask a GPT to "get the headlines from all these articles."
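For contrast, the declarative approach being replaced is easy to sketch: when headlines are wrapped in `<h1>`, plain procedural code can extract them with no model at all. The HTML snippet and class name below are illustrative, using only Python's standard library:

```python
# Deterministic extraction from declarative markup: pull headlines out
# of <h1> tags with the stdlib HTML parser, no LLM involved.
from html.parser import HTMLParser

class HeadlineExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_h1 = False
        self.headlines = []

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.in_h1 = True

    def handle_endtag(self, tag):
        if tag == "h1":
            self.in_h1 = False

    def handle_data(self, data):
        # Only collect text that appears inside an open <h1>.
        if self.in_h1:
            self.headlines.append(data.strip())

extractor = HeadlineExtractor()
extractor.feed("<article><h1>Markets Rally</h1><p>Body text.</p></article>")
print(extractor.headlines)  # ['Markets Rally']
```

The point of the comment is that "get the headlines from all these articles" now works on unwrapped prose too, which removes the incentive to maintain markup like this in the first place.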
There are a couple of kinds of wrapping that do help when working with LLMs: markdown in prompts, and JSON or XML in system instructions for MCP. But RAG refers to the non-LLM end of the process (retrieving data from files or a database), so the style of the training data doesn't directly affect how that works.
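The JSON kind of wrapping looks roughly like an MCP tool declaration: a JSON Schema description of the tool's inputs. A minimal sketch, assuming a hypothetical `get_headlines` tool (the name and fields here are illustrative, not a real MCP server definition):

```python
import json

# Hypothetical MCP-style tool declaration. This is the structured
# "wrapping" that helps LLMs: JSON Schema around the interface,
# rather than semantic markup inside the source documents.
tool = {
    "name": "get_headlines",
    "description": "Return the headline of each supplied article.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "articles": {"type": "array", "items": {"type": "string"}},
        },
        "required": ["articles"],
    },
}

print(json.dumps(tool, indent=2))
```

Note the structure lives at the protocol boundary, not in the documents being retrieved.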
I can't see anyone building a special tool just for that as a general use case. Everybody who wanted such output would want their own markup format, and we'd be right back to XML.
Today, it's simpler to use RAG: let the LLM handle the "English" part, then use regular procedural code to put things into boxes, data storage, markdown, and so on. If you really want consistent output, you can't have the LLM generate it directly; you need the same split on the output side, with deterministic code producing the final structure.
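That split can be sketched in a few lines: the model call is a hypothetical stub (`call_llm` is a stand-in, not a real API), and ordinary string handling, not the model, produces the consistent record:

```python
import json

def call_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real model call; in practice this
    # would hit an LLM API and return free-form text like the below.
    return "Headline: Markets Rally. Sentiment: positive."

def parse_llm_text(text: str) -> dict:
    # The procedural side of the split: deterministic code turns the
    # model's "English" into a consistent structure.
    record = {}
    for part in text.rstrip(".").split(". "):
        key, _, value = part.partition(": ")
        record[key.strip().lower()] = value.strip()
    return record

article = "Stocks rose sharply on Tuesday..."
raw = call_llm(f"Summarize this article: {article}")
record = parse_llm_text(raw)
print(json.dumps(record))  # {"headline": "Markets Rally", "sentiment": "positive"}
```

The LLM never emits the JSON itself; it only supplies text that deterministic code reshapes, which is what keeps the output consistent.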
EDIT1: A few weeks ago, a team of Brazilian researchers published a paper on using ChatGPT to enhance an agriculture-focused OWL dataset [4].
EDIT2: In addition to training LLMs on ontologies, it looks like Palantir is using ontologies as guardrails to reduce LLM hallucinations [5]. Makes sense.
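The guardrail idea is easy to illustrate: reject model-asserted facts whose terms aren't defined in the ontology. A minimal sketch with a toy in-memory "ontology" (the agriculture terms are illustrative, and this is not Palantir's actual system):

```python
# Toy ontology: the only classes and properties facts may use.
ONTOLOGY = {
    "classes": {"Crop", "Soil", "Pest"},
    "properties": {"grows_in", "damaged_by"},
}

def validate_triple(subject_class: str, prop: str, object_class: str) -> bool:
    """Accept a (subject, property, object) assertion only if every
    term exists in the ontology; unknown terms are treated as
    hallucinations and rejected."""
    return (
        subject_class in ONTOLOGY["classes"]
        and prop in ONTOLOGY["properties"]
        and object_class in ONTOLOGY["classes"]
    )

print(validate_triple("Crop", "grows_in", "Soil"))  # True: grounded
print(validate_triple("Crop", "cures", "Pest"))     # False: "cures" is undefined
```

An LLM can still phrase things however it likes; the ontology just bounds which assertions make it into downstream systems.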
[0] https://json-ld.org/
[1] https://www.w3.org/TR/owl2-syntax/
[2] https://www.w3.org/TR/sparql11-query/
[3] https://arxiv.org/abs/2009.14654
[4] https://arxiv.org/abs/2504.18651
[5] https://blog.palantir.com/reducing-hallucinations-with-the-o...