National Law Information MCP v4.0 | 41 Ministry of Government Legislation (법제처) APIs → 17 MCP tools. Search across statutes, court decisions, and local ordinances + citation verification to prevent LLM hallucination + article impact graph (impact_map) + automatic point-in-time comparison diff (time_travel) + five-step citizen action guide (action_plan) | 41 Korean legal APIs → 17 MCP tools
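To illustrate the citation-verification idea named above, here is a minimal Python sketch: fetch the authoritative article text and measure how closely it matches what the LLM quoted. The endpoint URL, parameters, and response field are placeholders, not the actual National Law Information Center API.

```python
# Sketch: verify an LLM-quoted statute article against an authoritative
# source before trusting it. The URL and field names below are hypothetical.
import difflib

import requests


def verify_citation(law_id: str, article_no: str, quoted: str) -> float:
    """Return the similarity between the quoted text and the official article."""
    resp = requests.get(
        "https://example.invalid/law/article",  # placeholder endpoint
        params={"law": law_id, "article": article_no},
        timeout=10,
    )
    official = resp.json()["text"]  # assumed response field
    return difflib.SequenceMatcher(None, quoted, official).ratio()


# A low ratio (say, below 0.9) flags a likely hallucinated or outdated citation.
```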
UQLM (Uncertainty Quantification for Language Models) is a Python package for UQ-based LLM hallucination detection.
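The general idea behind UQ-based black-box detection can be sketched in a few lines: sample several answers to the same prompt and score their agreement, since low agreement often signals hallucination. This is a hand-rolled illustration of the principle (token-overlap similarity, a hypothetical `generate` callable), not UQLM's actual API.

```python
# Black-box UQ sketch: sample n responses and compute the mean pairwise
# token-overlap (Jaccard) similarity. `generate` is any LLM call.
from itertools import combinations
from typing import Callable, List


def jaccard(a: str, b: str) -> float:
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 1.0


def consistency_score(generate: Callable[[str], str], prompt: str, n: int = 5) -> float:
    """1.0 means the sampled answers fully agree; low values suggest hallucination."""
    samples: List[str] = [generate(prompt) for _ in range(n)]
    pairs = list(combinations(samples, 2))
    if not pairs:
        return 1.0
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)
```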
[NeurIPS 2025] SECA: Semantically Equivalent and Coherent Attacks for Eliciting LLM Hallucinations
RAG hallucination detection via layer-wise relevance propagation (LRP).
(Overview) An MCP for searching, comparing, and analyzing the National Law Information Center and public-institution internal regulations from ALIO. (Tools) 87 Ministry of Government Legislation tools + 23 ALIO public-institution regulation tools = 110 MCP tools. (Data) 1,600 statutes, 10,000 administrative rules, tens of thousands of court decisions, and 35,000 internal regulations from 344 public institutions.
CRoPS (TMLR): a novel hallucination detection method.
Build your own open-source REST API endpoint to detect hallucinations in LLM-generated responses.
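A minimal sketch of such an endpoint, assuming FastAPI; the route name, schema, and placeholder scorer are illustrative only, and any detector (e.g. the consistency score above, or an NLI model) can be plugged in.

```python
# Minimal hallucination-detection REST endpoint (FastAPI sketch).
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class CheckRequest(BaseModel):
    prompt: str
    response: str


class CheckResult(BaseModel):
    hallucination_score: float  # 0.0 = grounded, 1.0 = likely hallucinated


def score_response(prompt: str, response: str) -> float:
    """Placeholder: replace with a real detector."""
    return 0.0


@app.post("/v1/detect", response_model=CheckResult)
def detect(req: CheckRequest) -> CheckResult:
    return CheckResult(hallucination_score=score_response(req.prompt, req.response))

# Run with: uvicorn main:app --reload
```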
Semi-supervised pipeline to detect LLM hallucinations. Uses Mistral-7B for zero-shot pseudo-labeling and DeBERTa for efficient classification.
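A hedged sketch of the first step of that recipe, assuming Hugging Face `transformers` and an instruction-tuned Mistral checkpoint; the prompt format and label set are illustrative, not the repository's code.

```python
# Zero-shot pseudo-labeling with an instruction-tuned LLM; the resulting
# (context, answer, label) triples would then fine-tune a small DeBERTa
# classifier (e.g. microsoft/deberta-v3-base) that is cheap at inference.
from transformers import pipeline

labeler = pipeline("text-generation", model="mistralai/Mistral-7B-Instruct-v0.2")


def pseudo_label(context: str, answer: str) -> int:
    """Return 1 if the LLM judges the answer unsupported (hallucinated)."""
    prompt = (
        f"Context: {context}\nAnswer: {answer}\n"
        "Is the answer fully supported by the context? Reply yes or no."
    )
    # return_full_text=False keeps the prompt (which itself contains "yes")
    # out of the generated string we inspect.
    out = labeler(prompt, max_new_tokens=3, return_full_text=False)[0]["generated_text"]
    return 0 if "yes" in out.lower() else 1
```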
Research paper on constructing agentic debate pipelines to reduce hallucinations in LLMs, using both open-source and commercial models.
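The core loop of a debate pipeline can be sketched as follows; `ask` stands in for any chat-model call, and the paper's actual prompts and roles will differ.

```python
# Toy debate loop: a drafter answers, a critic challenges, and the drafter
# revises until the critic accepts or a round limit is reached.
from typing import Callable


def debate(ask: Callable[[str], str], question: str, rounds: int = 3) -> str:
    answer = ask(f"Answer concisely: {question}")
    for _ in range(rounds):
        critique = ask(
            f"Question: {question}\nAnswer: {answer}\n"
            "List factual errors or unsupported claims, or reply OK."
        )
        if critique.strip().upper().startswith("OK"):
            break  # the critic found nothing to challenge
        answer = ask(
            f"Question: {question}\nDraft: {answer}\nCritique: {critique}\n"
            "Rewrite the draft, fixing the issues raised."
        )
    return answer
```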
Lecture-RAG is a grounding-aware Video-RAG framework that reduces hallucinations and supports algorithmic reasoning in educational slide-based and blackboard tutorial videos.
Automated detection, visualization, and suppression of hallucination-associated neurons in open-source LLMs; an LLM mechanistic-interpretability research tool.
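Suppression of this kind typically amounts to ablating specific hidden units at inference time. A hedged PyTorch sketch follows; the layer path and neuron indices are hypothetical, since identifying which neurons to ablate is what the tool automates.

```python
# Zero out selected MLP output units via a forward hook.
import torch


def suppress_neurons(module: torch.nn.Module, neuron_idx: list[int]):
    """Register a hook that zeroes the given output units of `module`."""
    def hook(_mod, _inp, out):
        out = out.clone()
        out[..., neuron_idx] = 0.0  # ablate the selected neurons
        return out  # returning a value replaces the module's output
    return module.register_forward_hook(hook)


# Usage (assumed module path for a Llama-style causal LM):
# handle = suppress_neurons(model.model.layers[12].mlp, [101, 887, 2048])
# ... run generation ...
# handle.remove()  # restore original behavior
```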
Source code for the paper "A Hallucination Mitigation Scheme in Security Policy Generation with Large Language Models".
This repository contains the codebase for a proof of concept (PoC) of LLM package hallucination and the associated vulnerabilities.
Search and analyze Korean laws, administrative rules, and public institution regulations using 110 MCP tools to provide accurate legal data for AI models.