Show HN: Omni-NLI – A multi-interface natural language inference server
1 point • by habedi0 • 5 days ago
Hi everyone,

I've made an open-source tool for natural language inference called Omni-NLI. It can use different models to check whether one piece of text (the premise) supports another (the hypothesis). The main applications of a tool like this are soft fact-checking and consistency checking between pieces of text, such as sentences.
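To make the task concrete: NLI models conventionally assign one of three labels to each premise/hypothesis pair. The toy pairs below illustrate the task itself, not Omni-NLI's API.

```python
# Standard NLI labels, illustrated with toy premise/hypothesis pairs.
# (This sketches the task itself, not Omni-NLI's actual interface.)
pairs = [
    ("A dog is running in the park.", "An animal is outdoors.", "entailment"),
    ("A dog is running in the park.", "The dog is asleep indoors.", "contradiction"),
    ("A dog is running in the park.", "The dog belongs to a child.", "neutral"),
]
for premise, hypothesis, label in pairs:
    print(f"{label:13s} premise={premise!r} hypothesis={hypothesis!r}")
```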
Currently, Omni-NLI has the following features:

- Can be installed as a Python package with `pip install omni-nli[huggingface]`.
- Can run on your own computer, so your data stays local and private.
- Has an MCP interface (for agents) and a REST API for conventional use as a microservice.
- Supports models from different sources (Ollama, OpenRouter, and HuggingFace).
- Can be used to check whether a model appears to be contradicting itself.
- Can show its reasoning, so you can see why it thinks a claim is wrong.
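For the REST API, a request presumably carries a premise/hypothesis pair as JSON. The field names and endpoint path below are illustrative assumptions, not the project's actual schema; see the documentation for the real one.

```python
import json

# Hypothetical payload for an NLI check over a REST API.
# The field names ("premise", "hypothesis") are assumptions for
# illustration; check the Omni-NLI docs for the actual request schema.
def build_nli_request(premise: str, hypothesis: str) -> str:
    """Serialize a premise/hypothesis pair as a JSON request body."""
    return json.dumps({"premise": premise, "hypothesis": hypothesis})

body = build_nli_request(
    "The meeting was moved to Friday.",
    "The meeting is on Friday.",
)
print(body)
# You would then POST this body to the server, e.g. (hypothetical route):
#   curl -X POST http://localhost:8000/api/check -d "$body"
```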
In any case, if you'd like to know more, the links below have further details:

Project's GitHub repo: [https://github.com/CogitatorTech/omni-nli](https://github.com/CogitatorTech/omni-nli)

Project's documentation: [https://cogitatortech.github.io/omni-nli/](https://cogitatortech.github.io/omni-nli/)