Ask HN: Is a consumer AI box a viable idea?
1 point • by spprashant • about 2 hours ago
I suspect at some point LLMs in their current form will be deemed good enough for general research and coding tasks. I don't get why we need to continue with a de facto cloud-based approach. Cloud, in my opinion, solves operational complexity, which is worth paying a premium for. But it seems it isn't all that complex to get an open-source model running locally as long as you have the hardware. Over time I suspect the models will get better and cheaper.
Is there a future where we can expect people to just buy "AI" from Best Buy, like a TV set? It'll probably come with some model preloaded: cheaper if open-source, premium pricing for frontier-lab models. The hardware is basically a bunch of GPUs, enough for local inference.
Take it home, plug it into your home network, and you can open a chat instance by going to its IP address from any local device. You can give it internet access if you want. Maybe it can also receive OTA updates.
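To make the "chat via a local IP" idea concrete, here is a minimal sketch of what a client on the home network might send. It assumes (hypothetically) that the box exposes an OpenAI-compatible chat endpoint, as existing local runners such as llama.cpp's server already do; the IP address, port, and model name are placeholders, not real endpoints.

```python
import json

def build_chat_request(box_ip, prompt, model="local-default", port=8080):
    """Build the URL and JSON body a local device would POST to the
    hypothetical AI box's OpenAI-compatible chat endpoint."""
    url = f"http://{box_ip}:{port}/v1/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return url, json.dumps(body)

# Example: a laptop on the same LAN targeting the box at a made-up address.
url, payload = build_chat_request("192.168.1.50", "Summarize this article.")
print(url)
```

Any phone, laptop, or smart device on the LAN could POST this payload and render the reply, which is what makes the appliance framing plausible: the protocol surface is already a de facto standard.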
Curious how others think about this: does local-first AI feel like a possibility? What are the economic and social challenges?