I've hosted Abbey for some time at https://abbey.us.ai and have now made a version you can self-host. Students use it a lot for reading.
On the whole, it orchestrates a variety of models (TTS, OCR, LLM, etc.) and other tools (search engines, file storage) into a configurable, private AI gateway. You can plug in your own third-party API keys or self-host the models using Ollama.
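For anyone wondering what "plug in your own keys or self-host with Ollama" looks like in practice, here's a rough sketch of the general idea (this is not Abbey's actual code; the LLM_BACKEND / OPENAI_API_KEY variable names and model names are made up for illustration) of a gateway routing the same request to either a hosted API or a local Ollama instance:

```python
import os
import requests

def chat(prompt: str) -> str:
    # Hypothetical switch: "openai" for a hosted third-party API,
    # anything else falls back to a locally hosted model via Ollama.
    backend = os.environ.get("LLM_BACKEND", "ollama")

    if backend == "openai":
        # Hosted, OpenAI-compatible chat completions endpoint.
        resp = requests.post(
            "https://api.openai.com/v1/chat/completions",
            headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
            json={
                "model": "gpt-4o-mini",
                "messages": [{"role": "user", "content": prompt}],
            },
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]

    # Self-hosted model behind Ollama's local HTTP API.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": prompt, "stream": False},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(chat("Summarize the first chapter of this reading in two sentences."))
```

Same caller either way; only the backend config changes, which is the point of the gateway.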
Thanks, this seems really useful. It looks like it defaults to private collections of documents for each user, which is surprisingly hard to find.
Hey, thank you! Could you explain that point about private collections more?
This seems to be aiming to do something similar to notebooklm.
We released the Workspace feature two weeks before Google released theirs, which turned out to be shockingly similar. Watching Google succeed (?) with NotebookLM has been... frustrating. (Ofc they didn't copy Abbey, it was just annoying.)