There's a new client for Ollama on the Ollama GitHub integrations list, and it's called LobeChat. Okay, well, it's not actually new, but it's new to me, and it's getting a lot of updates; every couple of days it seems to be updated a little further, adding new features and new capabilities. Looking at the website, it feels like it's going to be a great front end for Ollama, and I thought I'd take a look at it in this video. How do I know about these updates? Well, I've got a page on my website. Yeah,
I know my website doesn't get a lot of love, especially from me, but there is one page that shows an annotated list of all the Ollama integrations, with links to my videos about those integrations. It's automatically regenerated every hour or so while my Mac is running. The list on the official GitHub repo is sorted by, well, there is no real sort to it; it's simply wherever the author of the tool wanted to put it in the list. In fact, at one point there were a couple of tools that were simply broken, and they were at the top of the list, which was super frustrating for everyone. But every time I look at that list for inspiration for what video I should make next, it's hard to tell, because I don't know whether an application has been updated recently or whether it's any good; it's just this randomized list of applications. So, looking at my list, I saw this LobeChat and decided to take a look. It's not at the top of the list now, but it was when I started looking at it, because this list updates and other
things update. So let's start with their site. Scroll down and we get an overview of features, and a few different ways to go over those features. It's a very busy web page, which makes it kind of hard to figure out how to get started, and it's not much easier when you go to the docs. We can come down to the bottom and click on the quick start link, which brings us to something that is not a quick start; it doesn't show us anything about how to get started. Okay, now I happen to know that
they seem to have an online version as well as a self-hosted version, so you have to search around for the link to the self-hosting piece. But even then it's not really clear how to get started; there is no quick start. Eventually you find the Deploy with Docker and Deploy with Docker Compose pages, but both of them seem to suggest that you have to have an OpenAI API key. We're using this specifically with Ollama, so why should I need an OpenAI API key? It's not until you come down to Integrating with Ollama that you see the Docker command for running LobeChat locally when you want to work with Ollama. So I ran that command, and within a few seconds I was able to log in at localhost port 3210. Now, it's worth noting that in other places the documentation suggests you have to set some environment variables to get Ollama to work, but it turns out none of that is accurate. When you finally start up LobeChat at localhost:3210, you're greeted with this LobeChat localhost app functionality that you supposedly have to install. If you click install, nothing happens, and if you refresh the page you get that same dialog popup again, which is really bizarre; you just have to close it each time. Now, first let's go into settings to see how to configure this. You can get to settings either by clicking the hammer icon in the top right corner or the LobeChat icon in the top left corner; they seem to be the same. Thankfully, they offer a light mode and a dark mode for those who want to choose one versus the other.
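For reference, the command I ended up running was along these lines; the image name and port come from the LobeChat docs, but treat the exact invocation as a sketch rather than gospel:

```shell
# Run LobeChat detached and expose the web UI on port 3210
docker run -d --name lobe-chat -p 3210:3210 lobehub/lobe-chat

# Then open http://localhost:3210 in a browser
```

One thing worth knowing if a connectivity check ever fails: on macOS, a containerized LobeChat reaches the host machine's Ollama at host.docker.internal rather than localhost.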
You can also set font sizes, primary colors, and so forth. Next we can set up some system assistants: there's a model you can use for naming the topic, so whenever you start a new chat it'll automatically name that topic; there's a translation assistant, so you can specify a different model for that; and there's a model to generate the description, avatar, and tags, though I'm not really sure what that's for. But this is really nice; being able to specify different models for different activities is a really nice idea. Come down to Language Model, and this is really interesting: Ollama is definitely supported, but so are OpenAI, OpenAI hosted on Azure, Google Gemini, Anthropic, Bedrock, Groq, OpenRouter, Together, all sorts of things, which is pretty cool. But that makes you wonder whether it's a least-common-denominator kind of integration or whether they're really taking advantage of each solution, and we're going to learn the answer to that pretty quickly. We can set the interface address; it's strange that they call it a proxy address, but whatever.
Next is client-side fetching. This one is totally bizarre; you can turn it on or off, but it makes absolutely no sense for Ollama. Then there's a model list here, where we can choose from the models installed in Ollama, and finally there's a connectivity check, which is really nice. Next comes text to speech. It's unfortunate, but the only options are either OpenAI, which I assume means Whisper hosted on OpenAI, or the browser. There isn't an option for a local Whisper model, which I've found to be extremely powerful, but again, that's just not an option here.
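Incidentally, that model list is presumably populated from Ollama's local API: Ollama exposes a GET /api/tags endpoint that lists installed models, and a client only needs a couple of lines to turn the response into a picker. A sketch, with a made-up sample response:

```python
import json

def installed_models(tags_response: dict) -> list[str]:
    """Extract model names from the body of an Ollama GET /api/tags response."""
    return [m["name"] for m in tags_response.get("models", [])]

# A made-up example of the response shape /api/tags returns
sample = json.loads('{"models": [{"name": "llava:latest"}, {"name": "llama3:latest"}]}')
print(installed_models(sample))  # ['llava:latest', 'llama3:latest']
```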
Then you can have it automatically stop speech recognition when you're quiet, and there are some options around which OpenAI text-to-speech model and speech-to-text model you want to use. Next we have the default assistant. A lot of these settings are bizarre and don't make much sense, but when we get down to the model settings we can set a default model, and we can also, I assume, override model settings.
I'm not really sure whether it ignores the temperature, top_p, and presence penalty values already stored in the model, or what; it's kind of unclear. Then there's a text-to-speech service, and your only options seem to be OpenAI or something called Edge, or Microsoft; I would have thought Edge was Microsoft, but maybe not. Then you pick which voice you want for text to speech and which language, and then there's a list of plugins.
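On that parameter question: in Ollama, defaults like temperature and top_p can be baked into a model with Modelfile PARAMETER lines, and a client can override them per request by sending an options object to /api/chat; a front end that omits the options object gets the model author's defaults. Whether LobeChat always sends its own values is what's unclear. A sketch of the request shape, with hypothetical model name and values:

```python
import json

def chat_request(model: str, prompt: str, **options) -> str:
    """Build the JSON body for a POST to Ollama's /api/chat.
    Keys passed in `options` (temperature, top_p, ...) override the
    model's Modelfile defaults for this request only."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    if options:
        body["options"] = options
    return json.dumps(body)

req = chat_request("llama3", "Hello", temperature=0.2, top_p=0.9)
print(json.loads(req)["options"])  # {'temperature': 0.2, 'top_p': 0.9}
```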
I have a few enabled, but none of them seem to actually work. Finally, we have an About dialog. Now, at the main screen for having a chat with a model, let's take a look at those little icons above the input window. First off, there's a little button to choose which model you want to use; we chose the available models earlier in settings, and this just lets us pick one of them. Then we have an image button, which is supposed to support image uploads. Here I've chosen a LLaVA model, which does support vision, but we're not able to select an image to upload. Next we have temperature. Again, it's unclear whether it always overrides the temperature set in the model, or whether there's a way to use the value the model author set; that's a feature of Ollama, but it's unclear whether we have access to it here. Next we can set how long our history is, meaning how many messages we want to keep in our history. It's again unclear what happens to the older messages.
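It isn't documented what LobeChat does here, but the simplest interpretation of a history limit is a sliding window: keep the system prompt and drop the oldest chat messages once the count exceeds the limit. A hypothetical sketch, not LobeChat's actual code:

```python
def trim_history(messages: list[dict], limit: int) -> list[dict]:
    """Keep system messages plus only the most recent `limit` chat messages."""
    system = [m for m in messages if m["role"] == "system"]
    chat = [m for m in messages if m["role"] != "system"]
    return system + chat[-limit:]

history = (
    [{"role": "system", "content": "You are helpful."}]
    + [{"role": "user", "content": f"msg {i}"} for i in range(10)]
)
print(len(trim_history(history, 4)))  # 5: the system prompt plus the last 4 messages
```

The alternative design, which some tools use, is to summarize the dropped messages into a single context message instead of forgetting them outright.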
I would imagine most tools that let you set something like this would summarize the older messages, but maybe it just forgets all that old stuff. Next comes voice input. No matter what I chose in the settings, I couldn't get this to function, so again, this doesn't seem to work, and I don't know why. Next come extensions, and these seem to be a big part of what LobeChat supports; there appear to be a lot of different extensions available. In an earlier version of LobeChat there was a way to get these extensions enabled, but they didn't work, and now they just don't work at all: when you hover over that icon, it claims that the model doesn't support function calling, which is flat out not true. This tells me that the creators of this tool, I don't know, got lazy and didn't want to add support here. Function calling is base functionality in Ollama; it has been there for at least eight months and works with the models that support it. Assistants are supported, even with Ollama; these are simply system prompts, which with Ollama would normally be supplied with the model. There are a lot of options here.
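For the record, this is roughly the shape of a function-calling request to Ollama's /api/chat endpoint: the body just carries a tools list of OpenAI-style function specs alongside the messages. The get_weather tool here is a made-up example:

```python
# A made-up tool definition in the OpenAI-style format Ollama accepts
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# The body a client would POST to http://localhost:11434/api/chat
body = {
    "model": "llama3.1",
    "messages": [{"role": "user", "content": "What's the weather in Oslo?"}],
    "tools": [weather_tool],
    "stream": False,
}
print(sorted(body))  # ['messages', 'model', 'stream', 'tools']
```

If the model decides to call the tool, the response's message carries a tool_calls list instead of plain content; the client runs the function and sends the result back as a follow-up message.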
To figure out what they're actually doing, you need to activate them and then click this icon, and most of them don't actually seem all that interesting. This would be a great place to integrate Fabric or similar tools, but that hasn't been done. And that's really about all there is to LobeChat. It seems that either it has a lot of starts of functionality that haven't been completed, or those features only work with the online services. It's very unfortunate that you can't use any of the plugins, because LobeChat hasn't enabled function calling when using Ollama, even though Ollama has great support for it. It's also very unfortunate that you can't use images with any of the models in Ollama, even the vision models. Remember what I said at the beginning about LobeChat being updated a lot? It is, but very little of that work is happening for offline use. This tool must be for the folks who don't have the privacy and security concerns that would restrict the use of online solutions, who don't work for companies that impose those concerns, or who are never offline. It feels like Ollama was added as an afterthought to LobeChat and isn't really supported by the application. It's a great-looking UI; it just doesn't have any of the functionality one would expect. Really, if I had any influence on the project anymore, I would have it removed from the Ollama GitHub repo, since it only reflects badly on the project. What do you think? Have you used LobeChat, specifically with Ollama? Is there something you think I'm missing? Because I feel like there's got to be something I'm missing, since it seems there's just not a lot there. There's
been barely any mention of it in the Ollama Discord, so maybe I'm just the last to find out that it's not worth using. Oh well, it's always fun to see what other tools exist and how they function, or don't. Let me know if there's a client you use and think I should cover. I'll try to continue updating my list of integrations, so be sure to keep an eye on it whenever you need to find a new tool to play with; you can find it at this URL. Maybe I should reincarnate my old link shortener, but I'll fix that another day. Well, thanks so much for watching. Goodbye.