LLM chatbot forum integration?

Posted: Tue Apr 08, 2025 7:24 pm
by retroD0d0
Hey everyone. Has anyone broached the idea of integrating a chatbot into the forum, using the entire forum thread archive as context? Perhaps other MX documentation also? I've no idea if the site has already been scraped by LLM trainers; does anyone know? It might help users get quicker responses and reduce unnecessary threads. Thoughts?

Re: LLM chatbot forum integration?

Posted: Tue Apr 08, 2025 7:38 pm
by j2mcgreg
No!

Re: LLM chatbot forum integration?

Posted: Tue Apr 08, 2025 7:52 pm
by BV206
From what little I know (reading detailed reports from others), you can't rely on AI answers to be accurate. Maybe you get quicker responses, but it would increase the number of forum posts when people find out the hard way that they need correct answers.

Re: LLM chatbot forum integration?  [Solved]

Posted: Tue Apr 08, 2025 8:06 pm
by richb
As indicated earlier, no. And it will not happen on my watch.

Re: LLM chatbot forum integration?

Posted: Tue Apr 08, 2025 8:37 pm
by Adrian
It's good to use the tools where they make sense. If you are on the forum, it makes sense to talk to people; if you want to talk with an AI, there are other sites for that.

Re: LLM chatbot forum integration?

Posted: Tue Apr 08, 2025 8:52 pm
by Stevo
What if we combine LLM AI with systemd?

Re: LLM chatbot forum integration?

Posted: Tue Apr 08, 2025 9:07 pm
by Adrian
In MX style we'll let the user choose whether they want to boot with sysVinit, systemd, or LLM.

Re: LLM chatbot forum integration?

Posted: Wed Apr 09, 2025 2:46 am
by Eadwine Rose
No.

Re: LLM chatbot forum integration?

Posted: Wed Apr 09, 2025 5:26 am
by retroD0d0
OK, guys, thanks for your responses; prevailing sentiment noted, accepted, and possibly understood!
Adrian wrote: Tue Apr 08, 2025 8:37 pm It's good to use the tools where they make sense. If you are on the forum, it makes sense to talk to people; if you want to talk with an AI, there are other sites for that.
In a prompt to a Llama3b model, I asked it to compare how MX Package Installer retrieves its packages for Popular Applications versus a manual package install, and it replied that it had no idea. I think that's the first time I have encountered that answer; it simply had no context to give even a vague, non-descript response. Maybe the question was dumb :confused: (Heads-up: follow-up thread on the same subject incoming :p ). It seems forum.mxlinux.org wasn't scraped for this model instance's training, and I'm not aware of any agent that accepts tagging an entire forum (as opposed to a webpage, PDF, etc.) as prompt context. This is the problem I was trying to solve for myself. Does anyone have any ideas?
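
One rough idea, purely an untested sketch on my part: pull a single thread page, strip it down to the post text, and prepend that to the prompt for a local model. The example below assumes a local Ollama instance with a llama3 model pulled, and assumes the forum's phpBB markup puts post bodies in div.content; the thread URL and selector are guesses and would need adjusting (and the forum rules respected).

Code: Select all

import requests
from bs4 import BeautifulSoup

THREAD_URL = "https://forum.mxlinux.org/viewtopic.php?t=12345"  # hypothetical thread id
OLLAMA_URL = "http://localhost:11434/api/generate"              # default Ollama endpoint

def fetch_thread_text(url):
    """Collect the visible post bodies from one thread page."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    # phpBB 3.x usually renders each post body in <div class="content">;
    # this selector is an assumption and may need adjusting.
    posts = [div.get_text(" ", strip=True) for div in soup.select("div.content")]
    return "\n\n".join(posts)

def ask_with_context(question, context, model="llama3"):
    """Send the question plus the thread text to a locally running Ollama instance."""
    prompt = (
        "Answer using only the forum thread below.\n\n"
        "--- THREAD ---\n" + context + "\n--- END THREAD ---\n\n"
        "Question: " + question
    )
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    thread = fetch_thread_text(THREAD_URL)
    print(ask_with_context(
        "How does MX Package Installer retrieve packages for Popular Applications "
        "versus a manual package install?",
        thread,
    ))

That only handles one thread at a time, of course; covering the whole archive would need a proper index and, more importantly, the forum team's blessing.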

Re: LLM chatbot forum integration?

Posted: Wed Apr 09, 2025 5:38 am
by Melber
Why spend time formulating a query for AI when you could just ask the actual people who wrote MXPI here on the forum?

(I shall refrain from any further AI comment otherwise I'll get this thread locked too. ;))