LLM chatbot forum integration? [Solved]
Hey everyone. Has anyone broached the idea of integrating a chatbot into the forum, using the entire forum thread archive as context? Perhaps other MX documentation as well? I've no idea whether the site has already been scraped by LLM trainers; does anyone know? It might help users get quicker responses and reduce unnecessary threads. Thoughts?
Re: LLM chatbot forum integration?
No!
HP 15; ryzen 3 5300U APU; 500 Gb SSD; 8GB ram
HP 17; ryzen 3 3200; 500 GB SSD; 12 GB ram
Idea Center 3; 12 gen i5; 256 GB ssd;
In Linux, newer isn't always better. The best solution is the one that works.
Re: LLM chatbot forum integration?
From what little I know (from reading detailed reports by others), you can't rely on AI answers to be accurate. Maybe you'd get quicker responses, but it would increase the number of forum posts once people find out the hard way that they need correct answers.
Re: LLM chatbot forum integration? [Solved]
As indicated earlier: no. And it will not happen on my watch.
Forum Rules
Guide - How to Ask for Help
richb Administrator
System: MX 23 KDE
AMD A8 7600 FM2+ CPU R7 Graphics, 16 GIG Mem. Three Samsung EVO SSD's 250 GB
Re: LLM chatbot forum integration?
It's good to use tools where they make sense. If you are on the forum, it makes sense to talk to people; if you want to talk with an AI, there are other sites for that.
Re: LLM chatbot forum integration?
MXPI = MX Package Installer
QSI = Quick System Info from menu
The MX Test repository is mostly backports; not the same as Debian testing
Re: LLM chatbot forum integration?
In MX style, we'll let the user choose whether they want to boot with SysV init, systemd, or LLM.
- Eadwine Rose
- Administrator
- Posts: 14417
- Joined: Wed Jul 12, 2006 2:10 am
Re: LLM chatbot forum integration?
No.
MX-23.6_x64 July 31 2023 * 6.1.0-34amd64 ext4 Xfce 4.20.0 * 8-core AMD Ryzen 7 2700
Asus TUF B450-Plus Gaming UEFI * Asus GTX 1050 Ti Nvidia 535.216.01 * 2x16Gb DDR4 2666 Kingston HyperX Predator
Samsung 870EVO * Samsung S24D330 & P2250 * HP Envy 5030
Re: LLM chatbot forum integration?
Ok, guys, thanks for your responses, prevailing sentiment noted, accepted and possibly understood!
(Heads-up: a follow-up thread on the same subject is incoming.) It seems forum.mxlinux.org wasn't scraped for this model instance's training, and I am not aware of any agent that accepts tagging an entire forum (as opposed to a single web page, PDF, etc.) as prompt context. This is the problem I was trying to solve for myself. Does anyone have any ideas?
In a prompt to a Llama3b model, I asked it to compare how MX Package Installer retrieves its packages via Popular Applications versus a manual package install, and the response indicated it had no idea. I think that's the first time I've encountered that answer: it simply had no context from which to give even a vague, non-descript reply. Maybe the question was dumb.
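One stdlib-only workaround for the "no tool accepts a whole forum as context" problem is to save relevant thread pages as local text files and concatenate them into a single context blob, capped at a rough character budget, before pasting it into a prompt for a local model. This is a hedged sketch, not an existing tool: the directory layout, file naming, and budget figure are all assumptions.

```python
from pathlib import Path

def build_context(corpus_dir: str, budget_chars: int = 12000) -> str:
    """Concatenate saved forum-thread text files into one prompt context.

    Walks *.txt files in corpus_dir (assumed one file per thread), skipping
    any file that would push the total past a rough character budget, which
    stands in for a model's context-window limit.
    """
    parts = []
    used = 0
    for path in sorted(Path(corpus_dir).glob("*.txt"), reverse=True):
        text = path.read_text(encoding="utf-8", errors="replace").strip()
        if used + len(text) > budget_chars:
            continue  # this thread would blow the budget; try smaller ones
        parts.append(f"--- {path.name} ---\n{text}")
        used += len(text)
    return "\n\n".join(parts)
```

The character budget is a crude proxy for tokens (roughly 4 characters per token for English text); a real setup would use the model's own tokenizer to count.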


Re: LLM chatbot forum integration?
Why spend time formulating a query for AI when you could just ask the actual people who wrote MXPI here on the forum?
(I shall refrain from any further AI comment, otherwise I'll get this thread locked too.)