I would want to know where they sourced it before I bought it sight unseen.
OK... but I'm not sure what you mean. Do you mean things like:
Where is it shipped from (e.g. 3rd party)
From where they bought it
Anything else to ask?
Amazon will tell you if it ships from a 3rd party, or from "Amazon", whatever that means!
I don't know how I would find out where they acquired it.
Also, all the other items I am looking at are new items from Amazon, in stock, and delivered next day with Amazon Prime. I don't think any of them come from resellers.
If you have any pointers/guidance, please let me know!
I am most interested in what folks think about my:
ASSUMPTIONS:
i7 CPU for better VM performance
Large Boot Drive (NVMe) for hosting multiple VMs & Distros
Better GPU for AI tools
(MY) AXIOM(S) OF CHOICE(S):
After playing with VMs in VirtualBox, I see that it's best to have at least 20 GB reserved for "the machine". So I concluded that I should have a pretty hefty NVMe (rough sizing sketch below).
I'm assuming that an 8 GB GPU (versus my current 2 GB) will boost AI modelling.
I'm also assuming that for running AI models, I don't "have" to use Nvidia drivers (see the quick GPU-check sketch below).
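To make the drive-size assumption concrete, here's a quick back-of-the-envelope calculation in Python. Only the ~20 GB per machine figure comes from my VirtualBox testing; the VM count, host-OS allowance, and headroom numbers are just hypothetical placeholders you'd swap for your own:

[code]
# Rough disk sizing for a multi-VM setup.
# Only GB_PER_VM comes from my VirtualBox testing;
# the other figures are placeholder assumptions.

VM_COUNT = 6       # hypothetical number of VMs/distros kept around
GB_PER_VM = 20     # ~20 GB reserved per "machine"
HOST_OS_GB = 60    # rough allowance for the host install
HEADROOM_GB = 150  # snapshots, ISOs, shared folders, growth

total_gb = VM_COUNT * GB_PER_VM + HOST_OS_GB + HEADROOM_GB
print(f"Rough NVMe target: {total_gb} GB")  # -> 330 GB on these assumptions
[/code]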
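And for the driver question, here's a minimal Python sketch (assuming PyTorch is installed) that reports which GPU backend the framework actually sees. The ROCm builds of PyTorch expose the same torch.cuda API, so this also works on AMD cards without Nvidia drivers:

[code]
# Check which GPU backend (if any) PyTorch can use.
# Assumes PyTorch is installed; on ROCm builds, torch.version.hip is set
# and torch.cuda.* maps to the AMD GPU.

import torch

if torch.cuda.is_available():
    print("GPU backend available:", torch.cuda.get_device_name(0))
    print("ROCm/HIP build:", torch.version.hip is not None)
else:
    print("No GPU backend found; models will run on CPU.")
[/code]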
Awaiting Your Analysis