I have two NVIDIA RTX A4000 GPUs that I'm looking to sell. They both work great, and I've used them to learn a fair bit about AI and large language models.
Each has 16GB of VRAM; together that's enough to run 30B-parameter models competently.
They are single-slot GPUs, and each requires a single 6-pin PCIe power connector.
If you need something compact for media work, these are also a good option, offering high performance for their size and power draw.