I haven’t looked into paperless-ai yet, but I hope my machine is beefy enough for the task
You need a GPU with a decent amount of VRAM to run LLMs well locally. I don’t have a new enough GPU to be useful - my server only has the Intel iGPU, and my desktop PC only has a GTX 1080, which predates Nvidia adding Tensor cores for AI workloads.
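As a rough sanity check on the “decent amount of VRAM” point, here’s a back-of-envelope sketch (assuming PyTorch with CUDA is installed; the params-times-bytes-per-weight estimate plus ~20% overhead is just a rule of thumb, not an exact requirement):

```python
import torch

def fits_in_vram(n_params_billion: float, bytes_per_weight: float = 0.5) -> bool:
    """Rough check of whether the first CUDA device can hold the model weights.

    bytes_per_weight: ~2.0 for fp16, ~1.0 for 8-bit, ~0.5 for 4-bit quantization.
    """
    if not torch.cuda.is_available():
        # iGPU-only boxes fall back to slow CPU inference
        print("No CUDA GPU detected.")
        return False
    total = torch.cuda.get_device_properties(0).total_memory
    # Weights plus ~20% headroom for KV cache and runtime overhead
    needed = n_params_billion * 1e9 * bytes_per_weight * 1.2
    print(f"{torch.cuda.get_device_name(0)}: {total / 1e9:.1f} GB VRAM, "
          f"~{needed / 1e9:.1f} GB needed")
    return total >= needed

# A 7B model at 4-bit quantization roughly fits on an 8 GB card like a GTX 1080,
# but without Tensor cores it will still be slow.
print(fits_in_vram(7, bytes_per_weight=0.5))
```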
This depends a lot on whether your employer is any good. I get 20 days of bereavement leave per year for close family (spouse, kids, parents) and 10 days for extended family (grandparents).