@Ruth_Mottram Stability AI, the company behind #StableDiffusion, have released #StableLM, which you can run at home. For it to give decent results you have to use the really large checkpoint models, and those need a lot of VRAM. I'd recommend at least an RTX 3090 with 24 GB of VRAM unless you've got an Nvidia CUDA server setup.
An acquaintance has been running it on his PC to generate uncensored fan fiction. With #ChatGPT you're limited to OpenAI's filtered model and API; running StableLM yourself, the only real limitation is needing a beefy gaming PC.
I haven't tried running it myself since I've only got an RTX 2080 Ti with 11 GB of VRAM. From what I've seen in YouTube videos, the smaller models don't give good results at all.
https://github.com/Stability-AI/StableLM
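If you want to see roughly what running it at home looks like, here's a minimal sketch using the standard Hugging Face transformers API. I'm assuming the stabilityai/stablelm-tuned-alpha-7b checkpoint name and the usual generate() workflow; check the repo README for the exact model IDs and prompt format.

```python
# Minimal sketch: load a StableLM checkpoint locally and generate text.
# Assumes the stabilityai/stablelm-tuned-alpha-7b checkpoint id; the 7B model
# in fp16 still wants well over 11 GB of VRAM, hence the 24 GB card recommendation.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "stabilityai/stablelm-tuned-alpha-7b"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)
model = model.cuda()  # needs a CUDA GPU with enough VRAM

prompt = "Once upon a time,"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```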