Thread:Project:Support desk/Gigabytes of data taken up by MySQL database for mediawiki installation

I would give you my site address, but I don't think it will help, because the site has been shut down: the database grew too large and was affecting other customers on the host (they allow up to ~3GB per database, and mine is over 80GB at present). http://ngwiki.culex.us/

Now, I don't know jack about MySQL; I can only access the cPanel-type tools on my hosting account. My only guess is that persistent spambots caused the bloat: I had open registration with edit-own-talk-page-only permissions, a dozen or so accounts were registering every day, and they were filling their personal pages with text ads. I don't know for sure, my hosting company can't help me, and I can't download the database because it's simply too huge.
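In case it helps whoever answers: from what I've read, MediaWiki keeps accounts in a `user` table with a `user_registration` timestamp, so a query like this (untested by me; the table and column names assume a default MediaWiki schema, and the cutoff date is just a made-up example) might show how many accounts the bots created:

```sql
-- Count accounts registered after a hypothetical cutoff date;
-- user_registration is a MediaWiki timestamp of the form YYYYMMDDHHMMSS.
SELECT COUNT(*) AS new_accounts
FROM user
WHERE user_registration > '20230101000000';
```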

I don't want to lose the site's data, though. I don't care about the user list, since I'm the only one with editing rights on the wiki, but people do visit and use it regularly, and I have no backup; any copy of the one database, _mdw1, would again be 80GB.

Is there a way to optimize, prune, or otherwise clean up the database to make it small again? Why is it so huge when my wiki is literally just text and a logo?
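Since I can reach a phpMyAdmin-style SQL window through cPanel, I'm guessing something like this would at least show which tables are eating the space (untested, and it assumes the database really is named `_mdw1` as above):

```sql
-- List the ten largest tables in the wiki's database by data + index size, in MB.
SELECT table_name,
       ROUND((data_length + index_length) / 1024 / 1024) AS size_mb
FROM information_schema.TABLES
WHERE table_schema = '_mdw1'
ORDER BY (data_length + index_length) DESC
LIMIT 10;
```

If one table (the page text, a cache table, whatever) turns out to hold nearly all 80GB, that would at least narrow down what needs deleting.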

I'm sorry I can't describe the problem in more detail. The wiki runs the current stable (non-beta) build of MediaWiki. I edit it about three times a month. There is no archive.org copy of ANY of the data anymore, and I can find nothing on Google about database sizes this huge.