Ventrilo destroyed
...friggin' ouch!
A hacker (a rather nasty one) hit -all- Ventrilo servers, and even the backups were truncated, so now no Vent. Go to zebgames.com for more info. Personally, I think this was an inside job.
ZebGames suffered a catastrophic data loss. We attempted to restore using our daily backup, but the backups were apparently truncated each time and thus corrupted. Next we contacted the developer of our billing system to attempt a partial restore from the corrupted backups, but they were unable to restore usable data. Unfortunately, this has effectively put ZebGames out of business. We have kept the services online for the past month and a half without billing clients while attempting to get everything restored, hoping we could at least send an email to our client list about the issue. We are very saddened by this and by the fact that we were not able to correct the problem or contact our customers to let them know. Our email database was also lost in the same data loss. It was a great 8 years and we appreciated all of you. We hope you find a wonderful new hosting provider and wish you all the best.
Triumph: White Succubus: 50 Ill/Emp/PF | Snow Globe: 50 Ice/FF/Ice | Strobe: 50 PB | Shi Otomi: 50 Ninja/Ninjitsu/GW Stalker | My other characters
Right. Thanks for the correction. My brain is just getting started and this is what I'm waking up to. I'm floored.
Ignoring anyone is a mistake. You might miss something viral to your cause.
Speeding Through New DA Repeatables || Spreadsheet o' Enhancements || Zombie Skins: better skins for these forums || Guide to Guides
This is why I always tell clients, "If you don't test-restore your backups once in a while, YOU DON'T HAVE A BACKUP."
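A test restore doesn't have to be fancy, either. Something along these lines is enough (a minimal sketch in Python; the archive path and layout are made up, so substitute whatever your backup job actually produces):

# Minimal test-restore sketch (hypothetical paths): pull last night's archive
# into a scratch directory and sanity-check what actually comes back out.
import tarfile
import tempfile
from pathlib import Path

BACKUP = Path("/backups/nightly/crm-latest.tar.gz")  # made-up location

def test_restore(archive: Path) -> None:
    with tempfile.TemporaryDirectory() as scratch:
        with tarfile.open(archive, "r:gz") as tar:
            tar.extractall(scratch)  # restore somewhere harmless, never over live data
        restored = [p for p in Path(scratch).rglob("*") if p.is_file()]
        if not restored:
            raise SystemExit("TEST RESTORE FAILED: archive extracted to nothing")
        empties = [p for p in restored if p.stat().st_size == 0]
        if empties:
            raise SystemExit(f"TEST RESTORE SUSPECT: {len(empties)} zero-byte files")
        print(f"OK: {len(restored)} files restored, none of them empty")

if __name__ == "__main__":
    test_restore(BACKUP)

Run it on a schedule and actually read the output; "the backup job said OK" is not the same as "the data comes back".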
One of my clients, a few years ago, had their local techs pull something like this on them.
The app we were supporting for them was an old dBase CRM that used direct file access. So, invariably, someone would leave it open every night.
Because the local tech company was too stupid to have these guys just buy a live file module for their backup software, they were getting "failure" reports on the backup every night.
So they "fixed" it.
By taking the application's directory out of the backup routine...
*6 Months Later*
The customer suffered a catastrophic server failure. They got everything else restored, but when they went to the CRM, several tables that had been live when the server **** the bed were hollowed out: 0-byte files.
So we're like, well, THAT sucks. Let's roll back to the last backup before the crash. Hey...where are your backups?
That's when the whole ugly mess came to a head. LUCKILY these guys still had a 6-month-old copy on the laptop of one of their sales guys, who never synced. We were able to get the data moved back to the server and get them up and running again, minus all their current data. Still, a damn sight better than losing their ENTIRE customer database...
This little career-limiting move saw the local tech outfit bounced out the front door and told, pointedly, that they were blacklisted from EVER working for this company or any of its subsidiaries again.
For some reason, a few of my friends still have working services.
Kudos to them and anyone else who survived.
Ignoring anyone is a mistake. You might miss something viral to your cause.
Reading this makes me think of all the times I was disappointed with issues in this game. Now... I count my blessings...
Ignoring anyone is a mistake. You might miss something viral to your cause.
Is something occasionally lost?
Sure! It's computers. It happens.
But in most cases, you can get the stuff back in very short order.
This brings me back to a day... long ago... where I was semi-forced to 'upgrade' a Novell server to a Windows NT Server... and the SCSI card failed... and the new drive got a little trashed... and when the old Novell was put back on, the Novell backup WASN'T WORKING (not my fault) and they lost about 20 percent of their data...
I scrambled my *** off to keep what they had, that 80%, but a lot of the 20% was pretty new stuff.
Had a big meeting, things were said, bottom line, I said I didn't want to do it that way in the first place, you cheapskates... it was a really bad deal.
And I swore to NEVER, EVER, upgrade ANY server ever again. And I haven't. Direct replacement, and KEEP THE OLD ONE FOR A MONTH, to ensure the new one is working, THEN decommission/retask the old one.
This really sucks for all affected, including the company. Man.
---
Secondary thoughts...
The backup got hosed? Wow.
Then again, if you're backing up bad data, and don't have weekly, monthly backups behind it...
Snowball. Ugly.
Geez, I can imagine, but I can't imagine, know what I mean?
August 31, 2012. A Day that will Live in Infamy. Or Information. Possibly Influence. Well, Inf, anyway. Thank you, Paragon Studios, for what you did, and the enjoyment and camaraderie you brought.
This is houtex, aka Mike, signing off the forums. G'night all. - 10/26/2012
Well... perhaps I was premature about that whole 'signing off' thing... - 11-9-2012
Also, just throwing this out there, as I used to work at an off-site storage company: if you're sending media off site for storage, it's a good idea to keep track of what you send.
We had one client that, three times a week, sent 1-5 CDs/DVDs, each in a jewel case (along with standard tape backup cartridges). We dutifully labeled and stored them, until one day we ran out of shelf space for their optical media (it took over an entire 9-foot-high, 5-foot-wide rack).
The kicker? They didn't even know what was on those disks, never called 'em back, or ordered their destruction. We ended up charging them for another shelf.
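Even a dumb append-only ledger would have answered "what's on those disks?" for them. A tiny sketch, with a made-up file name and made-up columns:

# Hypothetical off-site media ledger: append one row per item shipped,
# so you can always answer "what's on that shelf and when did it go out?"
import csv
from datetime import date
from pathlib import Path

LEDGER = Path("offsite_media_ledger.csv")  # assumption: a shared ledger file

def log_shipment(media_id: str, contents: str, retain_years: int = 7) -> None:
    new_file = not LEDGER.exists()
    with LEDGER.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["date_sent", "media_id", "contents", "retention"])
        writer.writerow([date.today().isoformat(), media_id, contents,
                         f"{retain_years}y"])

if __name__ == "__main__":
    log_shipment("DVD-0042", "weekly CRM export")  # example entry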
"My inner mind has become a reality-cracking overgod. He torments me! Help!"
We run our own TeamSpeak server for The Conclave. And yes, we do have offsite backups in several countries (mainly the website, database and forums).
And just making backups isn't good enough if you're a company. I once saw a company go bankrupt after their office was broken into and the servers were stolen. They made backups from one machine to the other, but had no off-site backups.
And yes, you need to check if your backups actually work. Otherwise it's a lot of money thrown away for nothing. As is clear from this story, it can be the end of your company.
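For what it's worth, the "check that your backups actually work" part can start as small as hashing both copies after the sync. A rough sketch with made-up paths, assuming the off-site copy lands somewhere you can read back:

# Rough sketch: hash the local backup, hash the off-site copy, and refuse to
# call it "backed up" until the two digests match. Paths are hypothetical.
import hashlib
from pathlib import Path

LOCAL = Path("/backups/forum-db.sql.gz")        # made-up local backup
OFFSITE = Path("/mnt/offsite/forum-db.sql.gz")  # made-up synced/mounted copy

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
            h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    if sha256(LOCAL) != sha256(OFFSITE):
        raise SystemExit("MISMATCH: off-site copy does not match the local backup")
    print("Off-site copy verified")

It won't prove the dump itself is restorable (that's what a test restore is for), but it does catch truncated or half-copied files before you need them.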
Reminds me of one of the external dealers we purchase contracts from. I got a phone call: "We've had a virus wipe out the data on our PC. Can you reinstall the software?"
You've got to love companies that don't keep their AV up to date.
I talked them through reinstalling the system until we got to the point where I asked them to copy their database back from their backup, only to be told they didn't have one. So they got stuck starting with an empty database, having just lost about 8 years' worth of sales leads.
That triggered our dealership department sending out a notice reminding all the dealers of the importance of backups.
Mind of Gaia lvl 50 Defiant's first Mind/Storm 'troller.
Deadly Doc 50 Dark/Dark Corr
and lots more on Pinnacle, Union and Defiant
"Your backup is only as good as your ability to restore it", a memorable quote my mentor shared back in the 90s when I first started consulting.
@Texarkana
@Thexder
What's the line?
Redundant backups are rather redundant until the primary AND secondary fail.