not only did the host's primary server crash, but the mirror server which holds the monthly backup also got fried. i don't know all the technicalities of it, but they're trying to restore the monthly backup stuff right now. we'll see what happens.
Comments
It's being worked on, I believe. They scheduled a downtime and let folks know it would be down.
Nah, I think Ed is updating the site.
He said the board would be out for some time.
Peace
BRRRRRRRRRRRRRRRRR
oh shit did you (we?) just lose everything after march 3rd?
better get the google cache searches going...
It's possible. They are telling Ed that our stuff MIGHT be backed up, but I guess they lost a lot of backed up stuff too. Right now it's looking about as bad as it could look.
i guess the mirror server crashed during the backup of the month of march
it's still bad, but it's not as bad as i thought
this is our only hope for posts in the month of march
oofers.
do I need to re-register to the forums?
if you guys are using phpBB, I can probably hook you up with a new server. My friend runs a hosting co. and will do you well.
please. pm me doode
How about a "wha' happened?" graemlin?
Oh, okay, great, I see how it works. The $$$ I thought I was giving for mix CDs and site upkeep was really for Ed and Crink's beer and records. FUCKING GREAT!!!!!!!!!!
Seriously though, that's a real bummer. Hatin' it for those who lost work they had contributed. Let the peeps know what needs to be done to get 'er up and runnin' again at full functionality.
for those who posted: if you remember each record you posted, type "waxidermy" and the album title into a google search & click the "CACHED" link when the results show up to recover your post text. then you can go in and re-post the review and the images/soundclips.
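If you'd rather script that than click through results by hand, here's a rough sketch in Python. The webcache.googleusercontent.com URL pattern and the regex are guesses at how Google exposes cache links (it changes over time, and Google may rate-limit automated searches), so treat this as a starting point, not a guaranteed recipe:

```python
import re
import urllib.parse
import urllib.request

def cached_links(album_title: str) -> list[str]:
    """Search Google for a Waxidermy post and return any cache links found."""
    query = urllib.parse.quote(f"waxidermy {album_title}")
    req = urllib.request.Request(
        f"https://www.google.com/search?q={query}",
        headers={"User-Agent": "Mozilla/5.0"},  # Google rejects the default urllib agent
    )
    with urllib.request.urlopen(req) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    # Cache links have historically looked like this; the exact pattern
    # is an assumption and may need updating.
    return re.findall(
        r'https?://webcache\.googleusercontent\.com/search\?q=cache:[^"&]+', html
    )

for link in cached_links("some album title"):
    print(link)
```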
hahahaha!!!!
laughing helps me fight back the tears.
ouch
thanks for the heads up
why did it have to go down in the first place? is this something the web host told you?
Dear Customer,
On March 30th, 2006 we received a mail from our data center's admins stating that there was a problem with one of our servers, Athena, which was showing a red light on their monitoring system. Upon further investigation they found that one of the memory modules was corrupted, and it was replaced immediately. Later on they also discovered that the secondary disk in our RAID array, which was used as a mirror of the primary disk, had gone bad as well. As this disk was not in stock, they ordered it to be added to the machine.
On April 5th, they informed us that the new disk had arrived and scheduled the installation for the next day, noting that installation is fairly simple: just add the disk to the machine and let the RAID controller rebuild the array (mirror the disk) while the server is online. This process works in the background and would not get in the way much, as the server would remain fully operational throughout.
But after the drive installation, the controller malfunctioned and destroyed the RAID volume, taking all the data with it. Unfortunately, the system froze during the rebuild and refused to respond, and all disk activity ceased as well. At this point I was forced to reboot the server. Upon reboot, I can no longer boot from the array and gain access to the OS, despite the array appearing intact. I've attempted to recover via the rescue CD, but I can't access the array by that means either.
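For context, what the host describes is a mirror rebuild running in the background. Here's a minimal sketch of how you'd watch that from the OS side with Linux software RAID (md); their box apparently used a hardware controller, so this is an analogy, not their actual setup:

```python
def rebuild_status(mdstat_path: str = "/proc/mdstat") -> str:
    """Report whether a Linux md array is currently rebuilding a mirror."""
    # While a replacement disk is being mirrored in the background,
    # /proc/mdstat shows a progress line like:
    #   [=>...................]  recovery =  7.5% (...)
    with open(mdstat_path) as f:
        for line in f:
            if "recovery" in line or "resync" in line:
                return line.strip()
    return "no rebuild in progress"

if __name__ == "__main__":
    print(rebuild_status())
```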
Your web server folks don't back up??
RAID is redundant, but not infallible.
* 100000
PS: I can show you how to back up data from MySQL using phpMyAdmin. You can even automate it.
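For the "you can even automate it" part, here's a minimal sketch of the same idea done with mysqldump instead of phpMyAdmin's export screen. The database name, credentials, and output directory are all placeholders:

```python
import datetime
import gzip
import subprocess

DB = "phpbb"            # hypothetical database name
USER = "backup_user"    # hypothetical credentials
PASSWORD = "secret"

def dump_database(out_dir: str = "/var/backups") -> str:
    """Dump one MySQL database to a dated, gzipped SQL file."""
    stamp = datetime.date.today().isoformat()
    out_path = f"{out_dir}/{DB}-{stamp}.sql.gz"
    # mysqldump writes the whole database as SQL statements on stdout.
    result = subprocess.run(
        ["mysqldump", f"--user={USER}", f"--password={PASSWORD}", DB],
        check=True,
        capture_output=True,
    )
    with gzip.open(out_path, "wb") as f:
        f.write(result.stdout)
    return out_path

if __name__ == "__main__":
    print("wrote", dump_database())
```

Drop it into cron as a nightly job and copy the dumps to a machine other than the web server itself; keeping the only backup on the same box is exactly what bit this thread.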
i did one phpmyadmin backup in mid-march (at least i think i did; i had never used it before, so the whole thing was new to me) and was planning on doing another in april. guess i didn't act fast enough. oof
i'm being told right now that they are in the process of recovering ALL data.
let's hope it works
NICE!