Author Topic: Removing all drives.  (Read 6275 times)

bilbus

  • Calf
  • *
  • Posts: 11
Removing all drives.
« on: March 02, 2009, 03:28:40 PM »
   

OK, so I have two objectives.

 

A. Upgrade the 4x 250 GB drives to 4x 1 TB. (Done.)

 

B. Document and test recovery from a failed RAID array. I was under the impression that Linux is somehow installed on the drives (I hope not). Since this will be our new onsite backup location, I want to make sure recovery is quick in case of a failure.

 

Does anyone know whether the firmware (Linux?) is stored on flash or on the hard drives?

 

So I have removed all four drives and replaced them with four new, larger drives.

 

I removed them all to accomplish objective B.

 

After loading all the new drives, the device fails to boot. It seems to be in EM (emergency) mode. It also seems to have a DHCP address as well as 192.168.11.150.

 

I set up a TFTP server, and the device tries to connect to it at 192.168.11.1. It also seems to be looking for an image of some kind; I put the image in the root of the TFTP server and the device appears to try to load it, but it doesn't seem to work.
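In case it helps anyone else trying this: EM mode appears to fetch its image over plain TFTP, which is a simple, well-documented protocol (RFC 1350). If you want to see exactly which filename the box requests, a throwaway read-only TFTP server like the Python sketch below will log it. This is strictly a sketch, not a replacement for a proper TFTP daemon, and the /srv/tftp path is just an assumption.

import os, socket, struct

# Minimal read-only TFTP server (RFC 1350) -- handy for watching exactly
# which file the NAS asks for in EM mode. Run as root, since udp/69 is
# privileged. ROOT is an assumption: point it at the directory holding
# the firmware image. Simplifications: a real server answers each
# transfer from a fresh ephemeral port and retransmits lost packets;
# this sketch answers from port 69 and assumes a quiet LAN.
ROOT = "/srv/tftp"
OP_RRQ, OP_DATA, OP_ACK, OP_ERROR = 1, 3, 4, 5

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 69))
print("tftp: listening on udp/69, serving", ROOT)

while True:
    pkt, client = sock.recvfrom(4096)
    if struct.unpack("!H", pkt[:2])[0] != OP_RRQ:
        continue                      # ignore writes and stray packets
    # RRQ payload is: filename NUL mode NUL
    name = pkt[2:].split(b"\0")[0].decode()
    print("tftp: %s requested %r" % (client[0], name))
    path = os.path.join(ROOT, os.path.basename(name))
    if not os.path.isfile(path):
        sock.sendto(struct.pack("!HH", OP_ERROR, 1) + b"file not found\0", client)
        continue
    with open(path, "rb") as f:
        block = 1
        while True:
            chunk = f.read(512)
            # 16-bit block counter wraps for files over 32 MB
            sock.sendto(struct.pack("!HH", OP_DATA, block & 0xFFFF) + chunk, client)
            sock.recvfrom(4096)       # wait for the client's ACK
            block += 1
            if len(chunk) < 512:      # short final block ends the transfer
                break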

 

I also ran the firmware recovery tool, with similar results.

 

I am also unable to access the web interface at either address (DHCP or default IP), though I could before failing the array.

 

Does anyone have documentation for this recovery?

 

Also, does the device normally function if drives are removed or the array has failed?

 

Last, does anyone know how the RAID is implemented? Is it a software or hardware solution?


bilbus

  • Calf
  • *
  • Posts: 11
Re: Removing all drives.
« Reply #1 on: March 03, 2009, 09:30:01 PM »
   

Anyone?


bilbus

  • Calf
  • *
  • Posts: 11
Re: Removing all drives.
« Reply #2 on: March 04, 2009, 10:25:48 AM »
   

Come on, help please.


GeorgiaRebel

  • Calf
  • *
  • Posts: 4
Re: Removing all drives.
« Reply #3 on: March 04, 2009, 04:03:18 PM »
   

I asked essentially the same question and have gotten the same silent treatment you are getting. So much for Buffalo's responsive tech support (unless you consider no response to be responsive). :smileymad:

 

- Alan


bilbus

  • Calf
  • *
  • Posts: 11
Re: Removing all drives.
« Reply #4 on: March 04, 2009, 04:13:58 PM »
   

Well, I was able to figure out a few facts that make me sad.

 

It's software RAID.

Linux is NOT stored on flash like I was told; it's on the disks. The "server" stores Linux on the hard drives, not in flash. Wow, lame.

 

When I asked support where the Linux install was located, they told me the flash. One guy also told me that the array would be fine with a loss of two disks (out of four) with RAID 5...
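For the record, RAID 5 tolerates exactly one lost disk: each stripe stores a single XOR parity block, so one missing member can be rebuilt from the survivors, but a second loss is unrecoverable. A toy sketch of the arithmetic (Python, made-up data, nothing Buffalo-specific):

from functools import reduce

# RAID 5 stores one XOR parity block per stripe, so a stripe survives
# the loss of exactly ONE member. Toy 4-disk stripe with made-up data:
d0, d1, d2 = b"\x11", b"\x22", b"\x44"              # three data blocks
parity = bytes(a ^ b ^ c for a, b, c in zip(d0, d1, d2))

# Lose one disk (say d1): XOR of all the survivors rebuilds it.
survivors = [d0, d2, parity]
rebuilt = bytes(reduce(lambda x, y: x ^ y, col) for col in zip(*survivors))
assert rebuilt == d1                                # one failure: recoverable

# Lose two disks (d1 AND d2): one parity equation, two unknowns --
# the stripe is unrecoverable. "Fine with a loss of two disks" is wrong.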

 


GeorgiaRebel

  • Calf
  • *
  • Posts: 4
Re: Removing all drives.
« Reply #5 on: March 04, 2009, 04:53:30 PM »
I was thinking of removing the drives one at a time and replacing them with larger drives.  After that I would rebuild the RAID with the new drive sizes, but not knowing how Buffalo handles the RAID (software or hardware), I am not sure that would work.

bilbus

  • Calf
  • *
  • Posts: 11
Re: Removing all drives.
« Reply #6 on: March 04, 2009, 05:11:16 PM »
   

I have heard that works fine.

 

You will have to delete the RAID array, though. You can delete it before or after you add the drives.

 

Since Linux is on each drive... what a joke. As long as one drive is alive, it's fine.


GeorgiaRebel

  • Calf
  • *
  • Posts: 4
Re: Removing all drives.
« Reply #7 on: March 04, 2009, 05:21:49 PM »
I'll order the drives and try it.  It is a screwed-up way to increase capacity.  I should have gone with a ReadyNAS.  I just upgraded the ReadyNAS from a 1 TB setup to a 2 TB system with absolutely no problems.

bilbus

  • Calf
  • *
  • Posts: 11
Re: Removing all drives.
« Reply #8 on: March 04, 2009, 05:34:17 PM »
   

I wanted to buy a new server, but I got stuck with this thing due to cost.

 

It's not even hardware RAID or hot-swappable.


bilbus

  • Calf
  • *
  • Posts: 11
Re: [Solved] Removing all drives.
« Reply #9 on: March 08, 2009, 04:28:46 PM »
   

Looks like the trick is this:

1. Keep one original drive in the NAS and add three blank, larger drives.

2. Create a RAID array; this forces the Linux OS to be installed onto the new drives.

3. Remove the original drive. If the device boots up correctly, move on.

4. Add the fourth blank drive to the NAS, delete the RAID array, and reboot.

5. It should boot. Now create a new RAID array using all four disks.

The device should be all set.
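As a sanity check: everything here points to standard Linux md software RAID under the hood. If you can get a shell on the unit (stock firmware doesn't provide one, so take this strictly as a sketch for a rooted box or any Linux machine), /proc/mdstat is the quickest way to confirm the array is healthy after each reboot:

import re

# Report degraded md (Linux software RAID) arrays from /proc/mdstat.
# A healthy 4-disk array shows [UUUU]; an underscore marks a missing
# or failed member, e.g. [UU_U].
def degraded_arrays(path="/proc/mdstat"):
    with open(path) as f:
        text = f.read()
    bad = []
    for name, status in re.findall(r"^(md\d+)\b.*?\[([U_]+)\]", text,
                                   re.MULTILINE | re.DOTALL):
        if "_" in status:
            bad.append((name, status))
    return bad

if __name__ == "__main__":
    problems = degraded_arrays()
    for name, status in problems:
        print("%s is degraded: [%s]" % (name, status))
    if not problems:
        print("all md arrays look healthy")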

 

A few things to note:

- It is software RAID.

- The bootable Linux OS is on all four hard drives. Each drive has two system partitions (even without the RAID set up); my guess is /boot and something else. So each of the four disks carries the same two partitions, and the NAS can boot off any of them (a way to check this is sketched below).

- Keep at least one of your original hard drives in storage; it can be used to rescue the device if all four drives die.
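And for the partition claim above: if you have shell access (same caveat as before), comparing the layouts in /proc/partitions is a quick check. A small Python sketch, with sda..sdd as assumed device names:

from collections import defaultdict

# Group partitions by disk from /proc/partitions to confirm that every
# member really carries the same system partitions. Device names like
# sda..sdd are an assumption; adjust for your unit.
layout = defaultdict(list)
with open("/proc/partitions") as f:
    for line in f.readlines()[2:]:            # skip the two header lines
        if not line.strip():
            continue
        major, minor, blocks, name = line.split()
        if name[-1].isdigit():                # partitions, not whole disks
            layout[name.rstrip("0123456789")].append((name, int(blocks)))

for disk, parts in sorted(layout.items()):
    print(disk, [(n, "%d MiB" % (b // 1024)) for n, b in parts])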