Reinstalling to an out-of-warranty NAS with blank drives

Started by Eastmarch, September 06, 2019, 12:25:28 PM


borgan@yahoo.com

I have a LS-QVL/R5, and the device started to hang on large uploads.  Support asked me to refresh the firmware, but the device has error E06 after the reboot.  I have gone through all of the steps listed, and am able to blank the disks, add the boot files on an ext3 partition and have the device boot in EM mode.  Any time that I attempt to add the firmware back on (goes through the partition table build, firmware load, and reboot), the device comes back up to the E06.  Can anyone give me any ideas about next steps, if any?  Thanks!
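For reference, the disk prep that gets me into EM mode looks roughly like this (a sketch from memory; the device name, label type, partition size, and mount point are all assumptions, and the boot files come out of the extracted firmware image):

```shell
#!/bin/sh
# Wraps the EM-mode disk prep in a function so nothing destructive runs
# unless you call it yourself. All values here are examples, not exact
# Buffalo-documented numbers.
prepare_em_boot_disk() {
  dev="$1"   # e.g. /dev/sda
  mnt="$2"   # a temporary mount point, e.g. /mnt
  parted -s "$dev" mklabel gpt                 # label type may differ on your unit
  parted -s "$dev" mkpart primary ext3 1MiB 1GiB
  mkfs.ext3 "${dev}1"
  mount "${dev}1" "$mnt"
  cp uImage.buffalo initrd.buffalo "$mnt"/     # boot files from the firmware zip
  umount "$mnt"
}
```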

1000001101000

Getting all the way through a firmware install without error but still ending up with 6 red blinks is certainly odd. Have you done anything to validate the health of the hard drives?
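A quick way to check, as a sketch (assumes smartmontools is installed and the drives show up as /dev/sda through /dev/sdd, which will vary by system):

```shell
#!/bin/sh
# Flag SMART attributes that commonly indicate a failing drive.
# Reads `smartctl -A /dev/sdX` output on stdin and exits non-zero if any of
# the critical counters have a non-zero raw value (column 10 of the table).
check_smart_attrs() {
  grep -Ei 'Reallocated_Sector_Ct|Current_Pending_Sector|Offline_Uncorrectable' |
    awk '$10 > 0 { print "WARN:", $2, "raw =", $10; bad = 1 } END { exit bad }'
}

# Typical use, one drive at a time (device names are examples):
#   smartctl -H /dev/sda                      # overall PASSED/FAILED verdict
#   smartctl -A /dev/sda | check_smart_attrs  # per-attribute check
```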

borgan@yahoo.com

Good call on the drives.  I checked all four, and disk #1 had SMART failures.  I swapped drives 1 and 4 to get the bad drive out of the first slot.  Unfortunately, it again failed with an E06 after the initial partition, firmware load, and reboot.  Will I need to replace that drive?  I was hoping not to until I had verified that the NAS was working (I would replace it otherwise).  Thanks!

1000001101000

I don't have a great explanation for why that keeps happening. I would guess it has something to do with the bad drive, but I would expect a failing drive to be more of a problem during the firmware install than at boot time in most scenarios.

One thing you can try is pulling that drive out and attempting to boot; if it's getting hung up on that drive for some reason, that might help.

If that still doesn't work, I'd try an install without the bad drive. If you don't have a replacement handy, you'll most likely need to trick the device into thinking it's normal to have fewer than 4 drives at install time (you can always add more later). I have a note about how to do this in the guide:
java -jar acp_commander.jar -t <device ip address> -c "echo MAX_DISK_NUM=2 >> /etc/nas_feature"

You run that with acp_commander before running LSUpdater; acp_commander can be found here:
https://github.com/1000001101000/acp-commander


borgan@yahoo.com

OK, I ran through all of that and the same E06 issue is occurring.  Is there a way to image the drives to the specific setup required?  Once the RAID partitions are dropped onto the drives, I cannot access them directly, for obvious reasons.  It seems that something in this process is either disconnected or has failed on the NAS control board.  Any other ideas on what could be done?  Thanks!

borgan@yahoo.com

BTW, I did get Disk1 mounted and can review it.  Is there anything specific that I should be looking for or ensuring is there?  Also, would using an older rev of the firmware matter?  Currently using the 1.74 version (latest).

borgan@yahoo.com

OK, I started looking at the drives in the RAID and realized something interesting.  The 1.74 firmware drops the uImage files onto the first partition as "uImage-88f5182.buffalo" and "uImage-lsp.5.x.buffalo".  When I was rebuilding the drives from scratch, I set up drive 1 with these files but had to rename "uImage-lsp.5.x.buffalo" to "uImage.buffalo" for it to work.

After the latest firmware load and E06 errors, I pulled the first two drives and checked the first partition.  The uImage-lsp filename had not been updated, and the RAID was still showing four drives instead of two.  I renamed the file on both drives and put them back into the NAS.  On boot the drives resynced, and I started getting an E16 instead.  I added the two other drives, rebooted, they all synced, and now the NAS is back up and operational.

It seems very wonky that the firmware filenames would cause that issue, especially given that it is Buffalo's own firmware creating them.
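In case it helps anyone, the rename boils down to something like this (the mount point and device names are examples, not Buffalo-documented paths):

```shell
#!/bin/sh
# Rename the kernel image on a boot partition so the bootloader finds it.
# "$1" is the mount point of the drive's first (boot) partition.
fix_uimage() {
  mnt="$1"
  if [ -f "$mnt/uImage-lsp.5.x.buffalo" ] && [ ! -e "$mnt/uImage.buffalo" ]; then
    mv "$mnt/uImage-lsp.5.x.buffalo" "$mnt/uImage.buffalo"
  fi
}

# Typical use, per drive:
#   mount /dev/sda1 /mnt
#   fix_uimage /mnt
#   umount /mnt
```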

Thanks for all of your help and hopefully this helps someone else!
B

1000001101000

That makes sense with the errors you've seen, though I still don't understand how it got that way. Usually E06 indicates that the device can access the boot volume but has an issue with the boot files themselves (it happens a lot when testing alternative OSes); E16 usually means it can't find the boot partition at all.

Not installing/renaming the boot file explains the error; it just seems extremely odd that it would happen without lots of other errors happening at the same time. Ultimately I'd still blame it on the bad drive, but exactly how that adds up is still confusing to me.

If you do get a replacement, I'd be interested in how an install with 4 known-good drives turns out for you.

borgan@yahoo.com

Drive is on the way; it should be here on Friday.  We'll see what happens then.  Thanks!

The other weird thing is that it is trying to recover old shares.  I think there may be some funk on the device that is causing issues as well.  I am completely re-initializing the device and disks to see what happens.  If that drive were really the problem, I should be seeing a degraded disk, which I am not.  I'll let you know what happens.

borgan@yahoo.com

Last post on this...  New drive arrived and did not change anything.  This is after I initialized/reset the device to factory (which did fix the E16 error issue) and wiped the drives.  The only fix for the E06 is the file rename.  Currently rebuilding the array (RAID 10) and will repopulate the device afterward.  Thanks for your help!

1000001101000

I'm glad you were able to work past it. The issue sounds sort of familiar but lots of folks have been restoring these devices without running into it.

mloh

Hi, I am new to this forum, so pardon me if this has been asked before. I have a LinkStation LS-QVL, and recently there was a power failure. Since I was not able to access it, I switched the LinkStation off and on again. The power is now on (I can see the light at the back) and the Ethernet link is up (the port LED is blinking), but there is no light indication whatsoever on the front panel or the function button. Any idea whether the motherboard is fried, or is it just not able to connect to the drives?

Your response and help is much appreciated.

1000001101000

There are a few possible explanations for this. I would start by unplugging the power adapter and plugging it back in, then verifying whether the LED next to where the power adapter connects to the device turns on in addition to the network LED. Then try turning it on with the button on the front; if it still doesn't work, you can try starting it without drives and see if that makes a difference.

A power failure could put the device in an odd state that unplugging it corrects, which has a good chance of resolving your issue. A fault in the power supply or mainboard could also cause what you are seeing. If the power supply isn't providing the normal amount of power, the device might only start without drives (which draw most of the power).

mloh

Hi,

Thanks for your reply. I tried what you suggested: I removed all the hard drives and plugged in the adapter. The green light is on after plugging in the adapter. When I press the power button on the front, the fan starts turning and the Ethernet port lights up (even though no cable is plugged in), but there are still no lights on the front. Does this mean that the mainboard has an issue?

Thanks again.


1000001101000

Some sort of mainboard or power supply issue, by the sound of it. It's hard to separate the two without additional testing. I've seen basically the same behavior from 12V supplies that were broken and only providing ~5V, but I have also seen similar behavior from broken boards. Either way, I can't think of a non-hardware issue at this point. I would say try a different power supply, but that assumes you have another 19V supply handy, which is somewhat unlikely.

If only the LEDs failed for some reason, the device might look like it was doing nothing while actually mostly working, but I doubt there's a scenario that would take out all the front LEDs without taking out more vital things as well.
