IBM DS3512/24 firmware upgrades.

Anyone had any experience with this on the latest firmware versions? Just want a little verification before I do an upgrade. Two lots of firmware to upgrade (not doing the disks): ESM and controller.

The new version of the ESM firmware says "Recommend controller firmware 7.77.18"; we're on 7.70.38. The new version of the controller firmware says "Upgrade ESM before upgrading controller firmware," but doesn't say anything about a minimum supported ESM version in any of the READMEs or text files. Installed is 319, new is 343.

Doesn't look like there are a huge number of releases in between as far as the ESM goes, but I thought I'd see if anyone else had done this recently and which order you went about it in.
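For what it's worth, when juggling these dotted version strings (7.70.38 vs 7.77.18, ESM 319 vs 343), plain string comparison can mislead. A minimal sketch of a numeric comparison, in Python; the version numbers are just the ones from the thread:

```python
# Minimal sketch: compare dotted firmware versions numerically rather than
# as strings, since "7.7.9" sorts after "7.7.10" lexicographically.

def fw_tuple(version: str):
    """Turn a dotted version like '7.70.38' into a tuple of ints."""
    return tuple(int(part) for part in version.split("."))

installed = "7.70.38"
recommended = "7.77.18"  # from the ESM readme quoted above

if fw_tuple(installed) < fw_tuple(recommended):
    print(f"{installed} is older than the recommended {recommended}")
```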

I have a DS3512 with six expansion enclosures (84 x 2TB disks) that we use as a backup-to-disk target. I am currently running 7.70.45, and it has been long enough that I should probably review the firmware updates available.

I did do a firmware update to it after it was running; it only has a couple of attached Windows boxes, and they were OK with the controllers failing over and back.

Are you considering the upgrade to resolve an issue, or just as maintenance?

I accidentally did this upgrade last week, during the middle of the day!

There's ESM firmware included with the controller firmware, if you dig for it in the .zip that you download for the controller firmware. I had no issues. We were only averaging about 85 IOPS when the upgrade happened.

That said, read very carefully, because you have to "opt in" to staging the firmware upgrade; otherwise it will just upload it and immediately begin installing it (which is why ours accidentally upgraded: someone... okay, me... misread and thought that if you kept the box checked it would stage the firmware). Clear the box and it stages; leave it checked and you go directly to firmware upgrade, do not pass go, do not collect unemployment.

We have a single DS3524, fully populated, with an EXP3512 hung off it, in a dual-controller setup.

I'll have to have a harder look, but I don't think I saw the ESM in the controller package, just FIRMWARE and NVRAM.

@Zaphod : One of the units we have keeps sending an alert saying a drive enclosure component has failed or been removed. Checked the logs, and it complains it was removed, then put back 2 or 3 seconds later. This was happening once every two or three months; now it has sent about 50 alerts over the last three days.

We've got two 3524s and eight 3512s, and just one of the 3512s is doing this.
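If it helps anyone chasing the same symptom: a quick sketch for spotting those "removed, then back seconds later" bounces in an exported event log. The log format and timestamps here are made up for illustration; a real DS3500 major event log export will need different parsing.

```python
# Sketch: flag component "removed" events that clear again within a few
# seconds, from a simplified (hypothetical) event-log export.
from datetime import datetime, timedelta

events = [
    ("2013-01-10 03:12:04", "removed"),
    ("2013-01-10 03:12:07", "inserted"),
    ("2013-01-12 14:55:30", "removed"),
    ("2013-01-12 15:30:00", "inserted"),
]

FLAP_WINDOW = timedelta(seconds=10)

def find_flaps(events):
    """Pair each 'removed' with the next 'inserted'; report quick bounces."""
    flaps = []
    last_removed = None
    for ts, kind in events:
        t = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
        if kind == "removed":
            last_removed = t
        elif kind == "inserted" and last_removed is not None:
            if t - last_removed <= FLAP_WINDOW:
                flaps.append(last_removed)
            last_removed = None
    return flaps

print(find_flaps(events))  # first pair flaps (3 s apart), second does not
```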

@Zaphod : One of the units we have keeps sending an alert saying a drive enclosure component has failed or been removed. Checked the logs, and it complains it was removed, then put back 2 or 3 seconds later. This was happening once every two or three months; now it has sent about 50 alerts over the last three days.

We updated our DS4700s because of this. It's not fun getting a SAN alert at 3 a.m., then logging in to find out the error cleared the very next second. The funny part is IBM replaced three power supplies before word came down that it was a bug in the firmware.

@chalex : Been through the docs a dozen times now. The section on updating the ESM in the controller docs says to check the readme files distributed with the firmware, and the only text file in the directory that actually contains the firmware is exactly the same document.

Looking at the SSIC at the moment to see if I can navigate my way through it. There's a brief firmware page covering the whole DS3xxx series which mentions that this stuff originally shipped with 7.70 and that new units ship with 7.77, but nothing about upgrading from 7.70 to 7.77, and that may be the problem. I might have to find the email from IBM asking us to do the upgrade and see whether they wanted us to go from 7.70 to 7.77, or just to the latest 7.70.x.

Edit: Confirmed, IBM want us to go from 7.70 to 7.77. The SSIC has everything you wanted to know, except ESM version compatibility.

Uh huh... The fixes guide says it should have something else in the problem summary so I'd know whether it's battery- or cache-related. Nothing there. All the various subsystems check out. Googling turns up one page on IBM's website which says this can happen with firmware 7.77.18 and 7.77.19 after a firmware upgrade, where the system runs a 24-hour battery test. Fixed in 7.77.20... which I just installed.

Upgraded ESM, just to make sure it was really broken. No change. Nothing in the logs about battery tests either.

Currently contemplating a 24-hour wait to see if the battery tests succeed and the issue clears, or upgrading to the newly minted 7.77.34.

@Zaphod : One of the units we have keeps sending an alert saying a drive enclosure component has failed or been removed. Checked the logs, and it complains it was removed, then put back 2 or 3 seconds later. This was happening once every two or three months; now it has sent about 50 alerts over the last three days.

We updated our DS4700s because of this. It's not fun getting a SAN alert at 3 a.m., then logging in to find out the error cleared the very next second. The funny part is IBM replaced three power supplies before word came down that it was a bug in the firmware.

Yeah, I was having this issue, and the new firmware seems to have solved it. Also, I moved to 7.77.20, and it's been stable.

And sorry, I was mixing up the NVSRAM and the ESM. The NVSRAM was buried in the .zip for the controller firmware. We had our ESM upgraded last spring (version 4.10, from Q1 last year), so there wasn't a need on my part to upgrade it, and all of our drives are running the newest firmware, so I didn't have to do anything with that.

For those interested: after doing this "the wrong way" (i.e., not deploying the version of the ESM that lists a newer version of the controller firmware as a requirement before updating the controller firmware) and having the write-back cache forcibly disabled on me, I've now gone and done it pretty much as documented, and it has just worked.

I am staging the Controller firmware and NVSRAM then activating it.

Notes:

1. Upgrade Storage Manager to the current version first.
2. Don't use the firmware updater at the Enterprise level. That's for updating major firmware versions across multiple inactive storage subsystems. Log into each storage subsystem and update from there.
3. If you look at the About information for the Storage Manager version, you may find it differs between Enterprise and Subsystem. This is fine and will change once the subsystem is updated.

Steps :

1. Run SMclient.
2. Pick a subsystem to upgrade, e.g. DS3512-3, and double-click on it to launch the Subsystem Manager.
3. Select Advanced -> Troubleshooting -> Support Data -> Collect…
4. Enter a filename to save to. I've gone with subsystem_name-pre-upgrade.zip, e.g. DS3512-3-pre-upgrade.zip.
5. Hit Start; this takes about 1-2 minutes. The progress bar almost doesn't move initially, then zips through the remainder.
6. Check the current firmware levels: Advanced -> Maintenance -> Firmware Inventory…
7. Select Advanced -> Maintenance -> Download -> ESM Firmware…
8. Tick the box to select all units. Locate the firmware in the ESM firmware directory. Click Start, then type "yes" in the box.
9. Wait about 1-2 minutes per unit.
10. Check the firmware levels again. ESM firmware is reported at the bottom of the report window.
11. Save and then clear the existing event logs.
12. Select Advanced -> Maintenance -> Download -> Controller Firmware.
13. If you didn't do step 11 and there are errors in the logs, at this point you'll be told there are errors in the logs and that you need to clear them before you can proceed with the updates.
14. Select the controller firmware.
15. Select the box marked "Transfer NVSRAM file with controller firmware" and select the dual-controller firmware file.
16. Select the box marked "Transfer files but don't activate them (activate later)" and hit Transfer.
17. Click OK when prompted to confirm transfer then activation.
18. Transfer is approx. 28MB; activation will be marked as postponed.
19. When the transfer is successful, click Close.
20. Check the firmware levels; you should see that the old firmware is active and the new firmware is pending.
21. Select Advanced -> Maintenance -> Activate Controller Firmware.
22. Confirm the pending versions are correct and click Yes. PANIC!
23. Activation may take up to 10 minutes. During the activation process, running dmesg on the GPFS nodes should show the multipath daemon complaining about paths becoming unavailable. mmlsnsd -Lv may report errors if one of the GPFS NSDs becomes unavailable; if everything goes well, you won't see anything happen here. mmdf may also report errors.
24. After activation, click OK. The Subsystem Manager may quit, as the change in firmware brings a newer version of the Storage Manager with new interfaces.
25. Relaunch the Subsystem Manager and check the firmware levels again.
26. Proceed to the next subsystem.

There you have it. A simple 26-step process. Yell out if you want me to clarify anything.
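For anyone who'd rather script it, the same flow can be sketched as SMcli script commands. Big caveat: the command syntax below is from memory of the DS Storage Manager CLI reference and is not verified, so treat every command string, the management IP, and the filenames as assumptions to check against the CLI guide before running anything. The snippet only assembles and prints the invocations; it doesn't execute them.

```python
# Sketch only: assemble (but don't run) SMcli invocations mirroring the GUI
# steps above. All command syntax here is unverified/from memory -- check it
# against the DS Storage Manager CLI reference first.

SUBSYSTEM = "192.168.1.50"  # hypothetical management IP of the subsystem

def smcli_argv(script_command: str):
    """Build the argv for running one script command via SMcli -c."""
    return ["SMcli", SUBSYSTEM, "-c", script_command]

steps = [
    # Collect support data first (steps 3-5 above).
    'save storageSubsystem supportData file="DS3512-3-pre-upgrade.zip";',
    # Stage controller firmware and NVSRAM without activating (steps 12-19).
    'download storageSubsystem firmware file="controller.dlp" '
    'NVSRAM="nvsram.dlp" activateNow=FALSE;',
    # Activate the staged firmware once ready (steps 21-23).
    "activate storageSubsystem firmware;",
]

for cmd in steps:
    print(" ".join(smcli_argv(cmd)))
```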


Doing a huge necro on this thread (sorry, but it has heaps of good information).

I can't work out whether you need to do the ESM update first, as stated, or the controller. We are on 7.83 and need to go to at least 7.84 to support VMware 5.1. I was going to jump to 7.87, but I don't know if the ESM needs to be done first. Very hard to work out from the readme notes.

Somewhere in the ESM or controller notes there should be a suggestion of which requires which. I think you should upgrade the ESM before the controller, and that's what Matt's post above suggests.

For disk firmware, if you have a full downtime, you might as well do the extra upgrade; it just takes an extra 5-60 minutes. If you don't get a full downtime, don't worry about it. It's just that if something goes wrong, IBM support can say "upgrade to the latest disk firmware and see if that resolves the issue."

Disk firmware wise, I've never done it, with any brand. We've had it all running in production for about 1.5 years now, so I'm sure a problem would have cropped up by now. Was just curious if anyone else has done it.
