Provisioning Itanium Blades in 2020

For a long time I've wanted to try programming on an Itanium. With Intel having called last orders (Windows Server support mostly ended in January, and both gcc and Linux are dropping support), the price of these servers has finally dropped out of the stratosphere, and I've picked up a used i4 with four BL890 blades in a c7000 enclosure.

Setting up the HP BladeCenter

The HP BladeCenter is managed through the iLO web interface; the IP address can be set on the little screen on the front of the rack.
First impressions were mostly of HP BladeCenter Administrator pain, as it only works in Internet Explorer due to Silverlight, Java, .NET, and ClickOnce dependencies. Even with the latest version 4.4 of the Onboard Administrator firmware, Java required the exceptions below to allow use of the remote serial console and remote DVD drive:
Add file and socket access exceptions to C:\Program Files (x86)\Java\jre1.8.0_241\lib\security\java.policy, inside the default grant block:
grant {
    permission java.io.FilePermission "<<ALL FILES>>", "read, write, execute, delete";
    permission java.net.SocketPermission "*", "accept, connect, listen, resolve";
    permission java.util.PropertyPermission "*", "read, write";
    permission java.lang.RuntimePermission "*";
    permission java.awt.AWTPermission "showWindowWithoutWarningBanner";
};
Remove rsa-md5 signing from jdk.jar.disabledAlgorithms in C:\Program Files (x86)\Java\jre1.8.0_241\lib\security\
Add the iLO URL to the Java exception site list.
This was enough for me to start working with the BladeCenter, but I still haven't figured out how to get the Integrated Remote Console (IRC) to work; the HP iLO Standalone Console does the same job, however.

Configuring the HP Smart Array P410i Controller

There are two ways to configure RAID here: with the saupdate.efi tool, available through an HP support contract, or with ORCA. As I don't have HP support, I only know ORCA. Unfortunately HP has removed all of the documentation I've seen referenced on the forums, but I've mostly pieced it together. We need to get the driver and device numbers for the arrays and plug them into drvcfg like so:
EFI> drivers
            T   D
D           Y C I
R           P F A
V  VERSION  E G G #D #C DRIVER NAME                         IMAGE NAME
== ======== = = = == == =================================== ===================
98 00000354 B X X  1  1 Smart Array SAS Driver v3.54        MemoryMapped(0xB,0x
EFI> devices
C  T   D
T  Y C I
R  P F A
L  E G G #P #D #C Device Name
== = = = == == == =============================================================

B4 B X X  1  2  1 Smart Array P410i Controller
EFI> drvcfg
98 B4 ...
EFI> drvcfg -s 98 B4
[Orca menu]
From here we can enable/disable RAID and manage the arrays. Once we are done, we can reload the partitions into EFI with:
EFI> reconnect -r
EFI> map -r
Annoyingly, I've found myself doing this regularly, as the installers all fail if there is anything but a fresh RAID array.
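To check that a rebuilt array actually came back after the reconnect, the mapped filesystems can be inspected from the EFI shell. The device names below are illustrative; yours will differ depending on your layout:

```
EFI> map -r
EFI> fs0:
fs0:\> ls
```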

The only other note on the BladeCenter is that it gets its system time from the installed operating system. The way it decides whether your login session has timed out appears to be based on the difference between your local time and the time on the BladeCenter you logged in to. Thankfully it doesn't check this often, but it does make it impossible to create a local account on the iLO without first setting up an operating system with the correct time, or adjusting your own system clock to 1970.
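My mental model of that expiry check is sketched below; the 30-minute timeout and the exact comparison are my assumptions, not HP's actual implementation, but it shows why a 1970 blade clock logs you out instantly:

```shell
# Hypothetical model of the session-expiry check described above.
blade_time=0              # blade clock sitting at the 1970 epoch
local_time=1600000000     # your workstation's clock in 2020
timeout=1800              # assumed session timeout in seconds
age=$((local_time - blade_time))
if [ "$age" -gt "$timeout" ]; then
    echo "session expired"   # ~50 years of "session age", so instant logout
fi
```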

Installing an operating system

There is not much choice of operating system for the Itanium. Officially, only HP-UX, OpenVMS, and Windows Server 2008 are listed as supported on the remote serial connection page, but there is also:
RHEL 5.11
Gentoo Linux
Debian 10.0.0
SLES 11 (End of general support)
FreeBSD 9.3 (EOL)
Debian 7.11 (EOL)
Ubuntu 8.0.4 (EOL)
Windows XP Beta (EOL)

All of the installers use serial output rather than VGA; if you're watching the VGA console, they all appear to freeze after loading the ramdisk.

The OpenVMS installer works without issue.
The RHEL 5.11 installer works without issue.
The SLES 11 installer doesn't reach a menu and instead gives unending DMAR and DRHD faults. When I reached out to SUSE support, they would not support this without an LTSS subscription.
The Debian 10.0.0 installer does not work. There are many newer ISOs to try, but none of them install; I suspect the earlier ones fail due to missing drivers for the virtual CD-ROM, and the later ones due to incompatibility with kernels after version 4.6.7.
The Debian 7.11 installer has a bug which stops the bootloader from installing, with the message:

Running "/usr/sbin/elilo" failed with error code "1".
This was reportedly fixed here, but it appears to have snuck back in. It can be worked around in the shell after elilo fails:
# cd /
# mount -o bind /proc /target/proc
# mount -o bind /sys /target/sys
# umount /target/boot/efi
# chroot /target /bin/bash
# cd /root
# wget
# sed -i 's/iso8859-1/utf8/'
# chmod +x
# ./ -b /dev/sda1 # This points to the /boot partition
# efibootmgr -c -L "Debian" -l 'EFI\debian\elilo.efi'
# # Depending on the partition scheme root may need correcting to point to / in /boot/EFI/debian/elilo.conf
# exit
# exit # Exits back to installer
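For reference, the sed step above just renames the charset in place; the target file was elided above, but the substitution itself is simple. On a hypothetical sample line:

```shell
# Demonstrates the substitution the workaround performs (sample line is made up)
printf 'LANG=en_US.iso8859-1\n' | sed 's/iso8859-1/utf8/'
# prints: LANG=en_US.utf8
```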
Packages for 7.11 are still served by Debian; just add the line below to /etc/apt/sources.list:
deb wheezy main
Debian 7.11 -> Debian 11 upgrade
I did manage to upgrade from 7.11 to 11 by installing a 4.6.7 kernel and dpkg 1.18.0 (for .xz support in the newer packages), changing the package source to:
deb sid main
and running
apt-mark hold linux-image-4.6.7 linux-headers-4.6.7 linux-firmware-image-4.6.7 linux-libc-dev 
apt-get upgrade
then dealing with the left-behind packages manually. I did have to install ifupdown and add /usr/sbin and /sbin to the PATH (via /etc/profile), but otherwise it wasn't too bad.
Packages for this are available here.
The Gentoo installer does not currently work on these blades, and I also had no luck with some older install disks I found:
20191126 - fails with "kernel panic, tried to kill init"
20180509 - fails with "can't find /newroot in fstab"

It should be possible to convert the RHEL or Debian installs to Gentoo with a stage3 tarball, but I haven't yet managed it.

Next Steps

I will use this as a development environment for hobby projects, as it provides some unique challenges. Here is the hello world at the start of my journey:
        .global main
        .proc main
main:   .prologue
        alloc           loc1 = ar.pfs,0,4,1,0
        mov             loc2 = gp
        mov             loc3 = rp
        add             loc0 = @ltoff(msg),gp
        ;;
        ld8             out0 = [loc0]
        ;;
        br.call.sptk.many rp = puts
        mov             gp = loc2
        .restore sp
        mov             rp = loc3
        mov             ar.pfs = loc1
        mov             ret0 = 0
        br.ret.sptk     rp
        .endp main
msg:    stringz "Hello world!"
[jon@squa devel]$ as -x hello.s -o hello.o 
[jon@squa devel]$ gcc hello.o
[jon@squa devel]$ ./a.out
Hello world!

Misc notes on the BL8x0

Misc bugs in the iLO are usually fixed by reseating the faceplate and restarting Internet Explorer.
The old Smart Array firmware has quite a few issues; I highly recommend upgrading to 6.64.
The firmware matrix is available here.
(1) RHEL 5.11 screenfetch output

(2) Debian 7.11 screenfetch output

(3) OpenVMS show cpu output

(4) Debian 11 screenfetch output