27 October 2019

Farm status
Intel GPUs
All off

Nvidia GPUs
One running Einstein gravity wave work

Raspberry Pis
All running Einstein BRP4 work

For news on the Raspberry Pis see Mark's Rpi Cluster


Other news
It's been warm for the last fortnight, so most of the farm has been off. I had all of the Nvidia GPU machines running Einstein gravity wave work on their CPUs last week.


Pimp my storage server
While it was warm I decided to pimp, I mean upgrade, one of the storage servers. It's a 2U rack mount with 12 x 3.5" HDD drive bays. I currently have 4 x 8TB drives, which after allowing for single-drive redundancy gives a theoretical 24TB of usable space, but in practice it's 21TB.
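Most of the gap between the theoretical 24TB and the 21TB seen in practice is decimal versus binary units: drive makers quote terabytes (10^12 bytes) while the OS reports tebibytes (2^40 bytes). A back-of-the-envelope sketch, assuming single-drive redundancy (RAID-Z1):

```shell
#!/bin/sh
# 4 x 8TB drives with single-drive redundancy: one drive's worth goes to parity
drives=4
size_tb=8
usable_tb=$(( (drives - 1) * size_tb ))        # 24 "decimal" TB
# Convert decimal TB (10^12 bytes) to binary TiB (2^40 bytes)
usable_tib=$(awk "BEGIN { printf \"%.1f\", $usable_tb * 1e12 / 2^40 }")
echo "${usable_tb} TB raw usable = ${usable_tib} TiB as reported"
```

That lands at roughly 21.8 TiB before filesystem overhead, which matches the 21TB figure.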

I've got lots of spare HDDs but they are of varying sizes, from 320GB up to 4TB. These days the recommendation for a storage server is to use the largest drives available.

The first upgrade was memory. When running ZFS the recommendation is to increase memory first, as ZFS will use it for caching; this cache is referred to as the ARC (adaptive replacement cache). I got 8 x 16GB sticks (128GB), which filled all the available RAM sockets. The server supports up to 32GB memory sticks, but given it's ECC memory they would have cost quite a bit more.
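As a sketch of how to look at, and optionally cap, the ARC on a Linux ZFS box (the 96GiB cap below is an example value, not from the post):

```shell
# Current ARC size and ceiling (bytes) from the kernel stats
awk '/^size|^c_max/ { print $1, $3 }' /proc/spl/kstat/zfs/arcstats

# By default the ARC can grow to roughly half of RAM. To cap it,
# set the zfs_arc_max module parameter (example: 96GiB = 96 * 2^30)
# and rebuild the initramfs so it applies at boot:
echo "options zfs zfs_arc_max=103079215104" > /etc/modprobe.d/zfs.conf
update-initramfs -u
```

Leaving it uncapped is usually fine on a dedicated storage server, since the ARC gives memory back under pressure.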

The second upgrade was an Intel Optane 900P 280GB SSD as a cache drive. In ZFS terms it's called the L2ARC (level 2 adaptive replacement cache). It's a half-height PCIe x4 card, so it fits in a 2U server. "Serve The Home" currently recommends them as cache drives.
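Attaching a drive as L2ARC is a single command; the pool name and device path below are placeholders, not the actual server's:

```shell
# Add the Optane NVMe SSD as an L2ARC (cache) device to the pool.
# A cache vdev is not part of the redundancy group and can be
# removed again at any time with 'zpool remove'.
zpool add tank cache /dev/disk/by-id/nvme-INTEL_SSDPED1D280GA_XXXX

# Verify it shows up under the "cache" section of the pool layout
zpool status tank
```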

07 October 2019

Farm status
Intel GPUs
All off

Nvidia GPUs
Running Einstein gravity wave work overnight

Raspberry Pis
All except two running Einstein BRP4 work

For news on the Raspberry Pis see Mark's Rpi Cluster


Other news
The weather is warm during the day, so crunching is happening overnight.


Storage servers
I spent the weekend working on the tower case storage server. The case has room for 10 drives, however the top two drive bays are intended for a CD burner. The other 8 bays are designed for 3.5" drives. I currently have 4 x 4TB WD SE drives in there. Originally this machine ran Windows Server 2008 with a RAID controller; I swapped it over to Linux last year.

This week I removed the RAID controller card and plugged the drives into the motherboard's SATA controller to simplify things. I updated the OS to the latest Debian and reinstalled ZFS. I went to restore from the Drobo, which decided one of its drives was missing. A quick power-off and reseating of the 3 drives got it going in a degraded state. I copied the files back from the Drobo while it was repairing itself. That has the tower case storage server up and running.

For the rack mount (disk based) storage server I ordered more memory. It currently has the 32GB that was in the tower case storage server, and I want that back. The rack mount will get 8 x 16GB of ECC memory, bringing it up to 128GB.

I looked at adding a cache drive to the storage servers, but the recommendation for ZFS seems to be to increase the memory to its maximum first. Only add an SSD as a cache drive once you're above 64GB of memory, because the L2ARC (cache drive) itself uses main memory.
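The reason for the "memory first" advice is that every record cached on the L2ARC device needs a small header kept in the ARC, i.e. in main memory. The per-record figure varies by ZFS version (roughly 70 bytes is commonly quoted); a rough estimate for a 280GB cache drive, assuming the default 128KiB recordsize:

```shell
#!/bin/sh
# Rough L2ARC RAM overhead: one in-memory header per cached record.
l2arc_bytes=$(( 280 * 1000 * 1000 * 1000 ))   # 280GB cache SSD
recordsize=$(( 128 * 1024 ))                  # default 128KiB records
hdr_bytes=70                                  # approximate, version-dependent
records=$(( l2arc_bytes / recordsize ))
overhead_mib=$(( records * hdr_bytes / 1024 / 1024 ))
echo "~${overhead_mib} MiB of ARC used just to index the L2ARC"
```

With large records the overhead is modest, but smaller records (databases, VMs) multiply it, which is why a big L2ARC on a low-memory box can hurt more than it helps.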