Meiosis

The service contracts that support Athena are ending, and there are currently no plans to replace or upgrade the hardware. This moves the facility into a new service level, and so Athena's configuration must change.

[edit] Current schedule

  • July 5 - Take debug nodes offline
  • July 19 - Provide I/O nodes to move data off of drives
  • July 19 - Athena (as we know it) will be offline
  • July 26 - Temporary I/O nodes will be taken offline
  • July 27-29 - Athena will be powered down for cooling system work
  • Aug 30 - Athena's "meiosis" will be completed [note revised date]

[edit] Details of Meiosis

The hardware that makes up Athena will not change; however, the software configuration will change radically. Athena will have a Condor portion of the cluster and a ROCKS+MOAB portion. (More details to come...)

  • Drives will be reformatted
  • Home Directories will be separated
  • Scratch/Data disks will be going away

[edit] Home directory space

Home directory space will be split evenly between Physics/INT/CENPA and Astronomy.

[edit] Scratch and data disks

EMC is asking for $40,000 to support the RAID arrays for one more year. This would be throwing money away; we'd prefer to invest even half of that amount in something newer and more sustainable. We expect to have a fraction (10-20%) of the current storage available when the meiosis is complete, intended solely to provide scratch space for post-Athena work. Consider which data is valuable and store it appropriately. If you're interested in purchasing additional network disk for your research needs, please feel free to email help@phys.washington.edu.

[edit] NEW: File Transfer Nodes

Do not move more than 10 GB of data to your existing Physics/Astronomy home directories. There's not enough disk space there for "a run on the bank."

There are 3 file transfer nodes available. These nodes are not routed outside of the UW, so you'll need to initiate file transfers from these nodes (much like the Athena head node).

  • data-io-2.phys.washington.edu
  • data-io-3.astro.washington.edu
  • data-io-4.astro.washington.edu

The first is a Physics node; the other two are Astro nodes. You can log in with your regular Physics or Astro account from anywhere on campus, and the node will mount your Physics or Astro home directory.
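A typical session might look like the following; `yourname` and the remote host/path are placeholders, not real accounts:

```shell
# Log in to the Physics transfer node from any campus machine
ssh yourname@data-io-2.phys.washington.edu

# The node mounts your home directory, so transfers can be initiated
# from there -- e.g. pulling a file in from an external site:
scp yourname@remote.example.edu:/path/to/file ~/
```

Remember that the transfer nodes are not routed outside the UW, so the connection must always be initiated from the node itself.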

Here are the available file systems:

  • /share/home
  • /share/pogo1
  • /share/scratch2
  • /share/scratch1
  • /share/data1
  • /share/sdata2
  • /share/sdata1


Each node now has two optimized file transfer clients installed: hpn-ssh and bbcp. Both are located in /usr/local/bin, which should be in your path.

[edit] Moving data from Athena to Astro/Physics storage

If you're concerned about migrating your data, email help@phys.washington.edu with the following information:

  1. How much data you're going to move (approximate)
  2. Whether it's lots of tiny files (<10MB and more than 100,000 files ) or many large files (>10GB/file and more than 100 files)
  3. Which file system you're planning to move your data to

If your intended destination is unavailable, we'll advise you on alternative places to store your data.
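To gather the approximate numbers requested above, `du` and `find` are usually enough; the path here is a placeholder for your own directory:

```shell
# Approximate total size of the data you plan to move
du -sh /share/data1/yourdir

# Count the files to see whether you're in the "lots of tiny files"
# regime (>100,000 files) or the "many large files" regime
find /share/data1/yourdir -type f | wc -l
```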

[edit] Moving data from Athena to other sites

Many folks in Physics and Astronomy have external accounts with access to terabytes of data. These file transfer nodes should move data much faster than the Athena head node, using the two packages described below:

[edit] hpn-ssh

hpn-ssh is a tweaked version of OpenSSH that is optimized for large file transfers. If both source and target are using it, transfer rates can improve by several factors of two. Simply substitute hpnssh for ssh and hpnscp for scp on your command line.
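Since hpnssh and hpnscp are drop-in replacements, usage looks exactly like the stock tools; the hostname and paths below are placeholders:

```shell
# Same syntax as scp, just a faster client
hpnscp bigfile.tar yourname@remote.example.edu:/archive/

# Interactive logins work the same way as ssh
hpnssh yourname@remote.example.edu
```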

[edit] bbcp

bbcp is a multi-stream file transfer client developed at SLAC. The bbcp program must be in your path on both source and destination in order to work. Also, bbcp uses non-standard ports, so if one of the systems has a firewall in place, it cannot be the bbcp destination.

The best options Jeff has found in his limited testing between locations are as follows:

bbcp -p -P 2 -f -w 256k -s 4
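With a source and destination filled in, a complete invocation using those flags might look like this; the hostname and paths are placeholders:

```shell
# -p preserve file attributes, -P 2 print progress every 2 seconds,
# -f force overwrite, -w 256k TCP window size, -s 4 parallel streams
bbcp -p -P 2 -f -w 256k -s 4 bigfile.tar yourname@remote.example.edu:/archive/bigfile.tar
```

Remember that bbcp must be installed on both ends, and the firewalled side (if any) has to be the source.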