1.2. Install and Configure Your Frontend

This section describes how to install your Rocks cluster frontend with the Rocks Base CD coupled with the HPC Roll CD.


The minimum requirement to bring up a frontend is to have both the Rocks Base CD and the HPC Roll CD. That is, using only the Rocks Base CD will result in a non-functional frontend.

  1. Insert the Rocks Base CD into your frontend machine and reset the frontend machine.

  2. After the frontend boots off the CD, you will see the boot screen:

    When you see the screen above, type:

        frontend

    The "boot:" prompt arrives and departs the screen quickly, so it is easy to miss. If you do miss it, the node will assume it is a compute appliance, the frontend installation will fail, and you will have to restart the installation by rebooting the node.


    If the installation fails, very often you will see a screen that complains of a missing /tmp/ks.cfg kickstart file. To get more information about the failure, access the kickstart and system log by pressing Alt-F3 and Alt-F4 respectively.

  3. After you type frontend, the installer will start running. Soon, you'll see a screen that looks like:

    To add the HPC Roll, select 'Yes' by pressing the space bar.

  4. After the CD/DVD drive ejects the Rocks Base media, put the HPC Roll CD into the drive.

    To install the HPC Roll, select 'Ok' by pressing the space bar.

  5. When the HPC Roll is discovered, you'll see a window with the message:

    Found Roll 'hpc' 
  6. Then you are asked if you have another Roll to add.

    If you are just adding the HPC Roll, answer 'No' by hitting the 'Tab' key (which highlights the 'No' button), then hit the space bar.

    If you are adding another Roll, hit the space bar and the CD will eject. Remove the HPC Roll, put in the next Roll, and answer 'Yes'. (This is the same procedure as described in Step 4 for the HPC Roll.)

  7. After you've added the rolls, you'll be asked to put the Rocks Base CD back into the drive.

    Put the Rocks Base CD back into the drive, then answer 'Ok' by hitting the space bar.


    If you are building an IA64-based frontend, you may see an error message on the screen regarding volname, and the installation will repeatedly ask for the Rocks Base DVD. This occurs because the DVD drive in the frontend is a low-cost drive that requires a driver to be preloaded in order to support the volname command.

    A workaround is to reboot the frontend with the Rocks Base DVD, but this time at the ELILO boot prompt, type:

    frontend driverload=ide-scsi hdb=ide-scsi

    If your DVD is recognized by Linux as hda, you'll have to change the above line to:

    frontend driverload=ide-scsi hda=ide-scsi
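    The drive letter depends on which IDE channel the DVD drive sits on. One way to check is to look for the ATAPI drive in the kernel's boot messages. The sketch below runs against a canned dmesg line so it is self-contained; on a real frontend you would inspect the output of dmesg itself (the sample drive model is an assumption for illustration):

```shell
# Pick the ELILO boot line based on where the kernel sees the ATAPI
# CD/DVD drive. The sample line is illustrative; on the real frontend,
# use the matching line from:  dmesg | grep -i 'ATAPI CD'
dmesg_line='hdb: TEAC DV-28E-C, ATAPI CD/DVD-ROM drive'

case "$dmesg_line" in
  hda:*) bootline="frontend driverload=ide-scsi hda=ide-scsi" ;;
  hdb:*) bootline="frontend driverload=ide-scsi hdb=ide-scsi" ;;
  *)     bootline="" ;;   # drive not on hda/hdb
esac
echo "$bootline"
```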

  8. Then you'll see the Cluster Information screen:

    This information is used by Ganglia to uniquely identify this cluster.

    This information is optional.

    Fill out the form (or take the defaults) then hit the 'Ok' button.

  9. The disk partitioning screen allows you to select automatic or manual partitioning.

    To select automatic partitioning, hit the Autopartition button. This will partition the frontend like:

    Table 1-1. Frontend -- Default Root Disk Partition

    Partition Name    Size
    --------------    ----------------------
    /                 6 GB
    swap              1 GB
    /export           remainder of root disk

    To manually partition your frontend machine, select Disk Druid.


    If you select manual partitioning, you must specify at least 6 GB for the root partition and you must create a separate /export partition.

    Automatic partitioning is the default (and recommended).
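    For reference, the default layout in Table 1-1 corresponds roughly to the following Red Hat kickstart part directives (sizes in MB). This is an illustrative sketch only; Rocks generates its own kickstart file (the /tmp/ks.cfg mentioned earlier), and the exact directives it emits may differ:

```
part /       --size 6000
part swap    --size 1000
part /export --size 1 --grow
```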

  10. The private cluster network configuration screen allows you to set up the networking parameters for the ethernet network that connects the frontend to the compute nodes.


    It is recommended that you accept the defaults (by hitting the Tab key until the Ok button is highlighted, then hitting the Enter key).

    But for those who have unique circumstances that require different values for the internal ethernet connection, we have exposed the network configuration parameters.
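    Whatever values you enter here, the frontend's private address and netmask determine the subnet that every compute node must share. The sketch below shows that calculation; the 10.1.1.1/255.0.0.0 values are assumptions used for illustration, not values mandated by this guide:

```shell
# Derive the network address from an IP and netmask, then confirm a
# compute node address falls in the same private subnet.
# 10.1.1.1 / 255.0.0.0 are example values (assumptions), not fixed defaults.
net_of() {
  # net_of IP NETMASK -> prints the network address
  oldIFS=$IFS; IFS=.
  set -- $1 $2            # eight octets: i1 i2 i3 i4 m1 m2 m3 m4
  IFS=$oldIFS
  echo "$(($1 & $5)).$(($2 & $6)).$(($3 & $7)).$(($4 & $8))"
}

frontend_net=$(net_of 10.1.1.1 255.0.0.0)
compute_net=$(net_of 10.255.255.254 255.0.0.0)
[ "$frontend_net" = "$compute_net" ] && echo "same subnet: $frontend_net"
```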

  11. The public cluster network configuration screen allows you to set up the networking parameters for the ethernet network that connects the frontend to the outside network (e.g., the internet).

    The above window is an example of how we configured the external network on one of our frontend machines.

  12. Configure the Gateway and DNS entries:

  13. Configure the name for your frontend:

    The name can be a fully qualified domain name or just the hostname portion of a fully qualified domain name (e.g., like the default frontend-0). If you plan on adding the Grid Roll (or other Globus PKI services) the hostname must be the primary FQDN for your host.


    Choose your hostname carefully. The hostname is written to dozens of files on both the frontend and compute nodes; if the hostname is changed, cluster services will no longer be able to find the frontend machine. Some of these services include: SGE, Globus, NFS, AutoFS, and Apache.


    Do not choose to name your frontend "automatically via DHCP". This will prevent the frontend's DNS server from being included in the name resolution path and will cause major problems in your cluster.

    If you absolutely need to use DHCP, be sure to add the line 'nameserver' to /etc/resolv.conf.
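    If you must use DHCP, the note above means the frontend's DNS server has to be added to /etc/resolv.conf by hand. A minimal sketch of an idempotent edit follows; it writes to a scratch copy so it is safe to run anywhere, and FRONTEND_IP is a placeholder you must replace with your frontend's actual address:

```shell
# Append a nameserver line for the frontend's DNS server if it is not
# already present. FRONTEND_IP is a placeholder, not a real default.
FRONTEND_IP="10.1.1.1"      # placeholder: substitute your frontend's IP
resolv_conf=$(mktemp)       # scratch copy; the real file is /etc/resolv.conf
echo "search local" > "$resolv_conf"

if ! grep -q "^nameserver $FRONTEND_IP" "$resolv_conf"; then
  echo "nameserver $FRONTEND_IP" >> "$resolv_conf"
fi

cat "$resolv_conf"
```

Because of the grep guard, running the snippet against the same file twice adds only one nameserver line.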

  14. Input the root password:

  15. The frontend will format its file systems:

  16. Then install the base packages:

    And the installer will copy the contents of the Rocks Base to the hard disk on the frontend.

  17. Then the installer will ask for each of the roll CDs you added at the beginning of the frontend installation.

    Put the appropriate roll CD in the drive when prompted and hit 'Ok'.

  18. After the last roll CD is installed, the machine will reboot.


    We have observed an intermittent problem when the frontend reboots. The frontend will hang and display the error message: GRUB Loading Stage2.... To resolve this error, see Grub Stage2 Error.