Hectospec & Hectochelle

A Brief Description:
Hectospec (available except December 20 through January 20 due to cold temperatures; the ambient air temperature must be above 20°F): a moderate-resolution spectrograph with R = 1000 to 2500 that uses all 300 available fibers. Hectospec's wavelength range covers 3650 - 9200Å. The instrument contact is Dan Fabricant. (Hectospec is not considered a PI instrument for SAO and UAO observers.)

Hectochelle (available except December 20 through January 20 due to cold temperatures; the ambient air temperature must be above 20°F): an echelle spectrograph operating at R = 32,000 - 40,000 over a wavelength range of 3800 - 9000Å. Because of the optical design, the images from only 240 of the fibers fall on the Hectochelle detectors. It is a single-order instrument, and 11 order-separating filters are available. The instrument contact is Andrew Szentgyorgyi. (Hectochelle is not considered a PI instrument for SAO and UAO observers.)

The Fundamental Capabilities:

Instrument  | Grating (l/mm) | Spectral Range (Å) | Blaze Wavelength (Å) | Dispersion (Å/pix) | RMS Image Diameter (pix)
Hectospec   | 270            | 3650 - 9200        | 5200                 | 1.21               | 5
Hectospec   | 600            | 5300 - 7800        | 6000                 | 0.55               | 5
Hectochelle | 110            | 3800 - 9000        | /                    | 0.04               | 5

Further Information:

Hectospec and Hectochelle are two large bench-mounted spectrographs fed by 300 optical fibers, each 25 m long, from the telescope's Cassegrain focus. A robotic positioner common to both spectrographs, mounted behind the f/5 beam, uses a pair of six-axis robots, dubbed Fred and Ginger, to reconfigure all 300 fibers over the 1 deg focal surface to an accuracy of ~25 um in just 300 seconds. Each fiber, held magnetically on the focal surface, has a core diameter of 250 um and subtends 1.5" on the sky. Adjacent fibers can be spaced as closely as 20", and the fibers are assigned to targets using a complex positioning algorithm.

Hectospec is a moderate-resolution, multiobject optical spectrograph that offers 5770Å of spectral coverage at ~6Å resolution in the 350 to 1000 nm band. A higher-dispersion grating offering ~3Å resolution is also available.

Hectochelle adds a high-dispersion capability to Hectospec's moderate dispersion with a second, very large bench-mounted spectrograph using an echelle grating. Hectochelle uses 240 of the possible 300 fibers and attains R~32,000 (sigma~4 km/s) over a single, filter-selected order.

For more detail about the instruments, how to prepare for your observations, your responsibilities, tips on data collection and reduction, and links to all supporting documentation, please see the Hectospec webpages and the Hectochelle webpages. IDL software for the reduction of Hectospec data can be found here (please contact Joannah Hinz with any questions).

HSRED Reduction Pipeline

What's New:

We are pleased to release version 2.0 of HSRED, incorporating a number of significant improvements provided by the Telescope Data Center at SAO. Key improvements include:

  • Fine-tuned wavelength calibration, especially for 600-line grating data.
  • Improved, faster cosmic-ray rejection using statistics derived from all exposures of a target.
  • Improved sky subtraction, by allowing the model sky to vary smoothly with fiber number across the chip.
  • A new routine to model and subtract the red light leak that affects Hectospec spectra longward of 8500Å.
  • Correction for the A- and B-band telluric (atmospheric) absorption features.
  • For fields observed without flux calibration stars, we now default to a simple weighted coaddition of exposures, which yields more stable results, especially under marginal observing conditions.
Installation Instructions:

    HSRED v2.0 can be downloaded via github. Just execute the following command:

    git clone https://github.com/richardjcool/HSRed.git

    As before, HSRED requires copies of both the idlutils and idlspec2d packages. HSRED v2.0 has been tested and verified to work with idlutils v5_5_15 and idlspec2d v5_7_1, but will also work with the (now ancient) versions of these packages needed for the old version of HSRED, so there is no need to update if you already have a working version.

    Once you have downloaded all the pieces of code you need, continue following the instructions for setting up your environment variables here. Any installation or usage questions can be directed to Joannah Hinz.
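
    As a quick sanity check that all three packages are visible to your IDL session, you can print the relevant environment variables. The variable names below follow the usual idlutils/idlspec2d/HSRED convention and are an assumption if your setup scripts name them differently:

    IDL> print, getenv('IDLUTILS_DIR'), getenv('IDLSPEC2D_DIR'), getenv('HSRED_DIR')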

    The Quick Cookbook:

    A new wrapper script is included with HSRED v2.0 that should streamline the reduction process. Step one is to place all the files you wish to reduce from a single night together in one directory: biases, dome and twilight flats, comparison lamp exposures, and any science exposures. You must include both the .fits files and their associated _map files. Sets of exposures using different gratings (270/600) or central wavelengths must be sorted and separated, with one working directory per configuration (per night).

    From your working directory, start idl and run the command:

    IDL> hs_pipeline_wrap, /dostand, [rerun='0100']

    for a standard reduction with cosmic-ray rejection, summed combination of exposures, red-leak removal, etc. The optional rerun keyword can be passed to make more than one reduction of the same dataset. If not specified, rerun will default to 0100 and your reduced data products will be placed in the subdirectory 'reduction/0100'.

    If you have included F-stars in your target configuration, and wish to perform flux calibration as part of the coaddition, run:

    IDL> hs_pipeline_wrap, /uberextract, [rerun='0100']

    Note that, for flux calibration, you must still add your stars and their photometry to the $HSRED_DIR/etc/standardstars.dat file, as described here. However, it should no longer be necessary to generate and use "plugcat" files. All info needed should now be read from the included _map files.


    If your data were taken with the 600-line grating, you must also specify the /do600 keyword, as in:

    IDL> hs_pipeline_wrap, /dostand, /do600, [rerun='0100']

    For data obtained with an offset sky exposure (for sky subtraction in crowded fields), the procedure is a bit more complicated and requires editing the lists/cal.list file. First, to reduce your science frames without the normal fiber-based sky subtraction, run

    IDL> hs_pipeline_wrap, /dostand, /doskyobject, [rerun='0100']

    Next, the sky offset frame must be reduced separately. Edit cal.list, commenting out the line for your science frames and adding a new line giving the sky offset frames you wish to reduce (see the old page here for more on the format of the cal.list file, and remember that it is no longer necessary to specify a plugcat). Now, run the pipeline wrapper again. NOTE: it is very important to use a different "rerun" number for this reduction, to avoid confusion with the file names of the reduced data products.

    IDL> hs_pipeline_wrap, /dostand, /doskyobject, rerun='0101'

    The final step is to manually subtract the reduced sky offset frame from the reduced science frame, using the reduced data format and method of your choice.
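
    A minimal IDL sketch of this last step is shown below. It assumes that both reductions share the same wavelength grid and fiber ordering, and the spHect file names are hypothetical placeholders for the files produced by your two reruns:

    IDL> sci = mrdfits('reduction/0100/spHect-example.fits', 1)   ; HDU1: reduced science spectra
    IDL> sky = mrdfits('reduction/0101/spHect-example.fits', 1)   ; HDU1: reduced sky offset spectra
    IDL> sub = sci - sky                                          ; offset-sky-subtracted spectra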

    Data Products:

    Compared to earlier versions of HSRED, an expanded set of data products is generated, largely thanks to an enhanced version of hs_toiraf.pro, which is called automatically within hs_pipeline_wrap. If you only want the spHect* files, simply edit hs_pipeline_wrap.pro to comment out the relevant lines. There have also been minor changes to what is stored in the spHect* files, so in either case please consult the description below:

    skysub_/###.*.ms.fits: Subdirectory containing individual IRAF-format spectra, linearized in wavelength, one file per fiber. The format is a 4543x1x4 cube. The 4 spectra in this cube are (in order):

  • Sky-subtracted, variance-weighted, coadded spectrum
  • Sky-subtracted, summed spectrum
  • Sky spectrum that was subtracted
  • sigma (sqrt(variance)) spectrum

    If the pipeline was run with the /uberextract option, the first two spectra will be identical and will contain the flux-calibrated, averaged spectrum (rather than one in coadded counts). This is true for all the data products described here.

    .ms.fits: All 300 fibers in one file, with linearized wavelengths. This is a 4543x300x4 data cube, and the 4 planes along the 3rd dimension are the same as those described above for the single-fiber files.
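
    For example, to pull the spectra of a single fiber out of the multi-fiber cube (the file name below is a hypothetical placeholder, and the example assumes the mrdfits routine from your idlutils installation is on the IDL path):

    IDL> cube  = mrdfits('example.ms.fits', 0)   ; 4543 x 300 x 4 data cube
    IDL> spec  = cube[*, 41, 0]                  ; fiber 42: sky-subtracted, variance-weighted coadd
    IDL> sigma = cube[*, 41, 3]                  ; fiber 42: sigma spectrum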

    spHect*.fits: a multi-extension FITS file containing all coadded, sky-subtracted spectra for each Hectospec configuration. Wavelengths are tabulated per pixel, rather than being described as a function in the FITS header. The new, slightly altered data format is as follows (see the IDL example after this list for reading these extensions):

  • HDU0: wavelengths (Angstroms)
  • HDU1: sky-subtracted, variance-weighted coadded spectra (total counts) OR flux-calibrated averaged spectra
  • HDU2: inverse variance (counts)
  • HDU3: AND bad pixel mask
  • HDU4: OR bad pixel mask
  • HDU5: Plugmap structure (fiber info)
  • HDU6: Combined sky spectra, absent if flux calibration was set
  • HDU7: Summed (unweighted) spectra, absent if flux calibration was set
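
    A minimal IDL sketch of reading these extensions (the file name is a hypothetical placeholder, and the last line assumes a [pixel, fiber] array layout):

    IDL> lam  = mrdfits('spHect-example.fits', 0)   ; HDU0: wavelengths (Angstroms)
    IDL> flux = mrdfits('spHect-example.fits', 1)   ; HDU1: coadded (or flux-calibrated) spectra
    IDL> ivar = mrdfits('spHect-example.fits', 2)   ; HDU2: inverse variance
    IDL> plug = mrdfits('spHect-example.fits', 5)   ; HDU5: plugmap structure (fiber info)
    IDL> plot, lam[*,0], flux[*,0]                  ; quick look at the first fiber
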
    spObs*.fits: a multi-extension FITS file containing sky-subtracted spectra for each individual science exposure (non-coadded). The format is similar to that of the coadded spHect files, but spObs files include, in HDU4, the sky spectrum subtracted from each fiber on that exposure (see the short example after this list):

  • HDU0: wavelengths (Angstroms)
  • HDU1: sky-subtracted, coadded spectra (total counts) (OR flux-calibrated)
  • HDU2: inverse variance (counts)
  • HDU3: bad pixel mask
  • HDU4: sky spectra (single-exposure)
  • HDU5: Plugmap structure (fiber info)
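
    For example, the per-exposure sky that was subtracted from each fiber can be inspected with a couple of lines of IDL (the file name is a hypothetical placeholder, and the plot assumes a [pixel, fiber] array layout):

    IDL> lam = mrdfits('spObs-example.fits', 0)   ; HDU0: wavelengths (Angstroms)
    IDL> sky = mrdfits('spObs-example.fits', 4)   ; HDU4: single-exposure sky spectra
    IDL> plot, lam[*,0], sky[*,0]                 ; sky subtracted from the first fiber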