Cost constraints limit the availability and scale of current detector technology, which relies on small Cadmium Zinc Telluride wafers produced only in limited quantities.
"Today, a typical state-of-the-art device has 2,048 by 2,048 pixels at a cost around $350,000 to $500,000," Figer says. "Detectors on large telescopes can cost a significant fraction of the total instrument budget. Very large, affordable infrared arrays will be essential for making optimum use of the proposed 30-meter class ground-based telescopes of the future."
"The key to making largerup to 14,000 by 14,000 pixelsand less expensive infrared detectors lies in using silicon wafer substrates, since large silicon wafers are common in the high-volume semiconductor industry and their coefficient of thermal expansion is well-matched to that of the silicon readout circuits," Figer says.
For the last 15 years, scientists have pursued silicon substrates in the quest for large infrared detectors. Until now, the crystal lattice mismatch between silicon and infrared-sensitive materials has stymied progress, causing defects that produce higher dark current, and thus higher noise, as well as reduced quantum efficiency and increased image persistence.
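As a rough illustration of why defect-driven dark current matters (the numbers below are assumed for the example, not taken from the release), dark current contributes shot noise that grows with the square root of the accumulated dark charge:

    import math

    # Hedged sketch: dark-current shot noise per pixel, assuming Poisson statistics.
    # Both values are illustrative assumptions, not figures from the release.
    dark_current_e_per_s = 0.05   # electrons per second per pixel (assumed)
    exposure_s = 1000.0           # integration time in seconds (assumed)

    dark_electrons = dark_current_e_per_s * exposure_s
    shot_noise_e = math.sqrt(dark_electrons)   # rms noise in electrons

    print(f"dark charge: {dark_electrons:.0f} e-, shot noise: {shot_noise_e:.1f} e- rms")
    # Lattice defects raise dark_current_e_per_s, so the noise floor rises with
    # the square root of (dark current x exposure time).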
Atoms in a silicon crystal are spaced closer together than those in infrared light-sensitive materials, so defects form when the infrared material is grown on silicon. Photo-generated charge that carries the signal can become trapped and lost at these defects, or be released later and show up as a phantom signal.
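The size of that mismatch can be estimated from commonly cited room-temperature lattice constants (values assumed here for illustration; the release itself quotes no numbers), comparing silicon with CdTe, which is close in spacing to the Cadmium Zinc Telluride mentioned above:

    # Hedged sketch: fractional lattice mismatch between silicon and CdTe,
    # using commonly cited room-temperature lattice constants (assumed values).
    a_si = 5.431     # angstroms, silicon
    a_cdte = 6.482   # angstroms, CdTe (close to CdZnTe)

    mismatch = (a_cdte - a_si) / a_si
    print(f"lattice mismatch: {mismatch:.1%}")   # roughly 19%
    # A misfit this large seeds dislocations where charge can be trapped (lost signal)
    # or released later (phantom signal and image persistence).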
Raytheon has developed the prototype detector technology using a method of depositing light-sensitive material onto silicon substrates while maintaining high vacuum throughout the many steps in the process. The material growth is done using
Contact: Susan Gawlowicz
Rochester Institute of Technology