Engineering decisions with the greatest effect on worker and public safety occur early in the design process. In making these decisions, engineers rely on experience and intuition to estimate the severity and likelihood of undesired future events such as failures, equipment damage, injuries, or environmental harm. These initial estimates then form the basis for investing limited project resources in mitigating those risks. Behavioral economics suggests that most people make significant and predictable errors when considering high-consequence, low-probability events, yet these biases have not previously been studied quantitatively in the context of engineering decisions. This paper describes results from a set of computer-based experiments in which undergraduate engineering students estimated, prioritized, and made design decisions related to risk. The subjects in these experiments overestimated the probability of failure, deviated significantly from anticipated risk management preferences, and displayed worsening biases as system complexity increased. These preliminary results suggest that considerably more effort is needed to characterize these biases in risk estimation and to determine what kinds of interventions might best ameliorate them, enabling engineers to more effectively identify and manage the risks of technology.