UAS MISHAPS AND ACCIDENTS

ASCI-638 Module 7 examined UAS accidents and incidents and compared them with those that have occurred in piloted aviation.  Wild, Murray, and Baxter (2016) considered technological issues, not human factors deficiencies, the primary causal factor in UAS accidents and incidents.  Research conducted by Joslin (2015) revealed that 84% of accidents and incidents involving UAS were attributed to equipment failure, namely C2 lost link and degradation of the infrastructure supporting UAS.  Glussich and Histon's (2010) study of human interaction with automation likewise found that inadequacies in mechanical systems were the leading cause of UAS accidents.  Their investigation also showed that accident rates varied with the level of technology, and that the technological reliability was "consistent with early piloted aviation" (Glussich & Histon, 2010).

Every aviation operation comes with hazards and risks.  Hazards associated with aviation range from pilot attitude to the operational environment in which the flight is conducted.  For example, a pilot with an anti-authority attitude may cut corners by entering the traffic pattern with a non-standard entry, misusing checklists, or operating closer to cloud layers than legally permitted when weather conditions are poor (Rossier, 1999).  The risks involved with aviation cannot be eliminated.  One risk of flight is operating in marginal VFR (MVFR), a weather condition defined as a ceiling between 1,000 and 3,000 feet and/or visibility between 3 and 5 statute miles.  When operating in MVFR, the risk of inadvertently encountering IMC is present, and for a pilot who is not instrument rated, this situation could result in spatial disorientation (Namowitz, 2015).
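The MVFR thresholds above can be expressed as a small classifier.  The sketch below uses the standard U.S. flight-category bands (LIFR, IFR, MVFR, VFR) and reports the more restrictive of the ceiling-based and visibility-based categories; the function name and structure are illustrative, not from any cited source.

```python
def flight_category(ceiling_ft, visibility_sm):
    """Return the flight category implied by ceiling (ft AGL) and
    visibility (statute miles), using the standard U.S. thresholds.
    The overall category is the more restrictive of the two values."""
    def from_ceiling(c):
        if c < 500:
            return "LIFR"
        if c < 1000:
            return "IFR"
        if c <= 3000:
            return "MVFR"   # 1,000-3,000 ft band discussed in the text
        return "VFR"

    def from_visibility(v):
        if v < 1:
            return "LIFR"
        if v < 3:
            return "IFR"
        if v <= 5:
            return "MVFR"   # 3-5 sm band discussed in the text
        return "VFR"

    order = ["LIFR", "IFR", "MVFR", "VFR"]  # most to least restrictive
    return min(from_ceiling(ceiling_ft),
               from_visibility(visibility_sm),
               key=order.index)
```

Note that either value alone can drive the category: a 2,500-foot ceiling with 10 miles of visibility is still MVFR, which is why the definition reads "and/or."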

UAS operators are pushing for flight beyond visual line of sight (BVLOS).  To help mitigate the risks associated with BVLOS, risk management tools such as the basic risk assessment matrix are helpful.  The risk assessment matrix can aid in evaluating commonly known hazards affiliated with BVLOS operations in terms of probability and severity (Elliot & Stewart, 2011).  This tool offers a glimpse of the operation during the flight planning stage and is useful as a guide in making a go/no-go decision.  The risk assessment matrix also provides real-time data to safety and management personnel, which allows constant monitoring of the operation from a safety standpoint (Elliot & Stewart, 2011).
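The probability-and-severity evaluation described above can be sketched as a simple matrix lookup.  This is a minimal illustration modeled on the common 5x4 military-style matrix; the category names, scoring thresholds, and example BVLOS hazards are assumptions for demonstration, not values taken from the cited sources.

```python
# Qualitative scales, ordered from most to least serious (illustrative).
PROBABILITY = ["frequent", "likely", "occasional", "seldom", "unlikely"]
SEVERITY = ["catastrophic", "critical", "moderate", "negligible"]

def risk_level(probability, severity):
    """Combine probability and severity into a qualitative risk level
    via a numeric score (higher index = less serious = lower score)."""
    score = ((len(PROBABILITY) - PROBABILITY.index(probability)) *
             (len(SEVERITY) - SEVERITY.index(severity)))
    if score >= 15:
        return "extremely high"
    if score >= 9:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

def go_no_go(hazards):
    """Rate each hazard and return a go/no-go decision: 'no-go' if any
    hazard's residual risk exceeds the (assumed) acceptable level."""
    levels = {name: risk_level(p, s) for name, (p, s) in hazards.items()}
    decision = ("no-go" if any(l in ("high", "extremely high")
                               for l in levels.values()) else "go")
    return decision, levels

# Hypothetical BVLOS hazards assessed during flight planning:
bvlos_hazards = {
    "C2 lost link": ("occasional", "critical"),
    "GPS degradation": ("seldom", "moderate"),
}
```

In this sketch, `go_no_go(bvlos_hazards)` flags the C2 lost-link hazard as high risk and returns a no-go, mirroring how the matrix guides the planning-stage decision; in practice the scales, thresholds, and acceptance criteria would come from the operator's safety management system.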

Human factors issues such as decision- and skill-based responses to environmental conditions have also contributed to UAS mishap and accident rates.  Further investigation found that faulty risk assessment and the delay of time-sensitive tasks were among the decision-based errors (Glussich & Histon, 2010).

References

Elliot, L. J., & Stewart, B. (2011). Automation and autonomy in unmanned systems. In D. M. Marshall, R. K. Barnhart, S. B. Hottman, & M. T. Most (Eds.), Introduction to unmanned aircraft systems (pp. 100-117). New York, NY: CRC Press.

Glussich, D., & Histon, J. (2010, October). Human/automation interaction accidents: Implications for UAS operation. Paper presented at the 29th Digital Avionics Systems Conference, Salt Lake City, UT. Retrieved from https://ieeexplore-ieee-org.ezproxy.libproxy.db.erau.edu/document/5655352

Joslin, R. E. (2015, January). Insight into UAS accidents and incidents. Paper presented at the Aviation / Aeronautics / Aerospace International Research Conference, Phoenix, AZ. Retrieved from https://commons.erau.edu/aircon/2015/Friday/14/

Namowitz, D. (2015). Training tips: The M in MVFR. Retrieved from https://www.aopa.org/news-and-media/all-news/2015/june/08/training-tip

Rossier, R. N. (1999). Hazardous attitudes: Which one do you have? Retrieved from https://www.aopa.org/news-and-media/all-news/1999/september/flight-training-magazine/hazardous-attitudes

Wild, G., Murray, J., & Baxter, G. (2016). Exploring civil drone accidents and incidents to help prevent potential air disasters. Aerospace, 3(3), 1-11. doi:10.3390/aerospace3030022
