Improving U.S. Navy Campaign Analyses with Big Data
Author(s) - B. L. Morgan, Harrison Schramm, Jerry R. Smith, Thomas W. Lucas, Mary L. McDonald, Paul J. Sánchez, Susan M. Sanchez, Stephen C. Upton
Publication year - 2017
Publication title - INFORMS Journal on Applied Analytics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.662
H-Index - 64
eISSN - 1526-551X
pISSN - 0092-2102
DOI - 10.1287/inte.2017.0900
Subject(s) - navy, vetting, notional amount, timeline, scrutiny, risk analysis (engineering), heuristics, operations research, suite, big data, computer science, engineering, computer security, business, finance, political science, archaeology, law, history, operating system
Decisions and investments made today determine the assets and capabilities of the U.S. Navy for decades to come. The nation has many options for how best to equip, organize, supply, maintain, train, and employ our naval forces. These decisions involve large sums of money and affect our national security. Navy leadership uses simulation-based campaign analysis to measure risk for these investment options. Campaign simulations, such as the Synthetic Theater Operations Research Model (STORM), are complex models that generate enormous amounts of data. Finding causal threads and consistent trends within campaign analysis is inherently a big data problem. We outline the business and technical approach used to quantify the various investment risks for senior decision makers. Specifically, we present the managerial approach and controls used to generate studies that withstand scrutiny and maintain a strict study timeline. We then describe STORMMiner, a suite of automated postprocessing tools developed to support campaign analysis.
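The abstract describes STORMMiner only at a high level; the tool's data formats and interfaces are not given here. As a purely illustrative sketch of the kind of automated postprocessing the abstract refers to, the following Python aggregates a summary measure across simulation replications and reports a mean with a confidence interval. All file names, column names, and measures are assumptions for the example, not the published tool.

# Illustrative sketch only: STORMMiner's real formats and APIs are not public.
# Assumes each replication exports a CSV event log with columns:
# side, asset_type, losses (hypothetical layout).
import csv
import glob
import math
import statistics
from collections import defaultdict

def load_losses(pattern="storm_rep_*.csv"):
    """Total losses per side for each replication file."""
    totals = defaultdict(lambda: defaultdict(float))  # file -> side -> losses
    for path in glob.glob(pattern):
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                totals[path][row["side"]] += float(row["losses"])
    return totals

def summarize(totals, side="blue"):
    """Mean losses and a 95% normal-approximation CI across replications."""
    xs = [per_side[side] for per_side in totals.values()]
    mean = statistics.mean(xs)
    half = 1.96 * statistics.stdev(xs) / math.sqrt(len(xs)) if len(xs) > 1 else 0.0
    return mean, (mean - half, mean + half)

if __name__ == "__main__":
    totals = load_losses()
    mean, ci = summarize(totals)
    print(f"blue losses: mean={mean:.1f}, 95% CI=({ci[0]:.1f}, {ci[1]:.1f})")

Automating even this simple replication-level rollup is what shortens analysis timelines: analysts query vetted summary measures instead of hand-processing each run's raw event logs.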