Abstract
Modern technological warfare requires a level of cognitive ability and discipline unique in the history of armed conflict. Recent advances in physiology, nutrition, neuroscience, and engineering offer a significant potential to prevent or reduce the degradation of a warfighter’s mental or physical capabilities in this demanding environment. The authors explore four categories for potential enhancement of military personnel: genetic or computational-mechanical alteration of the human body; physiological monitoring and tighter coupling between man and machine; pharmaceuticals; and nutrition and supplementation. None of these types of enhancements is without controversy; in particular, genetic intervention would require morally intolerable experimentation. In the foreseeable future, the military enhancement technologies most likely to see use will be akin to those seen in elite athletics. Physiological monitoring and feedback, changes in nutrition, and careful pharmaceutical interventions all could improve warfighter performance, and, the authors assert, such enhancements are not morally problematic if their effects are candidly assessed and revealed. In choosing whether and how to enhance military personnel, the government must balance long-term health hazards with a reduced risk of near-term injury or death. If physiological monitoring and feedback (and regulation, through drugs or other means) can decrease large, immediate, or long-term risks to the life or well-being of service personnel, the authors write, there appears to be a moral obligation to provide those enhancements to warfighters.
Because it has long supported and leveraged advances in the physical sciences, computer science, and engineering, the US military is without peer on the technological front. Many of the military’s technical systems have been developed with the aim of improving and extending human performance. Night-vision devices are but one well-known example. Until recently, however, the military has shown limited interest in exploiting biomedical advances that might enhance intrinsic human performance and resilience. This reluctance is starting to wane, but only slowly.
All advanced militaries now operate in a world of increasing technological parity and fragility. It is well appreciated that human performance is an important component of the overall advantage of one military force with respect to another. In military affairs, the human element can have non-linear effects, both positive and negative, on mission outcomes (Defense Science Board, 2013). New advances in physiology, nutrition, neuroscience, and engineering now offer a significant potential to prevent (or reduce) the degradation of a warfighter’s cognitive and physical capabilities during conflict and substantially increase the performance of both combat personnel and the larger systems of which they are part.
Researchers and policy makers are exploring, or at least considering, a new generation of technology that may further extend and amplify the intrinsic physical and cognitive abilities of combatants. The Air Force Chief Scientist’s relatively recent document “Technology Horizons: A Vision for Air Force Science and Technology 2010–30” identifies critical technologies for the Air Force (Office of the US Air Force Chief Scientist, 2010). This remarkably forward-looking blueprint highlights human augmentation and indicates that it may come in the form of increased use of autonomous systems, interfaces for more intuitive decision making, close coupling of humans and automated systems, exploiting genetic correlates to performance, and direct augmentation of humans via drugs or implants to improve memory, alertness, cognition, and visual and aural acuity. The document also notes that while some of these approaches may seem inherently distasteful, potential adversaries are not likely to be constrained by similar cultural values.
Technological approaches to improved human performance such as night-vision goggles are typically not controversial, but biological technologies can stir the public imagination. Action films and television—from The Six Million Dollar Man of 1970s TV to the Jason Bourne movie series—have sometimes featured fighters who have been prosthetically or biochemically enhanced by some government agency, licit or illicit. But how and whether actually to enhance the physical, cognitive, and emotional capacities of US military personnel is a serious policy question.
Social attitudes about technological adaptations of the human condition are complex and not obviously consistent. Almost all people use optional enhancements of their abilities, or optional reductions in their disabilities. Reading glasses, hearing aids, and aspirin tablets remedy common disabilities, from myopia to headaches. Computers are, at base, cognitive prostheses. Diet and exercise regimes can enhance our abilities or remediate disabilities. Prosthetic devices enable those who have lost limbs to grasp, walk, and run. Society finds no moral problem with any of these enhancements. Computerized implants in animal brains have been studied for several decades and have been used in humans—without serious ethical qualms—for cases of blindness and loss of motor control. There is a degree of cyborgism that is morally tolerated because it helps to compensate for a disability.
Ordinarily, such adaptations do not prompt ethical worries, but there are exceptions. Groups in the deaf community, for example, have opposed cochlear implants for the congenitally deaf. Sports organizations have resisted a variety of assistance devices: carts for handicapped professional golfers; improved swimsuits for competitive swimmers; prosthetic legs for competitive runners. The reasons for opposition include a desire for community solidarity, a sense that the rules of a game intrinsically forbid certain aids, and an antipathy to giving some performers an unfair advantage. Society seems to be especially sensitive to certain advantages in sports—witness the universal concern over drugs and blood oxygen packing—but is indifferent to others. Although they greatly affect performance, differences in genetics, training locales, and financial resources for training seem to mean little or nothing to sporting audiences. No modern society objects to external, cognitive prostheses, whether the computer, the smartphone, or Google Glass (OK, maybe Google Glass, but chiefly because the appliance is creepy), even though these devices have probably created differential advantages for those who have the resources and talents to make the fullest use of them.
None of the aforementioned reservations about or objections to performance enhancement neatly applies to the military. Other things equal, Americans (and, of course, citizens of other countries) want their military personnel to have an advantage over their adversaries. Not all is fair in war, but a lot of unfairness is wanted. So the real issue is whether other things really are, and will be, equal. Do current military enhancements, and those in near prospect, risk long-term harm to those treated? Do they threaten perspectives on what is human or on the propriety of permanent alterations of the human condition? Or are the enhancements within prospect medically and ethically sensible? Do the potential benefits repay the risks? We, the authors, see four overlapping but conceptually separable categories of military personnel enhancement: genetic or computational/mechanical alteration of the human body; physiological monitoring and tighter coupling between man and machine; pharmaceuticals; and finally, nutrition and supplementation. None of these is without controversy.
The ethical minefield of genetic and mechanical enhancement
In the short or mid-term, mechanical alterations to senses or computational implants are unlikely to be practical military technologies. An infrared-sensitive eye implant, for example, would require expensive surgery, and it is not clear that the visual cortex would know what to do with the input. A cortically implanted communication or memory device would chiefly have the advantage that it could not be lost, which does not seem close to worth the bother.
One can imagine a future in which genetically produced anatomical and functional changes are possible and seen as advantageous to the performance of military duties. Gene manipulation is nowhere close to producing specific, functional, useful mammalian anatomical variations, but such variations are conceivable. The ethical issues vary as far as the science fiction imagination can range, and the moral concerns and constraints will vary accordingly. One way to think of what might be morally tolerable is to imagine the sorts of genetic anomalies that would give an advantage in sports but that would be difficult legitimately to exclude—for example, a competitive swimmer with fully functional webbed feet. The moral issues seem to lie less in the existence of such genetic modifications in the human population, and more in the idea of deliberately creating a subpopulation that has them. Major genetic alteration of adults does not seem feasible, and a system of embryonic interventions to manufacture neonates with specific, militarily useful genetic alterations would create a pre-specified warrior class, undermining the very idea of the citizen soldier. The experiments needed to create a reliable production process of that kind would seem morally intolerable and their success unlikely.
In our experience, the US military is especially wary of any policy that touches on genetic manipulation or even genetic testing, useful as the latter might eventually become in selection of military personnel and their assignments. In the foreseeable future, the relevant military enhancement technologies are likely to be far less esoteric than genetic modification and much more akin to those seen in elite athletics.
Monitoring the warfighter
Sensors integrated into a fifth-generation fighter (e.g., the F-22) provide well in excess of 1,500 parameter measurements per second across hundreds of components and subsystems during training and missions. In contrast, the most vulnerable and least expendable component of an F-22 mission—the pilot—is monitored only for blood oxygen levels (and that monitoring began only as a reaction to F-22 oxygen-supply issues). One can anticipate this situation changing rapidly across a wide range of military duties.
Advances in sensor technology have led to the development of wearable and unobtrusive sensors for a range of biomarkers that can be used to monitor physical and mental states, particularly the psychophysiology, of a warfighter. Next-generation sensors using nanotechnology and flexible conformal materials that provide a “lab on a Band-Aid” could enable even more unobtrusive monitoring systems. This monitoring can be thought of as providing an easy-to-use dashboard or check-engine light for the operator of a weapons system or even for commanders and other senior decision makers working long hours under considerable stress (Stone et al., forthcoming). Recent scientific studies have found molecular targets of opportunity for such physiological sensors. This military interest in monitoring the warfighter connects with the burgeoning quantified-self movement found in the larger civilian community.
Military personnel could in principle be monitored either online in real time or off-line periodically, and both types of measurement have been proposed. Such systems could include equipment to monitor sleep patterns, heart rate and variability, respiration, biomarkers for stress or attentiveness, and changes in behavior or self-reported mood; soldiers, sailors, pilots, support personnel, and their supervisors could be given feedback when necessary to improve alertness or other aspects of performance (Blackhurst et al., 2012).
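To make the dashboard or check-engine-light idea concrete, the sketch below (our illustration, not any fielded or proposed Defense Department system) shows how a single wearable-sensor stream might be turned into an advisory: it tracks inter-beat intervals from a heart-rate sensor, computes a standard heart-rate-variability statistic (RMSSD) over a rolling window, and flags the wearer when variability drops well below an individually established rested baseline. The metric, the window size, the threshold, and the FatigueMonitor class itself are hypothetical choices made only for illustration.

```python
from collections import deque
from math import sqrt
from statistics import mean


def rmssd(ibi_ms):
    """Root mean square of successive differences between inter-beat intervals (ms)."""
    diffs = [b - a for a, b in zip(ibi_ms, ibi_ms[1:])]
    return sqrt(mean(d * d for d in diffs))


class FatigueMonitor:
    """A minimal 'check-engine light' for one physiological stream (hypothetical)."""

    def __init__(self, window=60, drop_fraction=0.6):
        self.window = window                # number of recent inter-beat intervals kept
        self.drop_fraction = drop_fraction  # alert when HRV falls below this fraction of baseline
        self.recent = deque(maxlen=window)
        self.baseline = None

    def set_baseline(self, rested_ibis):
        """Establish an individual baseline HRV from a rested recording period."""
        self.baseline = rmssd(rested_ibis)

    def update(self, ibi_ms):
        """Feed one new inter-beat interval (ms); return an advisory string or None."""
        self.recent.append(ibi_ms)
        if self.baseline is None or len(self.recent) < self.window:
            return None
        current = rmssd(list(self.recent))
        if current < self.drop_fraction * self.baseline:
            return (f"check-engine: HRV {current:.0f} ms vs. rested baseline "
                    f"{self.baseline:.0f} ms -- consider rest or an alertness check")
        return None


if __name__ == "__main__":
    # Hypothetical data: large beat-to-beat swings while rested, then a long
    # monotonous watch during which variability collapses (one common, though
    # not exclusive, signature of fatigue).
    rested = [800 + (30 if i % 2 else -30) for i in range(120)]
    fatigued = [790 + (5 if i % 2 else -5) for i in range(120)]

    monitor = FatigueMonitor(window=60, drop_fraction=0.6)
    monitor.set_baseline(rested)
    for beat in rested + fatigued:
        alert = monitor.update(beat)
        if alert:
            print(alert)
            break
```

A real system would, of course, fuse many such streams and validate any thresholds against operational performance data; the point of the sketch is only the feedback loop itself.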
Monitoring and feedback summon the morally troubling image of human beings permanently wired as intermediate cogs in a device that continuously samples and modulates aspects of their biochemistry, their attention, and in large degree their action. We, the authors, think that this is at the end of a long slope from present Defense Department efforts and see little reason to think that end will ever be approached. The military may want to help keep warfighters awake, alert, and healthy, but it is difficult to see any advantage or prospect of creating pharmaceutical cyborgs.
Performance-enhancing pharmaceuticals
Advances in neuroscience are slowly but surely yielding a mechanistic understanding of cognition, optimal mental performance, and resilience. Coupled with advances in nutrition and the development of new neuropharmaceuticals, this understanding opens the door to the possibility of enhanced cognitive performance and resilience. One can already observe a rather energetic and large-scale, albeit unconstrained, experiment in pharmaceutical enhancement of cognition well under way on every major college or university campus.
Modafinil and similar alertness or vigilance-support pharmaceuticals have been studied extensively (Caldwell et al., 2000) and have demonstrated utility in several Defense Department operational contexts. Researchers have reported that modafinil improved planning among their test subjects (Turner et al., 2003). More recently, a study of sleep-deprived physicians found that modafinil improved their cognitive flexibility while reducing impulsive behavior (Sugden et al., 2012). Modafinil (and newly emerging successor drugs) could be evaluated to improve performance of senior decision makers and others who must operate at a high level in a sleep-deprived state.
In addition, new pharmaceuticals aimed at mitigating the cognitive losses associated with aging and neurodegenerative diseases are under development. Such substances will likely have application in cognitive enhancement and resilience. It is anticipated that these carefully targeted drugs will more effectively exploit brain plasticity and offer fewer side effects than current drugs such as modafinil. One can reasonably anticipate that some future adversaries will not hesitate to provide their personnel with the latest pharmaceuticals to mitigate the effects of sleep deprivation, to enhance training, or to create other advantages.
The use of performance-enhancing drugs when combined with the possibility of real-time monitoring of biomarkers that reflect operator performance prompts moral concerns, in part because of real worries about long-term health effects, and in part because the image of human beings as permanently wired parts of a military device is abhorrent. The first consideration is serious; pharmaceuticals that enhance alertness or increase strength (in just two of many possibilities) could possibly affect health years later. The second is the end of a conjectural slippery slope that starts with, say, coffee and ends with unlimited access to advanced neuropharmaceuticals. But the truth about slippery slopes is that society starts down a lot of them, and society can, and often does, stop when appropriate.
Loss of resilience and its consequences
Over the last decade, the demands the military places on warfighters have changed in ways that go beyond the strains of the repeated tours of duty common in the Iraq and Afghanistan wars. Modern technological warfare necessitates a level of cognitive ability and discipline unheard of in the history of war, and it does so at every level of command, from the dismounted soldier to the commander in the operations center. Additionally, the military population largely reflects the physical condition of the broader civilian population from which it is drawn. A trip to your local Wal-Mart, shopping mall, or airport will illustrate the problem. The US military has a force that is more overburdened and stressed, less healthy, and, arguably, less resilient than in generations past.
In 2011, nearly 110,000 active-duty Army troops were prescribed antidepressants, narcotics, sedatives, antipsychotics, or anti-anxiety drugs—reportedly an eight-fold increase since 2005 (Murphy, 2012). If these numbers are accurate, the dramatic increase in the use of pharmaceuticals must reflect an aggressive attitude toward medication to deal with the consequences of the loss of resilience. Perhaps not entirely coincidentally, suicides across the services have risen sharply. In 2012, a record 349 service members committed suicide—substantially more than the 295 Americans who died in combat that year. Although it remains unclear to what extent combat deployments are contributing to the military suicide rate (Hoge and Castro, 2012; Kang and Bullman, 2008; LeardMann et al., 2013; Ramchand et al., 2011), the capacity to work under substantial stress for extended periods of time while maintaining resilience is among the most urgent manpower needs facing the military. During 2011, the most recent year for which data are currently available, roughly 47 percent of those who died by suicide had deployed to Iraq or Afghanistan and approximately 15 percent had direct combat experience (Luxton et al., 2011). The causes of suicide are multiple, and the military is making great efforts to reduce the suicide rate among its members. Should effective means be found to reduce the rate of suicide among service members, one can reasonably anticipate that similar benefits would accrue to society as a whole.
Nutrition and supplementation
One area that can affect the resilience of warfighters and begs for improvement is nutrition and diet supplementation. The quantity and quality of dietary choices and distribution of nutrients throughout the day greatly affect muscle performance, body composition, cognitive performance, and feelings of energy or exhaustion. In addition, a rapidly expanding body of research describes an ever-increasing understanding of the link between native gut bacteria and human physical and biochemical characteristics, from increased obesity to cognitive metabolites and immune responses (Burnet, 2012). It will be culturally complicated for US military services to make improvements in nutrition, but changes in military diet offer real opportunities for significant increases in performance and resilience.
Military personnel deployed in conflicts eat more or less the usual US diet, but warfighters’ nutritional requirements often are, in fact, very unusual. Imagine warfighters as high-performance athletes who are constantly worried about bombs and bullets. One might say that many in our military, as in the broader society, are overfed, overmedicated, and nutritionally undernourished.
The gap between what the individual soldier requires and what the military provides in the way of nutrition and dietary supplements is already resulting in complex and at times risky solutions. For example, dietary supplementation is commonplace among airmen, soldiers, sailors, and marines, who routinely rely on marketing hype and information gleaned from websites and bodybuilding forums to optimize their performance, with no reliable expert support or guidance. A 2003 US Army Research Institute study of nonprescription supplement use among Special Forces in 2000 showed that 90 percent of Special Forces soldiers and 76 percent of support soldiers used supplements of some sort (Bovill et al., 2003). Thus the issue is not whether the troops will be supplemented, but rather what the proper role for the armed services might be in assuring the safety and efficacy of the supplements consumed.
Widespread media reports have described how numerous athletes have been rendered ineligible for international competitions because they took supplements that contained steroids (and other sometimes-dangerous substances) not listed on product labels. In addition, as noted in a 2008 report by the scientific advisory group JASON, the unregulated and largely foreign supplement supply chain presents a notable vulnerability to attack (Williams et al., 2008). Nearly everything else related to the warfighter—from boots to hats—is carefully sourced and scrutinized. Vaccinations are required for personnel headed to foreign regions subject to diseases uncommon in the United States; other medical prophylaxes have been required in case of chemical warfare. There is no reason why the military should not carefully and appropriately guide the consumption of all supplements among its personnel.
In spite of the new performance demands on the modern warfighter, the military has only recently started to focus on ways to exploit the most current research arising from nutrition and performance science to optimize or tailor diets to the nutritional requirements for specific roles and performance. A warfighter’s diet roughly mirrors the standard US diet, a regimen that is causally associated with significant prevalence of metabolic syndrome and other problems associated with poor nutrition (e.g., obesity, diabetes, heart disease, cerebrovascular disease, cancer, and cognitive decline). Individually packaged meals ready to eat and mess-hall fare typically have high caloric content with relatively poor nutritional value.
As just one example, the connection between n-3 fatty acids (commonly referred to as omega-3s) and mental health is currently an area of substantial interest for Defense Department research (Defense Science Board, 2013). Service members often have a severe deficiency of the omega-3 essential fatty acid docosahexaenoic acid (DHA), which may result in reduced resilience. DHA is a key central nervous system constituent and is the most abundant omega-3 fatty acid in the brain and retina. Low DHA levels are associated with cognitive decline, increased rates of telomere shortening (implicated in premature aging), diminished cardiovascular health, and increased susceptibility to and poorer recovery prospects from brain injury. A recent study compared total serum fatty-acid compositions of 800 active-duty US military suicide deaths with those of 800 matched controls and identified low serum DHA status as a significant risk factor for suicide death (Lewis et al., 2011). NIH investigators have dubbed n-3 fatty acids “nutritional armor” for their ability to increase the brain’s resilience to physical and psychological injury, to resolve major mental illnesses, and to enhance recovery post-injury (Lewis and Bailes, 2011). Randomized controlled trials indicate that n-3s are comparable in effectiveness to antidepressants and may reduce impulsive aggression by 40 percent among violent adults and aggressive children.
Another promising area of Defense Department-sponsored research focuses on development and evaluation of the performance and resilience effects of supplying exogenous ketone esters as a direct supplement to nutrition (Defense Science Board, 2013). Several independent lines of evidence spanning roughly a century of research have led to a mechanistic understanding of how and why rigorous ketogenic diets, which are in clinical use to treat diseases (e.g., several neurological disorders, including epilepsy, and metabolic syndrome disorders such as diabetes), can boost endurance and stamina when used by otherwise healthy people, including high-performance athletes (e.g., cyclists, rowers, long-distance runners). As an apparent evolutionary adaptation to periods of starvation, the liver can produce ketone bodies that are then converted into substances that feed cellular energy production. Were humans unable to produce ketone bodies when glucose becomes unavailable, they could survive only briefly in a state of starvation—and it is unlikely that the species would have endured.
A new dietary ketone energy supplement has recently been developed, with DARPA support, that produces nutritional ketosis without the dietary restriction. Ketone supplementation causes a rapid and sustained elevation of ketones (see note 1) for hours after oral administration. Preliminary results from DARPA-funded research showed improved physical and cognitive performance in animals and humans when using ketone supplementation. Rats exercised 32 percent more when fed ketone esters and showed increased cognitive function when solving maze tasks. Elite athletes (e.g., world-class rowers) have demonstrated superior performance with respect to endurance time, volume of oxygen consumed, heart rate, blood lactate levels, and power output. A study is underway at Oxford University examining cognitive performance in humans when using the ketone ester (see note 2).
At a minimum, improvements to the diet available to warfighters could have an immediate impact on physical and cognitive fitness and a long-term impact on health with respect to common but avoidable maladies such as coronary disease, diabetes, and cancer. Science-based improvements in the efficiency of cellular metabolism, managed through dietary changes and supplementation, could have beneficial impacts on physical, cognitive, and psychological health and resilience, and they certainly warrant further research.
Ethics and the enhanced warfighter
One often considers the hazards of supplementation and enhancement, but in the military context the greater moral hazard may be the decision not to enhance warfighter resilience through feasible, science-based methods that are already available. One might also reasonably anticipate research investments in this type of enhancement to have important and beneficial effects in the broader society.
Questions about the health effects of physiological monitoring and feedback, changes in nutrition, and related interventions are not moral hard cases if the effects of the interventions are known and candidly assessed and revealed. Athletes in many fields knowingly, and in some sense willingly, accept risks to their long-term health in exchange for short-term excellence, opportunity, or advantage. The athletes or their representatives properly try to have those risks reduced without fundamentally changing the game. Something like that seems to describe the military situation. A volunteer in the military enters into a contract with the government to take risks that would be allowed for no civilian employee. The government reciprocates by promising, among other things, treatment or compensation if harm results. In doing so, the government must necessarily estimate the benefits and risks to the warfighter, whether the issue is vaccinations, or physiological monitoring and feedback, or nutrition, including supplements. Some risk of long-term health hazard must be balanced against reduced risk of near-term injury or death. Estimating the proper balance may of course be difficult, but, absent evidence of negligence or irrationality, the military cannot be generally faulted if sometimes the balance is for short-term risk reduction over long-term health uncertainties.
There are limits. We do not permit anyone, not even under the cover of a military career, to be bonded to slavery, and presumably we would not allow a practice that guaranteed the certainty of death or maiming of each member of a class of personnel. There should be no kamikazes in the US military. But if contracting is free and informed, and the weaker party has alternatives, it is legitimate for the military to slightly hazard the long-term health of its members for increased performance and reduced risk now, provided the risks are assessed and made known, as well as they can be.
One might worry that improvements in the capacities or resilience of military personnel will prompt administrators and generals to put them at greater hazard in combat, but that thought is to some extent refuted by the increasing US reluctance to suffer casualties in military conflicts. As noted above, moral issues have a flip side as well: If physiological monitoring and feedback (and regulation, through drugs or other means) can decrease large, immediate, or long-term risks to the life or future well-being of service personnel, it would seem there is a moral obligation to provide those resources and controls.
Questions of freely chosen, informed contracts aside, there are human and social costs to any adaptations or augmentations that might have some probability of leaving military personnel disabled, physically or psychologically. Treatment costs may be calculable, but the social costs of losses of human capital are not really estimable. But whatever their total, we are paying those costs already. The augmentations we need are those that improve human performance, and perhaps most important, increase resilience. We should not confuse warfare with sports and deny our military personnel the best science-based enhancements. War will always be hell; we need a more resilient military, humanly resilient.
Funding
This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.
Article Notes
1. Typically > 3 mM β-hydroxybutyrate.
2. Resilience of cardiopulmonary and neurological function under extreme environments of oxidative stress has been achieved in animals given ketone supplementation. In research funded by the Office of Naval Research, ketone esters have proven effective against central nervous system oxygen toxicity seizures by a mechanism involving the elevation of blood ketones (D’Agostino et al., 2012, 2013). Central nervous system oxygen toxicity is a limitation for Navy SEAL divers. In addition, ketone supplementation may increase resilience to traumatic brain injury by preserving brain energy metabolism (Prins, 2008; Veech et al., 2012).
References
Blackhurst JL, Gresham JS, and Stone MO (2012) The quantified warrior: How DoD should lead human performance augmentation. Armed Forces Journal, December. Available at: http://armedforcesjournal.com/article/2012/12/12187387.
Bovill ME, Tharion WJ, and Lieberman HR (2003) Nutrition knowledge and supplement use among elite U.S. Army soldiers. Military Medicine 168(12): 997–1000.
Burnet PWJ (2012) Gut bacteria and brain function: The challenges of a growing field. Letter. Proceedings of the National Academy of Sciences 109(4): E175.
Caldwell JA Jr, Caldwell JL, Smythe NK III, et al. (2000) A double-blind, placebo-controlled investigation of the efficacy of modafinil for sustaining the alertness and performance of aviators: A helicopter simulator study. Psychopharmacology 150(3): 272–282.
D’Agostino D, Pilla R, Held H, et al. (2012) Development, testing and therapeutic applications of ketone esters (KE) for CNS oxygen toxicity (CNS-OT); i.e., hyperbaric oxygen (HBO2)-induced seizures. FASEB Journal 26(supplement): 711.10.
D’Agostino DP, Pilla R, Held HE, et al. (2013) Therapeutic ketosis with ketone ester delays central nervous system oxygen toxicity seizures in rats. American Journal of Physiology—Regulatory, Integrative and Comparative Physiology 304(10): R829–R836.
Defense Science Board (2013) Technology and innovation enablers for superiority in 2030. October. Available at: www.acq.osd.mil/dsb/reports/DSB2030.pdf.
Hoge CW and Castro CA (2012) Preventing suicides in US service members and veterans: Concerns after a decade of war. Journal of the American Medical Association 308(7): 671–672.
Kang HK and Bullman TA (2008) Risk of suicide among US veterans after returning from the Iraq or Afghanistan war zones. Letter. Journal of the American Medical Association 300(6): 652–653.
LeardMann CA, Powell TM, Smith TC, et al. (2013) Risk factors associated with suicide in current and former US military personnel. Journal of the American Medical Association 310(5): 496–506.
Lewis MD and Bailes J (2011) Neuroprotection for the warrior: Dietary supplementation with omega-3 fatty acids. Military Medicine 176(10): 1120–1127.
Lewis MD, Hibbeln JR, Johnson JE, et al. (2011) Suicide deaths of active-duty US military and omega-3 fatty-acid status: A case-control comparison. Journal of Clinical Psychiatry 72(12): 1585–1590.
Luxton DD, Osenbach JE, Reger MA, et al. (2011) Department of Defense suicide event report (DoDSER). Calendar year 2011 annual report. Available at: http://t2health.org/sites/default/files/dodser/DoDSER_2011_Annual_Report.pdf.
Murphy K (2012) A fog of drugs and war. Los Angeles Times, April 7. Available at: http://articles.latimes.com/2012/apr/07/nation/la-na-army-medication-20120408.
Office of the US Air Force Chief Scientist (2010) Technology horizons: A vision for Air Force science and technology 2010–30. Available at: www.defenseinnovationmarketplace.mil/resources/AF_TechnologyHorizons2010-2030.pdf.
Prins ML (2008) Cerebral metabolic adaptation and ketone metabolism after brain injury. Journal of Cerebral Blood Flow & Metabolism 28(1): 1–16.
Ramchand R, Acosta J, Burns RM, et al. (2011) The War Within: Preventing Suicide in the US Military. Santa Monica, CA: Rand Corporation. Available at: www.rand.org/pubs/monographs/MG953.html.
Stone MO, Blackhurst JL, Gresham JS, et al. (forthcoming) Development of the quantified human. In: Artemiadis PK (ed) Neuro-robotics: From Brain Machine Interfaces to Rehabilitation Robotics. New York: Springer.
Sugden C, Housden CR, Aggarwal R, et al. (2012) Effect of pharmacological enhancement on the cognitive and clinical psychomotor performance of sleep-deprived doctors: A randomized controlled trial. Annals of Surgery 255(2): 222–227.
Turner DC, Robbins TW, Clark L, et al. (2003) Cognitive enhancing effects of modafinil in healthy volunteers. Psychopharmacology 165(3): 260–269.
Veech RL, Valeri CR, and VanItallie TB (2012) The mitochondrial permeability transition pore provides a key to the diagnosis and treatment of traumatic brain injury. IUBMB Life 64(2): 203–207.
Williams E, et al. (2008) Human performance. JASON Report JSR-07-625, March. Available at: www.fas.org/irp/agency/dod/jason/human.pdf.
Author biographies
Kenneth Ford is cofounder and chief executive officer at the Institute for Human & Machine Cognition.
Clark Glymour is a senior research scientist at the Institute for Human & Machine Cognition and Alumni University Professor at Carnegie Mellon University.