
Demonstrating the Value of Government Investment in Science: Developing a Data Framework to Improve Science Policy

Published on Apr 28, 2022


During last year’s Senate debate on the Endless Frontier Act, a bill that would authorize substantial funding increases for the National Science Foundation (NSF), some lawmakers raised questions concerning the value of federal investments in scientific research. This article argues that a new framework for studying and understanding the value and impact of federal research investments can help pro-science policymakers defend against such attacks and make more informed policy decisions regarding such investments. The data, analytical tools, and research products supported by this framework are important for demonstrating the value of basic research investments, particularly in non-health-related fields, where the process by which major scientific discoveries occur and the impacts of those discoveries are often underappreciated and poorly understood by the public and policymakers.

Keywords: science policy; Congress; data framework; data use; science funding

1. Introduction

During last year’s Senate debate on the Endless Frontier Act (Endless Frontier Act, 2021), which proposed to significantly increase federal funding for the National Science Foundation (NSF), Senator Rand Paul (R-KY) took to the floor to criticize federal funding for scientific research conducted by the NSF and the National Institutes of Health (NIH). Paul brought up studies being funded by these agencies on "Cocaine and Risky Sex Habits of Quail" and “Lizards on a Treadmill” (Miller, 2021). In an earlier op-ed published in The Hill, Paul pointed to other studies he claimed were wasteful, such as $1.5 million to study how to improve the taste of tomatoes and a $500,000 study on whether taking a selfie makes you happy (Paul, 2021).

In targeting such grants, Senator Paul took a page from the playbook used previously by other elected officials such as Senator William Proxmire (D-WI), who often ridiculed federally funded research grants with funny-sounding titles as part of his monthly ‘Golden Fleece Award’ (Proxmire, 1975–1985). Others, such as former Senator Tom Coburn (R-OK), have mimicked Proxmire’s practice by publishing an annual government “wastebook” targeting scientific research that they identified as “examples of egregious abuse of taxpayer dollars,” such as the widely publicized NSF-funded “Shrimp on a Treadmill” study (Coburn, 2011a, 2011b). Along similar lines, Senator Joni Ernst (R-IA) presents monthly “Make ’Em Squeal” awards (“Squeal” standing for “Stop Questionable, Unnecessary and Excessive Allowances for Legislators”) that often target federally funded scientific research (Ernst, n.d.).

This article looks at how advocates have used retrospective stories to explain the value of government investments in basic research. It then briefly examines current research concerning the economic and social value of federal research investments. The article concludes by arguing that the creation of new metrics and a broader framework for understanding the impact of federal scientific research investments could help policymakers maximize the benefits of such investments and better defend and promote them.

2. The Importance of Storytelling

To counter attacks like those made by Senator Proxmire and Senator Paul, research advocates have often responded with their own stories of federal investments in basic research that have demonstrated clear public value or resulted in widely recognized modern-day technologies and medical advances. Common stories include the iPhone, the internet, and Google, none of which would exist were it not for federally funded basic research investments made by agencies such as the NSF, U.S. Department of Defense, and U.S. Department of Energy (Owen-Smith, 2021a). More recently, investments in research on synthetic mRNA, previously dismissed as frivolous even by some in the scientific community, have been pointed to as examples of the value of past basic research investments made by the federal government. Had those investments not been made, the rapid development of COVID-19 vaccines by Pfizer-BioNTech and Moderna would not have been possible (Thompson, 2021).

To highlight these stories and help the scientific community better defend itself against such attacks, a group of members of Congress led by Representative Jim Cooper (D-TN), with support from organizations including the American Association for the Advancement of Science (AAAS) and the Association of American Universities (AAU), created the Golden Goose Awards (“Of Geese and Fleece,” 2012).

Since the first Golden Goose Awards were given in 2012, the award has recognized scientists whose seemingly obscure, funny-sounding, federally funded research—the very types of research projects targeted for ridicule by Senators Proxmire and Paul—has led to major breakthroughs resulting in new technologies, medical treatments, and knowledge that have greatly benefited society (Golden Goose Award, n.d.). Included among past Golden Goose Award recipients is Charles Townes, the Nobel Prize–winning physicist who invented the maser, which led to the laser, a device dubbed ‘a solution in search of a problem’ when it was first invented because no one had a clue as to how it could be used (Golden Goose Award, 2012b).

Other federally funded discoveries recognized by the Golden Goose Award include green fluorescent protein, a field-altering protein-tagging tool that has revolutionized cell biology, discovered while trying to understand why some jellyfish glow green (Golden Goose Award, 2012a; Nobel Prize in Chemistry, 2008). Another award went to research on the venom of the Gila monster, which was used to develop a compound now widely utilized in a major treatment for diabetes (Eng et al., 1990; Golden Goose Award, 2013). The 2020 Golden Goose Awards recognized researchers who conducted the basic scientific research that underpinned the rapid development of the COVID-19 vaccines: fundamental research previously undertaken to understand SARS and other coronaviruses was essential to developing those vaccines in record time (Golden Goose Awards Ceremony, 2020).

Historically, studies such as the 1968 NSF-supported Technology in Retrospect and Critical Events in Science, often referred to as the NSF TRACES study, have attempted to trace the genealogical roots of key technologies back to their basic research origins to demonstrate how basic research led to innovations of social and economic importance. TRACES illustrated the role basic research played in the development of major technological advances including the video tape recorder, oral contraceptives, and the electron microscope (NSF, 1968).

Such retrospective accounts of the value of basic research have played a role in defending the federal system for supporting U.S. science that emerged out of World War II as part of the vision developed by Vannevar Bush. Bush, who served as President Franklin D. Roosevelt’s unofficial science advisor during the war, overseeing efforts such as the work on the atomic bomb, was asked by FDR to write a report laying out how science could help achieve national goals during peacetime, just as it had helped achieve major objectives during the war. In his 1945 report, Science, the Endless Frontier, Bush responded by presenting a vision for how federal investments in basic research—which he referred to as “the pacemaker of technological progress”—were critical to addressing major national economic, public health, and national security–related challenges. Bush argued that the government’s proper role in funding research was to support basic science (as opposed to applied science) performed “without thought of practical ends” (Bush, 1945). In response to Bush’s report, Congress created the NSF through the National Science Foundation Act (1950) “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense.”

3. Stories Alone Are Not Enough

While such case studies and historiographic accounts are useful for demonstrating the value of public investments in basic research, and are important for policymakers to cite and be aware of, these accounts have limitations (Ruegg & Jordan, 2007). To fully understand whether and how government support for basic scientific research advances our national interests in the ways envisioned by Bush (1945) in his Endless Frontier report, scientific success stories and case studies need to be supplemented by sound social and economic research, data, and information, not only to further delineate the economic and social value of these public investments, but also to assist decision makers in making sound policy choices that best maximize their societal benefits.

If aligned, well organized, and effectively communicated, research data and other information can be used to better explain and understand basic research investments and to enable elected officials and policymakers who support such investments to better defend them. A central empirical finding of decades of scholarship on legislative decision-making and legislative activity rests on what Doug Arnold (1990, p. 42) dubbed “traceability.” Whether one assumes that the primary goal of legislators is to win reelection, enact good public policy, or gain stature within Congress (Fenno, 1978; Kingdon, 1989; Mayhew, 1974), legislators decide where to invest their time and how to cast their votes by estimating how likely their actions are to be seen as producing specific and visible positive results as opposed to visible negative results.

Developing research data and information on the value of public research investments, and learning how to communicate that value effectively to the public, should be a high priority for those who wish to maintain public support for basic research. Such research can be valuable for sustaining existing support for scientific research and for defending against attacks that will continually be raised by skeptics such as Rand Paul. As the findings of noted American political scientist John Kingdon have demonstrated, those who support basic scientific research must provide decision makers with the information those decision makers feel will benefit them and that they require to defend and support their policy positions. As Kingdon says, “it is probably a general strategic principle that those who want something from decision-makers must adapt their strategies to the decision rules that are being used” (Kingdon, 1989, p. 153).

When it comes to federal investment in scientific research, this means helping draw clear and transparent connections between the everyday lives of the constituents elected officials represent and the outcomes of federal research investment. That, however, has often proved a difficult challenge, in part because the fruits of federal basic research investments are typically realized over the long term and thus do not align well with the two-year time frames of congressional election cycles. Moreover, it is often difficult to connect the economic and social benefits of basic research investments directly to people’s everyday lives. Making such connections will only grow more challenging as U.S. politics becomes more nationalized (Abramowitz & Webster, 2016; Hopkins, 2018) and science more polarized and politicized (AP-NORC, 2022; Borenstein & Fingerhut, 2022).

One exception has been medical research, where the public can more easily understand how investments in basic research have led to major life-saving cures and medical advances addressing cancer, heart disease, and viruses such as COVID-19. As Sampat points out, the NIH has been particularly effective at employing the ‘serendipity hypothesis’ to help it realize substantial funding increases. Echoing the argument for basic research in Bush’s 1945 report, the serendipity hypothesis maintains that many discoveries relating to the treatment of one disease were made unexpectedly while conducting research oriented toward an entirely different disease. Sampat also maintains that in selected cases, NIH has been able to target research to respond specifically to the immediate demands and expectations of members of Congress and their constituencies. These targeted research efforts aimed at specific diseases such as diabetes, heart disease, and cancer have been important in maintaining support from disease advocates and the public (Sampat, 2012).

Certainly, there has been significant economic research demonstrating that investments in R&D and human capital are critical to economic growth, including the work of Nobel Prize–winning economists Robert Solow and Paul Romer (Matsangou, 2019; Romer, 1994; Solow, 1956). Work done in the 1990s by Edwin Mansfield attempted to quantify the social rate of return from investment in academic research, placing it between 28% and 40% (Mansfield, 1992, 1998).
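The arithmetic behind such rate-of-return estimates can be sketched in a few lines. The cash-flow figures below are invented for illustration, not drawn from Mansfield’s data; the sketch only shows the underlying idea that the social rate of return is the discount rate at which the net present value of a research investment’s combined costs and social benefits equals zero.

```python
# Hypothetical illustration of the calculation behind social
# rate-of-return estimates such as Mansfield's (all figures invented).

def npv(rate, cash_flows):
    """Net present value of cash flows, one per year starting at year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def social_rate_of_return(cash_flows, lo=0.0, hi=2.0, tol=1e-6):
    """Find the rate at which NPV crosses zero, by bisection."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid  # benefits still outweigh costs; the rate can rise
        else:
            hi = mid
    return (lo + hi) / 2

# Invented stream: $10M invested up front, diffuse social benefits
# of $3.5M per year accruing over the following decade.
flows = [-10.0] + [3.5] * 10
rate = social_rate_of_return(flows)
print(f"implied social rate of return: {rate:.1%}")  # roughly 33% here
```

Real estimates like Mansfield’s require, of course, the far harder step of measuring the benefit stream itself, since social benefits diffuse across many firms and sectors rather than appearing on any single balance sheet.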

In 1997, Francis Narin used bibliometric methods to show that patents cite publicly funded research far more often than industry-funded research (73% of the papers referenced stemmed from publicly funded research versus 27% from industry) (Narin et al., 1997). Narin and his colleagues later developed a stock market predictive model that used bibliometric-based measures of patent quality to assess and predict future stock market performance (Deng et al., 1999; Narin et al., 2001). A recent analysis by Tartari and Stern (2021) found a strong connection between the presence of a research university and increased quantity and quality of entrepreneurship in the surrounding area.

In addition to examining the economic returns and value from public research investments, researchers have also focused on examining social returns from these investments. In their examination of the social returns on R&D investment, Jones and Williams (1998) sought to estimate the degree of underinvestment. They concluded that the optimal R&D investment was likely several times the levels being provided at the time. Jones (2021) reviewed several studies on the social returns from R&D investments, reaching the conclusion that the United States should invest far more in science and innovation than it does. His findings echoed Jones and Summers’s (2020) previous findings.

There are, in fact, several different methodologies that can be used to assess the benefits of public research investments. The U.S. Department of Energy Office of Energy Efficiency and Renewable Energy has compiled an analysis of 14 different methods for evaluating the impact of R&D programs, ranging from historical tracing and econometric-based methodologies to those that use bibliometrics, technology commercialization tracking, benchmarks, and network analysis (Ruegg & Jordan, 2007). Salter and Martin (2001) identify three primary methodologies for assessing the benefits of public research investments: economic studies, surveys, and case studies. Drawing on this and other research conducted at SPRU, Martin and Tang (2007) provide a comprehensive summary of quantitative and qualitative approaches to examining the benefits of publicly funded research. They conclude that the benefits from public research investments come in many different forms, are fairly substantial, and are sufficient to justify considerable government investment in basic research.

Several studies have used patent data to study and provide insight concerning how public research investments in basic science impact innovation. A comprehensive AAAS report by Matt Hourihan (2020) titled Public Research Investments and Patenting: An Evidence Review outlines how patent analysis has been used by scholars to evaluate the value of scientific research and assess the role government research support plays in innovation. Other analyses, such as that conducted by Pressman et al. (2019), have employed advanced input-output models to estimate the economic impact of academic licensing of intellectual property from universities and other nonprofits whose research funding comes largely from the federal government. 

Such analyses, which try to assess the economic and social returns on public research investment, often prove hard for both policymakers and the public to understand and relate to, because the connections between research investments and improvements in their own daily lives are often quite tenuous. This is particularly true for research in the physical sciences. As discussed earlier, the strong linkage of health research to people’s daily lives is much easier to understand and value than investments made in advanced areas of physics, because people can better relate to the public health benefits of scientific research. Everyone knows someone who has died of cancer, struggles with other life-threatening diseases, or has been sick with or died from COVID-19. This clear understanding of the linkage between people’s lives and biomedical and health science has, in turn, resulted in continued strong congressional support and strong funding growth for the NIH over the past 20 years.

One often-used justification for research investments has been that such investments produce jobs. While it is unclear whether job creation serves as a strong selling point with the public, many policymakers are interested in understanding the impact of research investments on job creation. Indeed, when over $18 billion in government funding to support research and development was included in the American Recovery and Reinvestment Act of 2009 (ARRA), the executive branch mandated that all prime recipients and subrecipients who received funding, including researchers at academic institutions, document the total number of jobs resulting from the funds they received (American Recovery and Reinvestment Act, 2009). The reports had to be submitted quarterly to the Office of Management and Budget (OMB) using a website specially created to facilitate the ARRA reporting requirements.

For the academic research community, this reporting requirement proved tremendously onerous and time-consuming, adding to the already significant administrative tasks that individual university research faculty were required to perform. Noted one vice provost for research at a large research university, “[ARRA] reporting requirements have been more demanding, more detailed than anything we’ve had to do in the past” (O’Donnell, 2009). Another large research university reported that it took more than two hours per research award to fulfill the ARRA reporting requirements (Nelson & Sedwick, 2011). Additionally, changing definitions, and the fact that much of what constituted a “full-time employee” was left to the interpretation of the researchers doing the reporting, meant that much of the data ultimately collected by OMB was inconsistent and largely unusable for accurately determining how many jobs had directly resulted from ARRA funding. As a Government Accountability Office (GAO) report noted, “While [ARRA] recipients GAO contacted appear to have made good faith efforts to ensure complete and accurate reporting, GAO’s fieldwork and initial review and analysis of recipient data … indicate that there are a range of significant reporting and quality issues” (GAO, 2009).

As an alternative, a much more effective and less burdensome approach to accurately accounting for the number of jobs created by ARRA was developed by members of the National Science and Technology Council Science of Science Policy Interagency Working Group and White House Office of Science and Technology Policy staff. This group worked with seven universities through a pilot project developed by the Federal Demonstration Partnership (FDP) in which university administrative records were used to identify the total number of individuals, and the percentage of their total salaries, supported by ARRA funding (Lane & Schwarz, 2012). The FDP pilot demonstrated that this approach could remove the need to involve faculty principal investigators in ARRA jobs reporting and that the information could instead be captured more accurately through institutional reports generated by linking human resources and payroll data on university campuses (Nelson & Sedwick, 2011).
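The core of this administrative-records approach can be illustrated with a toy sketch. The records, field layout, and award identifiers below are invented; the point is only that headcounts and salary shares fall directly out of a payroll-to-award linkage, with no reporting burden on faculty.

```python
# Toy sketch of the FDP/STAR METRICS idea: instead of asking faculty to
# report jobs, link institutional payroll records to award numbers and
# derive headcounts and salary shares directly. All records are invented.

from collections import defaultdict

# Each payroll record: (person_id, award_id, fraction of salary charged)
payroll = [
    ("p1", "ARRA-001", 1.00),   # postdoc fully funded by the award
    ("p2", "ARRA-001", 0.50),   # grad student split across two awards
    ("p2", "NSF-777",  0.50),
    ("p3", "ARRA-002", 0.25),   # PI charging one quarter of salary
]

def jobs_supported(records, award_prefix):
    """Count people and full-time equivalents on awards with a prefix."""
    people = set()
    fte = defaultdict(float)
    for person, award, share in records:
        if award.startswith(award_prefix):
            people.add(person)
            fte[person] += share
    return len(people), sum(fte.values())

headcount, total_fte = jobs_supported(payroll, "ARRA")
print(f"{headcount} individuals, {total_fte:.2f} FTE on ARRA awards")
```

Because fractional appointments are recorded rather than self-reported, the ambiguity over what counts as a “full-time employee” that plagued the original ARRA reporting largely disappears.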

This pilot led to the development of the NSF- and NIH-supported STAR METRICS program (NIH, 2015). STAR METRICS grew quickly from the initial seven universities to more than 80 participating universities. The work that began under STAR METRICS was later taken over and supported as the UMETRICS program, piloted by the Committee on Institutional Cooperation (CIC) and continued and built upon by the University of Michigan’s Institute for Research on Innovation and Science (IRIS). The intention of UMETRICS and the work of IRIS is to collect, analyze, and link university and business administrative records and other available data sets, such as U.S. Census Bureau data and patent information, to better assess the impact of science, understand the structure of the science and engineering (S&E) workforce, and optimize research investments.

The data collected by IRIS have been used to facilitate studies on how research experience shapes student career pathways; to examine how federally funded research ensures safe and secure food systems; to analyze gender differences in graduate studies and early career pathways within STEM fields; to understand how federal research grants positively impact small businesses and vendors in specific states and regions; and to produce tremendously valuable information linking research inputs to positive science impact stories with direct connections to specific states, regions, and congressional districts (IRIS, 2021). Most recently, IRIS data have helped provide information about how the COVID-19 pandemic has affected large, research-intensive universities and their research faculty and students (Owen-Smith, 2021b).

4. Developing Better Metrics, Analytical Tools, and Data Infrastructure

With big new data sets and statistical analysis tools now available, we have an opportunity to develop the “better benchmarks” envisioned by the late Jack Marburger, Director of the Office of Science and Technology Policy under the George W. Bush administration, who called for them in his 2005 op-ed in Science magazine (Marburger, 2005).

While the development of such metrics can help to defend why public investments in basic research are critical to U.S. economic growth, public health, and national security, they are perhaps even more valuable for connecting data and information that explain the value of basic research investments and the scientific process by which such research leads to tangible societal benefits, in ways both policymakers and the public can easily understand. For example, we need to be able to explain effectively that researchers can learn from negative results (Mehta, 2019; Taragin, 2019); that there is value in supporting a diverse array of scientists who come from a variety of genders, ethnicities, sexual orientations, perspectives, backgrounds, areas of expertise, and ages (Gibbs, 2014; Jones & Weinberg, 2011; Medin & Lee, 2012; Page, 2007); and that interdisciplinary research efforts can play an important role in addressing major societal challenges (Visholm et al., 2012; Wagner et al., 2011; “Why Interdisciplinary Research Matters,” 2015).

The development of new and improved metrics to evaluate public research investments will require continued and expanded support for research on the “science of science and innovation policy” (National Research Council, 2014). The Science of Science and Innovation Policy (SciSIP) program was formally launched by the NSF in 2005. Support for the effort was further solidified with the creation of the National Science and Technology Council (NSTC) Interagency Task Force on the Science of Science Policy in 2006. In 2008, this task force issued a roadmap in direct response to Marburger’s challenge to develop “a scientifically rigorous, quantitative basis from which policy makers and researchers can assess the impacts of the Nation’s scientific and engineering enterprise, improve their understanding of its dynamics, and assess the likely outcomes” (NSTC, 2008).

The SciSIP program has played an important role in providing research and analytical grants that have resulted in new data, new indicators, and analyses that are greatly enhancing our understanding of the value of research investments. It has also helped to support a cadre of students, faculty, and scholars who can further develop and analyze science and engineering data and indicators.

In 2019, the NSF repositioned the SciSIP program into a new program, the Science of Science: Discovery, Communication, and Impact (SoS:DCI) program, which is specifically designed to increase the public value of scientific activity. The program’s goals include supporting research aimed at (1) increasing the rate of socially beneficial discovery; (2) improving science communication outcomes; and (3) expanding the societal benefits of scientific activity (NSF, 2020). One valuable addition to this program is the increased emphasis on science communication, for even if there is strong evidence of the value of public investments in basic research, the failure to communicate and explain that value effectively to policymakers and the public can have an adverse impact on major science policy decisions, including those relating to research investments and funding levels. These efforts to communicate science and scientific evidence effectively are critical to maintaining public trust in science and to pushing back against the growing antiscience movement.

In addition to developing better tools and strategies for assessing, understanding, and communicating the value of basic research, a more comprehensive and complete data infrastructure to support such efforts will also be required. Central to this will be the NSF and its existing National Center for Science and Engineering Statistics (NCSES), which already conducts a series of surveys and maintains a wealth of data on S&E-related matters (e.g., funding and expenditures, education, workforce, international data, research facilities).

Additionally, it will be important that there be increased data accessibility, sharing, cooperation, and coordination across federal agencies, including the U.S. Census Bureau, the IRS, the U.S. Department of Labor, and the major federal research agencies. A major positive step that should help facilitate this type of data sharing and access is the passage and implementation of the Foundations for Evidence-Based Policymaking Act of 2018 (2019), also known as the Evidence Act. The Evidence Act, signed into law on January 14, 2019, changes how the federal government manages and uses data and how it makes data publicly accessible.

5. Developing a Framework for Analysis

To help justify and defend federal investments in basic research, and to ensure those investments are made wisely, careful thought should be given to the types of questions that must be answered and to the types of data, information, and linkages that might shed light on those questions. To help with this, a comprehensive and well-thought-out framework should be developed.

This framework should not only help to better organize and make use of existing data that explain the economic benefits of such investments; it must also seek to provide a better understanding of their social impacts and spillover effects (e.g., improved health, food security, education, and quality of life). It should also provide new insights that emphasize the importance of the people trained and supported by research investments and what they do, not just the numbers of jobs produced, papers published in high-impact journals, or patents issued. As Romer and others have said, one of the most important things for us to measure is the value of the ‘human capital’ that investments in science produce (Romer, 2020; Zolas et al., 2015). Human capital in STEM takes a long time to develop and needs consistent financial support, so it is important that a framework examine the time frames involved in the development of such talent and the importance of sustaining a broad talent base across multiple fields.

This framework should seek to answer key questions that would help us better understand the value of federal research investments while also helping us to understand the processes that determine how and to whom the benefits from basic research accrue, and how those benefits might best be maximized. Specifically, the framework should help to answer questions such as:

  1. What is the actual return on investment for public (as opposed to private) investment in scientific research, from both an economic and social perspective? How and where does public investment in research, in turn, spur additional private investment? In addition to their economic value, how can such investments in basic research help to inform policy decisions that advance government missions and national interests in areas such as defense, health, energy, environment, education, immigration, and so on?

  2. What are some of the harder-to-describe qualitative benefits derived from the education, training, and development of human and social capital that accompany U.S. research investments? How can we better quantify and communicate these benefits to policymakers and the public, many of whom do not understand that research grants support not only innovative scientific research but also the education and training of graduate students?

  3. How do investments in scientific research translate into employment and job creation in the short term, through federal funding that flows to universities, national laboratories, and other research performers? What are the longer-term impacts of basic research investment in the form of startup companies and new products, processes, and technological advances?

  4. How do the fruits of basic research find their way into the commercial marketplace and what synergies exist between public and private investment in both basic and applied science? What can be done to improve the processes and to enhance public–private partnerships that enable new ideas to move from the lab to the marketplace? 

  5. How and where does the United States stand in terms of investing in scientific research compared to other countries? How does our R&D investment portfolio compare to those of other countries? How are these investments changing over time?

  6. What is the makeup of the U.S. scientific and technical workforce, and how valuable are international and domestic talent to meeting STEM workforce needs? What is the importance and value of maintaining a diverse STEM workforce, both to improving the quality of scientific research and to the broader national workforce?

  7. What does the public view as the greatest impacts of science on their own lives? What do they see as the most important benefits of government investment in research? How do we better communicate scientific findings in ways that the public trusts and believes, especially in light of the increasing polarization of science within the general population?

As a framework is developed, careful thought must also be given to a range of issues, including developing common data, tools, definitions, and standards, and establishing mechanisms for sharing data that ensure privacy, access, and other essential protections. For example, the Organisation for Economic Co-operation and Development has worked to develop agreed-upon, shared definitions among countries to facilitate the distribution of internationally comparable data. Likewise, the NSF has developed definitions for its surveys pertaining to STEM education, the STEM workforce, R&D funding and expenditures, and research facilities. These definitions need continual updating and refinement as new data sets and information are added and linked to enhance what is already collected through such surveys.

Additionally, the framework needs to promote comparable data so that decision makers and analysts can easily make comparisons at whatever level interests them, over time and across multiple scales: internationally, nationally, regionally, across sectors, and at state and local levels.

Ultimately, a key goal of any framework should be to help all potential users understand the value of scientific research and to ensure that the data and information collected are flexible, usable, and accessible to policymakers and the public in ways that allow them to make their own comparisons, assessments, and informed judgments.

In addition to the specific issues raised here, as the framework is developed, accounting for these additional needs should be given high priority:

  1. Funding for data and analysis—It is essential that money be set aside not only to help researchers achieve the ‘broader impacts’ of science but also to allow scientists to collect data and conduct analysis on the impacts of their work. Grants and contracts should include a small amount of funding dedicated to analyzing the impact the supported research has had. This should be regular practice not only at NSF but at the other federal research agencies, including NIH, NASA, the Departments of Energy, Defense, and Agriculture, and others.

  2. Measuring and assessing interactions and linkages—There is a need to better understand the relationships and interactions among federal research investments, universities, and industry. How can linkages be made to better leverage public research investments and increase their impact? Are there ways in which certain types of interactions can help maximize the impact of research investments in addressing important social issues relating to public health, hunger, nutrition, economic development, and so on? Additional follow-up studies to Narin’s work (1997) linking private patent references to publicly funded research and his analysis (2001) linking stock market success to the quality of the science underpinning a company’s patent portfolio should be undertaken.

  3. International collaboration—Better understanding and information are needed about the value and importance of international scientific collaboration. Changing geopolitical circumstances and increasing nationalism can disrupt important international scientific collaborations, which in turn can significantly impair our ability to maximize the benefits and value of domestic science investments, especially at a time when the United States no longer has a monopoly on basic scientific knowledge.

  4. Need for current and updated data and estimates—Current data are often hard to obtain and update in a timely manner, yet data that are three years old are frequently unhelpful in informing current policy debates. For example, pre-pandemic 2017 data will in many instances not be relevant to our economy during or after the COVID-19 pandemic. To help solve this problem, we need more up-to-date data, and when such data are not available, estimates must be developed. More research on how best to develop current data estimates would help achieve this goal.

  5. Balancing the need for data privacy and confidentiality with the need for data transparency and public accessibility—New ways to aggregate, anonymize, and then link data from government and private sources to other available public data sources need to be developed. If made available to researchers, such data can serve as a tremendously valuable resource for advancing our understanding of research investments. Steps must be taken, however, to consistently protect individual privacy and the confidentiality of such data, while at the same time making the data more readily available.

  6. Visualization—Additional efforts must be undertaken to better visualize data in ways that the public and policymakers can understand. Such visualizations can be particularly powerful in making technical data understandable to policymakers and the public. They can also help explain longer-term trends and complex data about research investments, their impacts, and related matters relevant to how future investments should be made.

6. Conclusion

While challenging, we must develop new and better data, analytical tools, and a comprehensive framework to help us further understand and explain the societal impact of public investments in science. Advances in data science, the ability to merge and link big data sets, and increased access to previously inaccessible government data sources allow us to conduct research and enhance our understanding in ways previously not possible.

Success in this area will ultimately depend on both quantitative and qualitative data, analysis, and stories. It will also necessitate additional government support for creating the data infrastructure required for a wide range of research studies, and support for researchers to better understand and gain insight into the science of science and innovation policy.

We must also learn how to effectively communicate what we learn from such data and analysis in ways that help policymakers make sound science policy decisions and defend public investments in basic research from its critics. The ability to convey what we know about the value of public investment in basic research to policymakers and the public, in ways that resonate with their values and that they can understand and relate to, is even more essential as mistrust of science, political polarization, and the antiscience movement grow. These efforts will be critical to defending and addressing important questions concerning the value of public investments in basic scientific research in the future.

Disclosure Statement

Tobin L. Smith has no financial or non-financial disclosures to share for this article.

References
Abramowitz, A., & Webster, S. (2016). The rise of negative partisanship and the nationalization of U.S. elections in the 21st century. Electoral Studies, 41, 12–22.

American Recovery and Reinvestment Act of 2009, H.R. 1, 111th Cong. (2009).

AP-NORC. (2022, January 26). Amidst the pandemic, confidence in the scientific community becomes increasingly polarized.

Arnold, R. D. (1992). The logic of congressional action. Yale University Press.

Borenstein, S., & Fingerhut, H. (2022, January 26). Americans’ trust in science now deeply polarized, poll shows. Associated Press.

Bush, V. (1945). Science the endless frontier: A report to the president. Government Printing Office.

Coburn, T. (2011a). National Science Foundation: Under the microscope.

Coburn, T. (2011b). 2011 wastebook.

Deng, Z., Lev, B., & Narin, F. (1999). Science and technology as predictors of stock performance. Financial Analysts Journal, 55(3), 20–32. 

Endless Frontier Act of 2021, S. 1260, 117th Cong. (2021).

Eng, J., Andrews, P. C., Kleinman, W. A., Singh, L., & Raufman, J. P. (1990). Purification and structure of exendin-3, a new pancreatic secretagogue isolated from Heloderma horridum venom. Journal of Biological Chemistry, 265(33), 20259–20262.

Ernst, J. (n.d.). Make ‘em squeal.

Fenno, R. (1978). Home style: House members and their districts. Scott Foresman & Co.

Foundations for Evidence-Based Policymaking Act of 2018, Pub. L. No. 115-435, 132 Stat. 5529 (2019).

Gibbs, K. (2014, September 10). Diversity in STEM: What it is and why it matters. Scientific American.

Golden Goose Award. (n.d.). Awardees.

Golden Goose Award. (2012a). Green fluorescent protein (M. Chalfie, R. Tsien, & O. Shimomura).

Golden Goose Award. (2012b). The MASER (C. Townes).

Golden Goose Award. (2013). Diabetes medication (J. Eng).

Golden Goose Awards Ceremony. (2020, December 2). [Documentary short film.] YouTube.

Government Accountability Office. (2009). Recovery Act: Recipient reported jobs data provide some insight into use of Recovery Act funding, but data quality and reporting issues need attention (GAO 10-223).

Hopkins, D. J. (2018). The increasingly United States: How and why American political behavior nationalized. University of Chicago Press.

Hourihan, M. (2020). Public research investments and patenting: An evidence review. American Association for the Advancement of Science.

Institute for Research on Innovation and Science. (2021). Research data: Research using IRIS data. Retrieved September 2021 from

Jones, B. (2021). Science and innovation: The under-fueled engine of prosperity. In M. S. Kearney, & A. Ganz (Eds.), Rebuilding the post-pandemic economy. Aspen Institute Press.

Jones, B., & Summers, L. (2020). A calculation of the social returns to innovation (Working Paper 27863). National Bureau of Economic Research.

Jones, B., & Weinberg, B. (2011). Age dynamics in scientific creativity. PNAS, 108(47), 18910–18914.

Jones, C. I., & Williams, J. C. (1998). Measuring the social return to R&D. The Quarterly Journal of Economics, 113(4), 1119–1135.

Kingdon, J. (1989). Congressman’s voting decisions (3rd ed.). University of Michigan Press.

Lane, J., & Schwarz, L. (2012). Creating new administrative data to describe scientific workforce: The STAR Metrics program (IZA Discussion Paper No. 6600). IZA.

Mansfield, E. (1992). Academic research and industrial innovation: A further note. Research Policy, 21(3), 295–296. 

Mansfield, E. (1998). Academic research and industrial innovation: An update on empirical findings. Research Policy, 26(7–8), 773–776.

Marburger, J. H. (2005, May 20). Wanted: Better benchmarks. Science.

Martin, B., & Tang, P. (2007, June). The benefits from publicly funded research (Paper No. 161). Science Policy Working Unit. 

Matsangou, E. (2019, January 19). Romer's endogenous growth theory could provide a solution for global problems. World Finance.

Mayhew, D. (1974). Congress: The electoral connection. Yale University Press.

Medin, D., & Lee, C. (2012, April 27). Diversity makes better science. Association for Psychological Science Observer.

Mehta, D. (2019, October 4). Highlight negative results to improve science. Nature.

Miller, A. (2021, May 28). “Lizards on a treadmill”: Rand Paul calls out wasteful research spending with colorful props on Senate floor. Washington Examiner.

Narin, F., Hamilton, K., & Olivastro, D. (1997). The increasing linkage between U.S. technology and public science. Research Policy, 26(3), 317–330.

Narin, F., Thomas, P., & Breitzman, A. (2001). Using patent indicators to predict stock portfolio performance. In B. Berman (Ed.), From ideas to assets: Investing wisely in intellectual property (pp. 293–308). John Wiley and Sons.

O’Donnell, C. (2009, December 10). UW responds to heavy reporting requirements for economic stimulus money. UW News.

National Research Council. (2014). Science of science and innovation policy: Principal Investigators' Conference summary. The National Academies Press.

National Science and Technology Council. (2008). The science of science policy: A federal research roadmap. Report on the Science of Science Policy to the Subcommittee on Social, Behavioral and Economic Sciences, Committee on Science.

National Science Foundation Act of 1950, Pub. L. No. 81-507 (1950).

National Science Foundation. (1968, December 15). Technology Retrospect and Critical Events in Science (TRACES). Prepared for the NSF by the Illinois Institute of Technology Research under Contract NSF-C535.

National Science Foundation. (2020). Frequently asked questions about SBE's science of science programs, NSF 20-128.

Nelson, L., & Sedwick, S. W. (2011). STAR METRICS: A participant's perspective. NCURA Magazine, 43, 24–25.

The Nobel Prize in Chemistry. (2008). The Nobel Prize in Chemistry 2008 was awarded jointly to Osamu Shimomura, Martin Chalfie and Roger Y. Tsien "for the discovery and development of the green fluorescent protein, GFP.”

“Of Geese and Fleece.” (2012, April 30). The New York Times.

Owen-Smith, J. (2021a, May 23). NSF funding choice: Move forward or fall behind. The Hill.

Owen-Smith, J. (2021b, October 21). The U.S. Academic Research Enterprise (US-ARE): Possible paths from the pandemics. Springer Nature. 

Page, S. (2007). The difference: How the power of diversity creates better groups, firms, schools and societies. Princeton University Press.

Paul, R. (2021, April 26). Wasteful “Endless Frontiers Act” won't counter China's rising influence. The Hill.

Pressman, L., Planting, M., Bond, J., Yuskavage, R., & Moylan, C. (2019). The economic contributions of university/nonprofit inventions in the United States: 1996–2017. Report prepared for Biotechnology Innovation Organization and the Association of University Technology Managers.

Proxmire, W. (1975–1987). Golden Fleece Awards (Wis Mss 738, Box 158, Folders 1–5).

Romer, P. (1994). The origins of endogenous growth. The Journal of Economic Perspectives, 8(1), 3–22.

Romer, P. (2020, July 8). What it takes to be a leader in both basic science and technological progress. Statement for House Budget Committee Hearing on Federal R&D.

Ruegg, R., & Jordan, G. (2007). Overview of evaluation methods for R&D programs: A directory of evaluation methods relevant to technology development programs. Report prepared for U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy.

Salter, A., & Martin, B. (2001). The economic benefits of publicly funded basic research: A critical review. Research Policy, 30(3), 509–532.

Sampat, B. (2012). Mission-oriented biomedical research at the NIH. Research Policy, 41(10), 1729–1741.

Solow, R. (1956). A contribution to the theory of economic growth. The Quarterly Journal of Economics, 70(1), 65–94.

Taragin, M. (2019). Learning from negative findings. Israel Journal of Health Policy Research, 8, Article 38.

Tartari, V., & Stern, S. (2021). More than an ivory tower: The impact of research institutions on the quantity and quality of entrepreneurship (Working Paper 28846). NBER Working Paper Series.

Thompson, D. (2021, March 29). How mRNA technology could change the world. The Atlantic.

U.S. Department of Health and Human Services. (2015, October 2). STAR METRICS: New way to measure the impact of federally funded research. National Institutes of Health.

Visholm, A., Grosen, L., Norn, M. T., & Jensen, R. L. (2012, June). Summary report: Interdisciplinary research is key to solving society's problems. DEA.

Wagner, C., Roessner, D., Bobb, K., Klein, J., Boyack, K.W., Keyton, J., Rafols, I., & Borner, K. (2011, January). Approaches to understanding and measuring interdisciplinary scientific research (IDR): A review of the literature. Journal of Informetrics, 5(1), 14–26.

“Why Interdisciplinary Research Matters.” (2015). Nature, 525(7569), 305.

Zolas, N., Goldschlag, N., Jarmin, R., Stephan, P., Owen-Smith, J., Rosen, R., Allen, B. M., Weinberg, B., & Lane, J. (2015). Wrapping it up in a person: Examining employment and earnings outcomes for Ph.D. recipients. Science, 350(6266), 1367–1371.

©2022 Tobin L. Smith. This article is licensed under a Creative Commons Attribution (CC BY 4.0) International license, except where otherwise indicated with respect to particular material included in the article.
