AAAS Annual Meeting 2019: Science Transcending Boundaries

In this 3-part series, I will explore the following main ideas of this conference: the collectivity of science, the intersection of science and policy, and the complexity of science communication. With some personal reflection on my takeaways from the meeting, I will delve into the interface of science and the public, and its implications for society as a whole.


Event Recap: Dr. Sarah Rhodes, Health Science Policy Analyst

by Chris Yarosh

PSPG tries to hold as many events as limited time and funding permit, but we cannot bring in enough speakers to cover the range of science policy careers out there. Luckily, other groups at Penn hold fantastic events, too, and this week’s Biomedical Postdoc Program Career Workshop was no exception. While all of the speakers provided great insights into their fields, this recap focuses on Dr. Sarah Rhodes, a Health Science Policy Analyst in the Office of Science Policy (OSP) at the National Institutes of Health (NIH).

First, some background: Sarah earned her Ph.D. in Neuroscience from Cardiff University in the U.K., and served as a postdoc there before moving across the pond and joining a lab at the NIH. To test the policy waters, Sarah took advantage of NIH’s intramural detail program, which allows scientists to do temporary stints in administrative offices. For her detail, Sarah worked as a Policy Analyst in the Office of Autism Research Coordination (OARC) at the National Institute of Mental Health (NIMH). That experience convinced her to pursue policy full time. Following some immigration-related delays, Sarah joined OARC as a contractor and later became a permanent NIH employee.

After outlining her career path, Sarah provided an overview of how science policy works in the U.S. federal government, breaking the field broadly into three categories: policy for science, science for policy, and science diplomacy. According to Sarah (and as originally promulgated by Dr. Diane Hannemann, another one of this event’s panelists), the focus of different agencies roughly breaks down as follows:


This makes a lot of sense. Funding agencies like NIH and NSF are mostly concerned with how science is done, Congress is concerned with general policymaking, and the regulatory agencies both conduct research and regulate activities under their purview. Even so, Sarah did note that all these agencies do a bit of each type of policy (e.g. science diplomacy at the NIH Fogarty International Center). In addition, different components of each agency have different roles. For example, individual Institutes focus more on analyzing policy for their core mission (aging at NIA, cancer at NCI, etc.), while the OSP makes policies that influence all corners of the NIH.

Sarah then described her personal duties at OSP’s Office of Scientific Management and Reporting (OSMR):
  • Coordinating NIH’s response to a directive from the President’s Office of Science and Technology Policy related to scientific collections (think preserved specimens and the like)
  • Managing the placement of AAAS S&T Fellows at NIH
  • Supporting the Scientific Management Review Board, which advises the NIH Director
  • Preparing for NIH’s appropriations hearings and responding to Congressional follow-ups
  • “Whatever fires need to be put out”
If this sounds like the kind of job for you, Sarah recommends building a professional network and developing your communication skills ASAP (perhaps by blogging!?). This sentiment was shared by all of the panelists, and it echoes advice from our previous speakers. Sarah also strongly recommends volunteering for university or professional society committees. These bodies work as deliberative teams and are therefore good preparation for the style of government work.

For more information, check out the OSP’s website and blog. If you’re interested in any of the other speakers from this panel, I refer you to the Biomedical Postdoc Program.

WHO says bacon causes cancer?

by Neha Pancholi

Note: Here at the PSPG blog, we like to feature writing from anyone in the Penn community interested in the science policy process or science for general interest. This is the first in a series of posts from new authors. Interested in writing for the blog? Contact us!

The daily meat consumption in the United States exceeds that of almost every other country [1]. While the majority of meat consumed in the United States is red meat [2], the consumption of certain red meats has decreased over the past few decades due to associated health concerns, such as heart disease and diabetes [1,2]. In October, the World Health Organization (WHO) highlighted another potential health concern for red meat: cancer.

The announcement concerned both red and processed meat. Red meat is defined as unprocessed muscle meat from mammals, such as beef and pork [3]. Processed meat, generally red meat, has been altered to improve flavor through processes such as curing or smoking [3]. Examples of processed meat include bacon and sausage. The WHO confirmed that processed meat causes cancer and that red meat probably causes cancer. Given the prevalence of meat in the American diet, it was not surprising that the announcement dominated headlines and social media. So how exactly did the WHO decide that processed meat causes cancer?

The announcement by the WHO followed a report from the International Agency for Research on Cancer (IARC), which is responsible for identifying and assessing suspected causes of cancer. The IARC evaluates the typical level of exposure to a suspected agent, results from existing studies, and the mechanism by which the agent could cause cancer.

After a review of existing literature, the IARC classifies the strength of scientific evidence linking the suspected cancer-causing agent to cancer. Importantly, the IARC determines only whether there is sufficient evidence that something can cause cancer. The IARC does not evaluate risk, meaning that it does not evaluate how carcinogenic something is. The IARC classifies the suspected carcinogen into one of the following categories [4]:
  • Group 1 – There is convincing evidence linking the agent to cancer in humans. The agent is deemed carcinogenic.
  • Group 2A – There is sufficient evidence of cancer in animal models, and there is a positive association observed in humans. However, the evidence in humans does not exclude the possibility of bias, chance, or confounding variables. The agent is deemed a probable carcinogen.
  • Group 2B – There is a positive association in humans, but the possibility of bias, chance, or confounding variables cannot be excluded, and there is inadequate evidence in animal models. This category is also used when there is sufficient evidence of cancer in animal models but no association observed in humans. The agent is a possible carcinogen.
  • Group 3 – There is inadequate evidence in humans and animals. The agent cannot be classified as carcinogenic or not carcinogenic.
  • Group 4 – There is sufficient evidence to conclude that the agent is not carcinogenic in humans or in animals.
The IARC reviewed over 800 studies that examined the correlation between consumption of processed or red meat and cancer occurrence in humans. These types of studies, which examine patterns of disease in different populations, are called epidemiological studies. The studies included observations from all over the world and covered diverse ethnicities and diets. The greatest weight was given to studies that followed the same group of people over time and had an appropriate control group. Most of the available data examined the association between meat consumption and colorectal cancer, but some studies also assessed the effect on stomach, pancreatic, and prostate cancer. The majority of studies showed a higher occurrence of colorectal cancer in people whose diets included high consumption of red or processed meat compared to those with low consumption. By comparing results from several studies, the IARC determined that for every 100 grams of red meat consumed per day, there is a 17% increase in cancer occurrence. For every 50 grams of processed meat eaten per day, there is an 18% increase. The average red meat consumption for those who eat it is 50-100 grams per day [3].
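To put those dose-response figures in concrete terms, here is a minimal sketch in Python. The multiplicative, compounding model and the function name are illustrative assumptions for this post, not the IARC's actual meta-analysis method:

```python
def relative_risk(grams_per_day, rr_per_unit, unit_grams):
    """Relative risk of cancer under an assumed multiplicative
    dose-response model: each unit_grams of daily intake multiplies
    risk by rr_per_unit (e.g. 1.18 per 50 g/day of processed meat)."""
    return rr_per_unit ** (grams_per_day / unit_grams)

# Processed meat: 18% increase per 50 g/day (IARC figure)
print(round(relative_risk(50, 1.18, 50), 2))    # 1.18
print(round(relative_risk(100, 1.18, 50), 2))   # 1.39 under this assumed model

# Red meat: 17% increase per 100 g/day (IARC figure)
print(round(relative_risk(100, 1.17, 100), 2))  # 1.17
```

Note that a 17-18% increase in relative risk translates into a much smaller change in absolute risk, which is one reason the IARC classification (a measure of strength of evidence) should not be read as a measure of how dangerous an agent is.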

The IARC also reviewed studies that examined how meat could cause cancer. They found strong evidence that consumption of red or processed meat leads to the formation of known carcinogens called N-nitroso compounds in the colon. It is also known that cooked meat contains two types of compounds that are known to damage DNA, which can lead to cancer. However, there is not a direct link between eating meat containing these compounds and DNA damage in the body [3].

Based on the strong evidence demonstrating a positive association with consumption of processed meat and colorectal cancer, the IARC classified processed meat as a Group 1 agent [3]. This means that there is sufficient evidence that consumption of processed meat causes cancer.

There was a positive association between consumption of red meat and colorectal cancer in several epidemiological studies. However, the possibility of chance or bias could not be excluded from these studies. Furthermore, the best-designed epidemiological studies did not show any association between red meat consumption and cancer. Despite the limited epidemiological evidence, there was strong mechanistic evidence demonstrating that red meat consumption results in the production of known carcinogens in the colon. Therefore, red meat was classified as a probable carcinogen (Group 2A) [3].

It will be interesting to see how the WHO announcement affects red meat consumption in the United States and worldwide. But before swearing off processed and red meat forever, there are a few things to consider.

First, it is important to bear in mind that agents classified within the same group have varying carcinogenic potential. Processed meat was classified as a Group 1 agent, which is the same classification given to tobacco smoke. However, estimates by the Global Burden of Disease Project attribute approximately 34,000 cancer deaths per year to consumption of processed meat [5]. In contrast, one million cancer deaths per year are due to tobacco smoke [5]. While the evidence linking processed meat to cancer is strong, the risk of cancer due to processed meat consumption appears to be much lower than that of other known carcinogens. Second, the IARC did not evaluate studies that compared vegetarian or poultry diets to red meat consumption [5]. Therefore, it is unknown whether vegetarian or poultry diets are associated with fewer cases of cancer. Finally, red meat is high in protein, iron, zinc, and vitamin B12 [3]. Thus, while high red meat consumption is associated with some diseases, there are also several health benefits of consuming red meat in moderation. Ultimately, it will be important to balance the risks and benefits of processed and red meat consumption.


[1] http://www.npr.org/sections/thesalt/2012/06/27/155527365/visualizing-a-nation-of-meat-eaters
[2] http://www.usda.gov/factbook/chapter2.pdf
[3] Bouvard et al. Carcinogenicity of consumption of red and processed meat. The Lancet Oncology, 2015. 16(16): 1599-1600.
[4] http://www.iarc.fr/en/media-centre/iarcnews/pdf/Monographs-Q&A.pdf
[5] http://www.who.int/features/qa/cancer-red-meat/en/

Reminder: Science does not happen in a vacuum

by Chris Yarosh

It is very easy to become wrapped up in day-to-day scientific life. There is always another experiment to do, or a paper to read, or a grant to submit. This result leads to that hypothesis, and that hypothesis needs to be tested, revised, re-tested, etc. Scientists literally study the inner workings of life, matter and the universe itself, yet science often seems set apart from other worldly concerns.

But it’s not.

The terrorist attacks in Paris and Beirut and the ongoing Syrian refugee crisis have drawn the world’s attention, and rightfully so. These are genuine catastrophes, and it is difficult to imagine the suffering of those who must face the aftermath of these bouts of shocking violence.

At the same time, 80 world leaders are preparing to gather in freshly scarred Paris for another round of global climate talks. In a perfect world, these talks would focus only on the sound science and overwhelming consensus supporting action on climate change, and they would lead to an agreement that sets us on a path toward healing our shared home.

But this is not a perfect world.

In addition to the ongoing political struggle and general inertia surrounding climate change, we now must throw the fallout from the Paris attacks into the mix. Because of this, the event schedule will be limited to core discussions, which will deprive some people of their chance to demonstrate and make their voices heard on a large stage. This is a shame, but at least the meeting will go on. If the situation is as dire as many scientists and policy experts say it is, this meeting may be our last chance to align the world’s priorities and roll back the damage being caused to our planet. It was never going to be easy, and the fearful specter of terrorism—and the attention and resources devoted to the fight against it— does nothing to improve the situation.

This is a direct example of world events driving science and science policy, but possible indirect effects abound as well. It is not outside the realm of possibility that political disagreement over refugee relocation may lead to budget fights or government shutdown, both of which could seriously derail research in the U.S. With Election 2016 rapidly approaching, it is also possible that events abroad can drive voter preferences at home, with unforeseen impacts on how research is funded, conducted, and disseminated.

What does this mean for science and science policy?

For one, events like this remind us once again that scientists must stay informed and be ready to adapt as sentiments and attention shift in real time. Climate change and terrorism may not have seemed linked until now (though there is good reason to think that this connection runs deep), but the dramatic juxtaposition of both in Paris changes that. Scientists can offer our voices to the discussion, but it is vital that we keep abreast of the shifting political landscapes that influence the conduct and application of science. Keeping this birds-eye view is critical, because while these terrorist attacks certainly demand attention and action, they do nothing to change the urgent need for action on the climate, on health, and on a whole host of issues that require scientific expertise.

While staying current and engaging in policymaking is always a good thing for science (feel free to contact your representatives at any time), situations like the Syrian refugee crisis offer a unique chance to lend a hand. Science is one of humanity’s greatest shared endeavors, an approach to understanding the world that capitalizes on the innate curiosity that all people share. This shared interest has always extended to displaced peoples, with the resulting collaborations providing a silver lining to the negative events that precipitated their migrations. Where feasible, it would be wise for universities across the globe to welcome Syrians with scientific backgrounds; doing so would provide continuity and support for the displaced while preventing a loss of human capital. Efforts to this effect are currently underway in Europe, though it is unclear how long these programs can survive the tension surrounding that continent.

For good and ill, world events have always shaped science. The tragedies in France, Syria, and elsewhere have incurred great human costs, and they will serve as a test of our shared humanity. As practitioners of one of our great shared enterprises, scientists have a uniquely privileged place in society, and we should use our station to help people everywhere in any way possible.

New funding mechanism aims to bring balance to the biomedical research (work)force

by Chris Yarosh

This past March, the National Cancer Institute (NCI) announced a new funding mechanism designed to stabilize the biomedical research enterprise by creating new career paths for PhD-level scientists. That mechanism, called the NCI Research Specialist Award (R50), is now live. Applications (of which there will likely be many) for the R50 will be accepted beginning in January, with the first crop of directly-funded Research Specialists starting in October 2016. More details about the grant can be found in the newly released FOA.

Why is this a big deal? In recent years, there have been increased calls for reform of the biomedical enterprise. More people than ever hold PhDs, and professor positions (the traditional career goal of doctorate holders) are scarce. This leaves many young researchers trapped somewhere in the middle in postdoctoral positions, something we've talked about before on this blog. These positions are still considered training positions, and without professor openings (or funding for independent labs), these scientists often take industry positions or leave the bench altogether rather than finding academic employment.

On the flip side, modern academic labs are highly dependent on a constant stream of graduate students and postdocs to do the lion’s share of the research funded by principal investigator-level grants (R01s). This creates a situation where entire labs can turn over in relatively short periods of time, possibly diminishing the impact of crucial research programs.

But what if there was another way? That, in a nutshell, is the aim of the R50. By funding the salaries (but not the research costs) of PhD-level researchers, the R50 seeks to create opportunities for scientists to join established research programs or core facilities without having to obtain larger grants or academic appointments. This attempts to kill two birds with one stone: more jobs for PhDs, less turnover in labs already funded by other NCI grants.

This approach is not all roses, however. For one, this doesn’t change the fact that research funding has been flat or worse in recent years. Even with more stable staffing, the amount of research being completed will continue to shrink. Moreover, the money for future R50s will need to come from somewhere, and it is possible that this will put additional strain on the NCI’s budget if overall R&D spending is not increased soon. Lastly, there are some concerns about how the R50 will work in practice. For example, Research Specialists will be able to move to other labs with NCI approval, but how will this actually play out? Will R50s really be pegged to their recipients, or will there be an implicit understanding that they are tied to the supporting labs/institutions?

It should be noted that this is only a trial period, and that full evaluation of the program will not be possible until awards are actually made. Still, this seems like a positive response to the forces currently influencing the biomedical research enterprise, and it will be interesting to see if and when the other NIH institutes give something like this a shot.

Communicating about an Epidemic in the Digital Age

**Link for live streaming of this event can be found here**

by Hannah Shoenhard, Jamie DeNizio, and Michael Allegrezza

Craig Spencer, a New York City doctor, tested positive for Ebola on October 23. The story broke online the same day, and by the next morning, tabloids were plastered with images of masked and gowned health workers under headlines such as “Bungle Fever” and “Ebola!” Late-night comedy, Twitter, local news: the story was inescapable, the hysteria palpable. All in all, only eleven Ebola patients were treated on U.S. soil. But the media’s reaction affected the lives of anyone who watched television or had an internet connection.

The Ebola epidemic in Africa has died down. Liberia is Ebola-free, while Sierra Leone and Guinea continue to report cases in the low single digits per week. Most promisingly, a new vaccine has been shown to be highly effective in a clinical trial. Given the vaccine, it seems that the likelihood of future epidemics on the scale of the one in 2014 is low. But especially during the early days of the epidemic, miscommunication and mistrust of international public health workers slowed the medical response and exacerbated the epidemic. And, as the reaction to the New York City case shows us, this problem is not unique to West African countries.

Even if the threat from Ebola in particular is under control, infectious disease is endemic to civilization. Knowing that new epidemic threats can emerge at any time, important questions need to be considered. 

How prepared are Philadelphia’s institutions to communicate with the public in the event of a future epidemic? What specific challenges were successfully or unsuccessfully addressed during the Ebola crisis that could provide learning points going forward? Are there successful models or case studies for handling communication during epidemics that are worth emulating?

These questions will be up for debate on Wednesday at the University of Pennsylvania in a forum open to the public. The event will be held in the Penn bookstore (3601 Walnut St.) upstairs meeting room from 5:30 to 7 p.m. on Wednesday, November 4.

The event is hosted by two graduate student groups at Penn, the Emerging Leaders in Science and Society (ELISS) Fellows and the Penn Science Policy Group, with the goal of fostering collaborative ideas to develop effective channels to manage trust, fear, and accurate communication during potential future epidemics.

On the panel for the forum will be three innovators in communicating public health issues: Mitesh Patel, MD, MBA, MS; James Garrow, MPH; and Giang T. Nguyen, MD, MPH, MSCE. Moderating the discussion will be Max King, PhD, Associate Vice Provost for Health and Academic Services at Penn.

Community members are encouraged to attend the forum with questions and comments. People can also watch a live stream of the event (check our Twitter page for the link) and submit questions via Twitter with #EpidemicPhilly.


Biographies of the panelists and moderator


Mitesh Patel
 
Mitesh S. Patel, MD, MBA, MS is a board-certified general internist, physician scientist, and entrepreneur. He is an Assistant Professor of Medicine and Health Care Management at the Perelman School of Medicine and The Wharton School at the University of Pennsylvania. His work focuses on leveraging behavioral economics, connected health approaches and data science to improve population health and increase value within the health care delivery system. 

As a physician-scientist, Mitesh studies how we can utilize innovative technology and connected health approaches to passively monitor activity and how we can use health incentives to motivate behavior change. His prior work has been published in NEJM, JAMA, and the Annals of Internal Medicine, and featured in the New York Times, NPR, and CNN. Mitesh also co-founded Docphin, a startup that strives to improve the application of evidence-based medicine in clinical practice.

Mitesh holds undergraduate degrees in Economics and Biochemistry as well as a Medical Doctorate from the University of Michigan. He obtained an MBA in Health Care Management from The Wharton School and an MS in Health Policy Research from the Perelman School of Medicine at the University of Pennsylvania. Mitesh completed his internal medicine residency training and the Robert Wood Johnson Clinical Scholars Program at the University of Pennsylvania.

James Garrow

James Garrow, MPH is a nationally recognized proponent and advocate for the use of social media and digital tools in the execution of public health activities. As the Director of Digital Public Health in Philadelphia, he holds one of the first positions in the country charged with using new digital tools and techniques like social media, crowdsourcing, and big data. He also provides media relations support for the Philadelphia Department of Public Health.

An accomplished public speaker and noted thought leader, Jim has been invited to and spoken at conferences across the US on social media use in public health and emergency response. He is an active social media user, maintaining two regularly scheduled Twitterchats and a blog on crisis and emergency risk communications.

Jim obtained a B.S. in Applied Sociology from Drexel in 2001 and a Master of Public Health from Temple in 2011.


Giang T. Nguyen

Dr. Nguyen, MD, MPH, MSCE, is an Assistant Professor in the School of Medicine, Department of Family Medicine and Community Health at Penn.  He is also Chair of the MD-MPH Advisory Committee and a member of the MPH Program Curriculum Committee.

Dr. Nguyen leads the Penn Asian Health Initiatives. His research focus is in Asian immigrant health with concentrations in cancer control, disease prevention, and community-based participatory research. His community engagement work has included outreach to Vietnamese and other Southeast Asian refugees, health fairs and immunization clinics, cancer education workshops, advocacy, HIV/AIDS, and LGBT issues. He serves on boards and advisory committees for several Asian serving organizations, including the Asian and Pacific Islander National Cancer Survivors Network.

Dr. Nguyen is also the Medical Director of Penn Family Care, the clinical practice of the University of Pennsylvania's Department of Family Medicine and Community Health. He provides direct care to adult and pediatric patients in the primary care setting and teaches medical students and family medicine residents. He also is a Senior Fellow of the Penn Center for Public Health Initiatives, where he is a core faculty member for the Penn MPH program.
Max King

Previously the coordinator of Penn State's University Scholars Program, Max King, Ph.D., MS, is now Associate Vice Provost for Health and Academic Services at the University of Pennsylvania.

Dr. King holds three degrees from Penn State: a B.S. in Biological Health, an M.S. in Health Education, and a Ph.D. from the Interdisciplinary Program in Educational Theory and Policy. His research focus is the multidimensional Methodology of Q-Analysis, or Polyhedral Dynamics, a higher-level structural analysis approach derived from algebraic topology.

Dr. King also held an appointment as an Affiliate Assistant Professor and a member of the graduate faculty in the Department of Administration, Policy, Foundations, and Comparative/International Education at Penn State. He taught educational foundations, comparative education, British education, research methods, and international education. He also has extensive experience in computer systems, developing mainframe and microcomputer research and thesis applications.

NIH to chimera researchers: Let's talk about this...

by Chris Yarosh

When we think about the role of the National Institutes of Health (NIH) in biomedical research, we often think only in terms of dollars and cents. The NIH is a funding agency, after all, and most researchers submit grants with this relationship in mind. However, because the NIH holds the power of the purse, it also plays a large role in dictating the scope of biomedical research conducted in the U.S. It is noteworthy, then, that the NIH recently delayed some high profile grant applications related to one type of research: chimeras.

Chimeras, named for a Greek mythological monster composed of several different animals, are organisms that contain genetically distinct cells. In the lab, this commonly refers to animals that contain cells from more than one species. Research into chimeras is not new; scientists have been successfully using animal/animal (e.g. sheep/goat) chimeras for over 30 years to learn about how animals develop. Human/animal chimeras are also a common research tool. For example, the transfer of cancerous human tissue into mice with weakened immune systems is standard practice in cancer biology research because it allows researchers to test chemotherapy drugs in a system that is more complex than a dish of cells before testing them in human subjects. These experiments are largely uncontroversial, save for individuals who fall into the anti-animal testing camp (and those who dispute the predictive power of mouse models in general). Why, then, has the NIH decided to pump the brakes on this line of research?

Like many things, the answer lies in the timing. The temporarily-stalled research involves injecting human pluripotent cells—undifferentiated cells that can develop into any number of different cell types—not into mature animals, but instead into animal embryos. Unlike the tumor-in-a-mouse research mentioned above, this kind of experiment is specifically trying to get normal human cells to develop as an animal matures and remain, well, normal human cells. One idea is that someday we could grow an organ (liver, pancreas, etc.) in an animal, such as a pig, that is still a human organ. This would lower the barrier for successful transplantation, meaning that somebody in serious need of a new liver could receive one from livestock instead of waiting for a human donor from a transplant list. Another thought is that chimeric animals will better model human physiology, making subsequent clinical trials more accurate.

If you read the last paragraph and felt a bit uneasy, you’re not alone. For some, this type of research crosses the invisible line that separates humans from animals, and is therefore unacceptable. Others find this research troubling from an animal welfare standpoint, and still others worry about unanticipated differentiation (e.g. “we wanted a liver, but we found some human cells in the pig’s nerves, too”) or unethical uses for this type of technology.

The NIH hears these concerns, and wants to talk about them before giving scientists the go ahead to use public funds on this type of research. Some researchers have reacted negatively to this, fearing broader restrictions in the future, but I think this is an important part of the scientific process. We live (and for scientists, work) in an era of unprecedented ability to modify genomes and cell lineages, and human/animal chimeras are just one example of a type of research destined for more attention and oversight. It is important to get the guidelines right.

The NIH will convene a panel of scientists and bioethicists to discuss human/animal chimera research on November 6th, so keep an eye out for possible policy revisions after then. Given the promise of this type of research and the potential concerns over its use, this surely is only the beginning of the deliberative process.

UPDATE (11/05/2015): Scientists from Stanford University have posted an open letter in Science calling for a repeal of the current restrictions in this field. The full letter, found here, argues that there is little scientific justification for the NIH's stated concerns. Over at Gizmodo, the NIH has responded by claiming that the true purpose of the stop order and review is to "stay ahead" of current research and anticipate future work. This is consistent with the NIH's views as articulated on the Under the Poliscope blog. All things considered, the workshop tomorrow, and any guidelines resulting from it, should be very interesting for people who wish to develop and use these tools.

Training the biomedical workforce - a discussion of postdoc inflation


By Ian McLaughlin


Earlier this month, postdocs and graduate students from several fields met to candidly discuss the challenges postdocs encounter while pursuing careers in academic research. The meeting began with an enumeration of the obstacles preventing postdocs from attaining faculty positions: the scarcity of faculty positions and the ballooning number of rising postdocs, funding mechanisms and cuts, the sub-optimal relationship between publications and the quality of science, and inaccurate conceptions of what exactly a postdoctoral position should entail.


From [15]

At a fundamental level, there is a surplus of rising doctoral students: their numbers outpace the availability of faculty positions at institutions capable of hosting the research they intend to perform [10,15].  While 65% of PhDs attain postdocs, only 15-20% of postdocs attain tenure-track faculty positions [1].  Many postdocs therefore extend their positions significantly, hoping to bolster their credentials and generate more publications that will increase their appeal to hiring institutions.  Despite this added time, postdocs often do not benefit from continued teaching experience, and they are frequently unable to attend classes that would support their professional development.
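A back-of-the-envelope calculation makes the bottleneck concrete. This is a minimal sketch using the rates cited above [1]; the arithmetic is illustrative only, not a cohort study:

```python
# Rough pipeline arithmetic from the rates cited above [1]:
# 65% of PhDs take postdocs; 15-20% of postdocs land tenure-track positions.
phd_to_postdoc = 0.65
postdoc_to_faculty = (0.15, 0.20)

# Implied share of all PhDs who ultimately reach a tenure-track position.
phd_to_faculty = tuple(phd_to_postdoc * rate for rate in postdoc_to_faculty)
print(f"{phd_to_faculty[0]:.1%} - {phd_to_faculty[1]:.1%}")  # roughly 10-13% of PhDs
```

In other words, if these rates hold, only about one PhD in eight to ten can expect to reach a tenure-track position.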


From [10]
Additionally, an adequate position may never become available. Instead of providing the training and mentorship needed to produce exceptional scientists, postdoctoral positions have become “holding tanks” for PhD holders unable to transition into permanent positions [5,11], leaving postdocs with considerably lower compensation than alternative careers offer five years after the PhD.

From [13]

Perhaps this would not be quite so problematic if the primary workhorse of basic biomedical research in the US were better compensated.  In 2014, the US National Academies called for raising the starting postdoc salary from $42,840 to $50,000, as well as for a 5-year limit on the length of postdocs [1].  The salary increase would certainly help; term limits, however, have a mixed record. Institutions like NYU, the University of California system, and UNC Chapel Hill have explored them, but a frequent outcome was the promotion of postdocs to superficial positions that confer a new title while remaining, in effect, extended postdocs.

Given the time commitment required to attain a PhD, and the expanding durations of postdocs, several of the meeting’s attendees identified a particularly painful interference with their ability to start a family.  Despite excelling in challenging academic fields at top institutions, and dedicating professionally productive years to their work, several postdocs stated that they don’t foresee the financial capacity to start a family before fertility challenges render the effort prohibitively difficult.

However, NIH administrators have suggested that this apparent disparity between the number of rising postdocs and available positions is not a significant problem, despite having no apparent data to back up that position. As Polka et al. wrote earlier this year, NIH administrators do not have at their disposal data quantifying the total number of postdocs in the country, which calls into question whether they are prepared to address this fundamental problem [5].

A possible approach to mitigating this lack of opportunity would be to create permanent “superdoc” positions for talented postdocs who do not aspire to start their own labs but have the technical skills needed to advance basic research.  The National Cancer Institute (NCI) has proposed a grant program to cover salaries of $75,000-$100,000 for 50-60 such positions [1,2], which might be expanded to cover the salaries of more scientists.  Additionally, a majority of the postdocs attending the meeting voiced their desire for more comprehensive career guidance.  In particular, while they are aware that PhD holders are viable candidates for jobs outside of academia, the career trajectories leading out of academia remain opaque to them.

This situation stands in stark contrast to the misconception that the US suffers from a shortage of STEM graduates.  While the careers of postdocs stall for lack of faculty positions, the President’s Council of Advisors on Science and Technology announced a goal of one million STEM trainees in 2012 [3], despite the fact that only 11% of students graduating with bachelor’s degrees in science end up in science-related fields [4], perhaps owing in part to an inflated sense of job security.  While the numbers of grad students and postdocs have nearly doubled, the number of permanent research positions has not grown commensurately [5]. So, while making science a priority is certainly prudent, the point of tension is not a shortage of students entering these fields, but rather a paucity of research positions available to them once they have attained graduate degrees.

Suggested Solutions

Ultimately, if the career prospects for academic researchers in the US do not change, increasing numbers of PhD students will leave basic science research in favor of alternatives that offer better compensation and career trajectories, or leave the country for international opportunities.  At the heart of the problem is a fundamental imbalance between the funding available for basic academic research and the growing community of scientists in the U.S. [9,14], along with a dysfunctional career pipeline in biomedical research [9].  Attendees suggested several strategies for confronting this problem.

Federal grant-awarding agencies need to collect accurate data on the yearly numbers of postdoctoral positions available.  This way, career counselors, potential students, rising PhD students, and the institutions themselves will have a better grasp of the apparent scarcity of academic research opportunities.

As the US National Academies have suggested, the postdoc salary ought to be increased.  One possible strategy would be to increase the prevalence of “superdoc”-type positions, creating a viable career path for talented researchers who wish to support a family but do not want to secure the funding needed to open their own labs.  Additionally, if the institutions at which postdocs receive federal funding considered them employees with all associated benefits, rather than trainees, rising scientists might better avoid career stagnation and the inability to support a family [11].

As the number of rising PhDs currently outpaces the availability of permanent faculty positions, one strategy may be to limit the number of PhD positions available at each institution, preventing the pool of postdocs from growing further without viable faculty positions to apply for.  One attendee noted that this could immediately halt the growth in the number of PhDs with bleak career prospects.

Several attendees brought up the problems many postdocs encounter in particularly large labs, which tend to receive disproportionately high grant funding.  Postdocs in such labs feel pressure to generate useful data so they can compete with their peers, while neglecting other elements of their professional development and personal lives. Moreover, the current system funnels funding to labs that can guarantee positive results, favoring conservative rather than potentially paradigm-shifting proposals and thereby reducing funding for new investigators [9]. Grant-awarding agencies might weigh lab size when evaluating proposals, with the goal of fostering progress in smaller labs. Additionally, efforts like Cold Spring Harbor Laboratory’s bioRxiv might be more widely used to pre-register research projects so that postdocs are aware of their peers’ efforts, enabling them to focus on innovation when appropriate.

While increased funding for basic science research would help to avoid the loss of talented scientists, and private sources may help to compensate for fickle federal funds [6], some attendees suggested restructuring the mechanisms by which facilities and administrative costs are funded. These costs, also called “indirect costs,” cover the expenditures of running research facilities rather than specific projects; restructuring them could prevent over 50 cents of every federally allocated dollar from going to the institution itself rather than to the researchers on the funded projects [7,8].  This dynamic has been suggested to foster the growth of institutions rather than investment in researchers, and optimizing this component of research funding might reveal opportunities to better support the careers of rising scientists [9,12].

Additionally, if federal funding were more predictable, dramatic fluctuations in the numbers of faculty positions and rising scientists might not produce such disparities [9].  For example, if appropriations legislation consistently adhered to five-year funding plans, biomedical research might avoid unexpected deficits of opportunity.


From [5]

Career counselors ought to give their students an accurate picture of how competitive the search for permanent faculty positions can be, so that they do not enter the field with a misplaced sense of security.  Quotes from a survey conducted by Polka et al. reveal a substantial disparity between expectations and outcomes in academic careers, and adequate guidance might help students avoid such circumstances.

As shown in the NSF’s Indicators report from 2014, the most rapidly growing reason postdocs cite for beginning their positions is “other employment not available,” suggesting that a PhD in fields associated with the biomedical sciences currently translates to limited opportunities. Even successful scientists and talented postdocs have become progressively more pessimistic about their career prospects.  Accordingly, while there are several possible solutions to this problem, if no remedial action is taken, biomedical research in the U.S. may stagnate and suffer in coming years.

Citations
 1.    Alberts B, Kirschner MW, Tilghman S, Varmus H. Rescuing US biomedical research from its systemic flaws. Proc Natl Acad Sci U S A. 2014 Apr 22;111(16):5773-7. doi: 10.1073/pnas.1404402111. Epub 2014 Apr 14.
 2.    http://news.sciencemag.org/biology/2015/03/cancer-institute-plans-new-award-staff-scientists
 3.    https://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-engage-to-excel-final_2-25-12.pdf
 4.    http://www.nationalreview.com/article/378334/what-stem-shortage-steven-camarota
 5.    Polka JK, Krukenberg KA, McDowell GS. A call for transparency in tracking student and postdoc career outcomes. Mol Biol Cell. 2015 Apr 15;26(8):1413-5. doi: 10.1091/mbc.E14-10-1432.
 6.    http://sciencephilanthropyalliance.org/about.html
 7.    http://datahound.scientopia.org/2014/05/10/indirect-cost-rate-survey/
 8.    Ledford H. Indirect costs: keeping the lights on. Nature. 2014 Nov 20;515(7527):326-9. doi: 10.1038/515326a. Erratum in: Nature. 2015 Jan 8;517(7533):131
 9.    Alberts B, Kirschner MW, Tilghman S, Varmus H. Rescuing US biomedical research from its systemic flaws. Proc Natl Acad Sci U S A. 2014 Apr 22;111(16):5773-7. doi: 10.1073/pnas.1404402111. Epub 2014 Apr 14.
 10.    National Science Foundation (2014) National Science and Engineering Indicators (National Science Foundation, Washington, DC).
 11.    Bourne HR. A fair deal for PhD students and postdocs. Elife. 2013 Oct 1;2:e01139. doi: 10.7554/eLife.01139.
 12.    Bourne HR. The writing on the wall. Elife. 2013 Mar 26;2:e00642. doi: 10.7554/eLife.00642.
 13.    Powell K. The future of the postdoc. Nature. 2015 Apr 9;520(7546):144-7. doi: 10.1038/520144a.
 14.    Fix the PhD. Nature. 2011 Apr 21;472(7343):259-60. doi: 10.1038/472259b.
 15.   Schillebeeckx M, Maricque B, Lewis C. The missing piece to changing the university culture. Nat Biotechnol. 2013 Oct;31(10):938-41. doi: 10.1038/nbt.2706.

AAAS Forum Take #2

Another point of view of the AAAS Forum by Matthew Facciani:

I have provided scientific testimony and met with some of my local legislators, but I have never had any formal exposure to science policy. I was really excited to attend the AAAS Science & Technology Policy Forum and learn more about how scientists can impact policy. The information I absorbed at the conference was overwhelming, but incredibly stimulating. Some of the lectures discussed budget cuts and the depressing barriers to achieving science policy goals. Still, there was definitely an atmosphere of optimism at the conference, focused on how we can create positive change.

One of my favorite aspects of the conference was the discussion of how to effectively communicate science to non-scientists. Before we can even have discussions about funding, the general public needs to understand how science works and why basic science is so important. For example, science never proves anything with 100% certainty, but it may sound weak when politicians say that science “suggests” rather than “proves.” One creative way to circumvent this problem is to use comparisons. Instead of saying “science suggests GMOs are safe,” we could say “scientists are as sure that GMOs are safe as they are that smoking is bad for your health.” The conference was rife with these kinds of effective tactics, and I left with a sense of confidence that we can collectively make a difference in science policy.


Matthew Facciani is a sociology PhD student at The University of South Carolina. He is also a gender equality activist and science communicator. Learn more at www.matthewfacciani.com, and follow him at @MatthewFacciani.

2015 AAAS Science and Technology Policy Forum Summary


I recently had the opportunity to attend the 2015 AAAS Science and Technology Policy Forum in Washington, D.C. This annual meeting brings together a range of academics and professionals to discuss the broad S&T policy landscape. Below are some of my takeaways from the meeting. I hope to have additional comments from other National Science Policy Group members up soon.

By Chris Yarosh

The talks and panels at the Forum encompassed a huge range of topics, from the federal budget and the appropriations outlook to manufacturing policy and, of course, shrimp treadmills. My read on the themes tying this gamut together is just that—my opinion—and should be taken as such. That said, the threads I picked up on in many of the talks can be summarized by three C’s: cooperation, communication, and citizenship.

First up, cooperation. Although sequestration’s most jarring impacts have faded, AAAS’s budget guru Matthew Hourihan warns that fiscal year 2016 could see a return of…let’s call it enhanced frugality. Any cuts would fall disproportionately on social science, clean energy, and geoscience programs. With the possibility of more cuts to come, many speakers suggested that increased cooperation between entities could maximize value. This means increased partnership between science agencies and private organizations, as mentioned by White House Office of Science and Technology Policy Director John Holdren, and between federal agencies and state and local governments, as highlighted by NSF Director France Córdova. Cooperation across directorates and agencies will also be a major focus of big interdisciplinary science and of efforts to improve STEM education. Whatever the form, the name of the game will be recognizing fiscal limitations and fostering cooperation to make the most of what is available.

The next “C” is communication. Dr. Córdova made a point of listing communication among the top challenges facing the NSF, and talks given by Drs. Patricia Brennan (of duck penis fame) and David Scholnick (the aforementioned shrimp) reinforced the scale of this challenge. As these two researchers reminded us so clearly, information on the Web and in the media can easily be misconstrued for political or other purposes in the absence of the correct scientific context. To combat this, many speakers made it clear that basic science researchers must engage a wider audience, including elected officials, or risk our research being misconstrued, distorted, or deemed unnecessary. As Dr. Brennan said, it is important to remind the public that while not every basic research project develops into something applied, “every application derives from basic science.”

The last “C” is citizenship. Several of the speakers discussed the culture of science and the interconnections between scientists and non-scientists. I think these presentations collectively described what I’ll call good science citizenship.  For one, good science citizenship means that scientists will increasingly need to recognize our role in the wider innovation ecosystem if major new programs are ever going to move forward. For example, a panel on new initiatives in biomedical research focused on 21st Century Cures and President Obama’s Precision Medicine Initiative. Both of these proposals will be massive undertakings; the former will involve the NIH and FDA collaborating to speed the development and introduction of new drugs to the market, while the latter will require buy-in from a spectrum of stakeholders including funders, patient groups, bioethicists, and civil liberty organizations. Scientists are critical to these endeavors, obviously, but we will need to work seamlessly across disciplines and with other stakeholders to ensure the data collected from these programs are interpreted and applied responsibly.

Good science citizenship will also require critical evaluation of the scientific enterprise and the separation of the scientific process from scientific values, a duality discussed during the William D. Carey lecture given by Dr. William Press. This means that scientists must actively protect the integrity of the research enterprise by supporting all branches of science, including the social sciences (a topic highlighted throughout the event), and by rigorously weeding out misconduct and fraud. Scientists must also do a better job of making our rationalist approach work with different value systems, recognizing that people will need to come together to address major challenges like climate change.  Part of this will be better communication to the public, but part of it will also be learning how different value systems influence judgment of complicated scientific issues (a subject of another great panel about Public Opinion and Policy Making). Good science citizenship, cultivated through professionalism and respectful engagement of non-scientists, will ultimately be critical to maintaining broad support for science in the U.S.

Publish or Perish: an old system adapting to the digital era

By Annie Chen and Michael Allegrezza

When scientific publishing was developed in the 19th century, it was designed to overcome barriers that prevented scientists from disseminating their research findings efficiently. It was not feasible for scientists to arrange for typesetting, peer review, printing, and shipping of their results to every researcher in their field. As payment for these services offered by publishers, the researchers would transfer the exclusive copyrights for this material to the publisher, who would then charge subscribers access fees. To limit the printing costs associated with this system, journals only published articles with the most significant findings. Now, nearly 200 years later, we have computers, word processors, and the Internet. Information sharing has become easier than ever before, and it is nearly instantaneous. But the prevailing model of subscription-based publishing remains tethered to its pre-digital origins, and for the most part these publishers have used the Internet within this model, rather than as a tool to create a new and better system for sharing research.

Figure 1. Trend lines show an annual increase of 6.7% for serials expenditures vs 2.9% for the Consumer Price Index over the period 1986-2010, relative to 1986 prices.
In theory, digitization should have decreased the cost of communicating science: authors can perform many of the typesetting functions themselves, articles can be uploaded online instead of printed and shipped, and so on. In practice, however, digitization has actually increased the price of journals. Statistics from the Association of Research Libraries show that the amount spent on serials increased 6.7% per year between 1986 and 2011, while inflation as measured by the US Consumer Price Index rose only 2.9% per year over the same period (Figure 1).1 Shawn Martin, a Penn Scholarly Communication Librarian, explained that “Penn pays at least twice for one article, but can pay up to 7 or more times for the same content,” through hiring the researchers who create the content, buying subscriptions from journals, and paying for reuse rights. To be fair, the transition from print to digital media has been costly for publishers, who have had to invest in digital infrastructure while still producing print journals. Many publishers also argue that while journal prices have increased, the price per reader has actually decreased as online access has made it far easier for readers to reach articles.
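To see what those two growth rates imply when compounded over the 25-year window, a quick sketch (the 6.7% and 2.9% figures come from the ARL statistics cited above; the computation itself is only illustrative):

```python
# Compound the two annual growth rates over 1986-2011 (25 years).
serials_growth = 1.067   # serials expenditures: +6.7% per year
cpi_growth = 1.029       # US Consumer Price Index: +2.9% per year
years = 25

serials_factor = serials_growth ** years   # ~5.1x in nominal terms
cpi_factor = cpi_growth ** years           # ~2.0x from inflation alone
real_increase = serials_factor / cpi_factor

print(f"Serials spending grew {serials_factor:.1f}x nominal, "
      f"or {real_increase:.1f}x after inflation")
```

On these numbers, serials spending roughly quintupled in nominal terms while prices overall only doubled, so libraries ended the period paying about two and a half times more in real terms.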

Regardless of whether the increase in journal prices was justified, a new model for academic publishing emerged in opposition during the 1990s: open access (OA). There are two ways of attaining open access: Gold OA, in which the publisher makes the article freely accessible, and Green OA, in which the author self-archives the work. A few years ago, Laakso et al. conducted a quantitative analysis of the annual publication volumes of OA journals from 1993 to 2009 and found that the development of open access could be described in three phases: Pioneering (1993-1999), Innovation (2000-2004), and Consolidation (2005-2009).2 During the pioneering years, year-to-year growth of open access articles and journals was high, but the total numbers were still relatively small. OA publishing then bloomed considerably, growing from 19,500 articles in 740 journals in 2000 to 191,850 articles in 4,769 journals in 2009. During the innovation years, new business models emerged; for example, BioMed Central, later purchased by Springer in 2008, introduced the author-side publication charge. In 2004, some subscription-based journals began using a hybrid model, such as Springer’s Open Choice program, which gave authors the option of paying a fee to make their article openly available. During the consolidation phase, year-to-year growth in articles slowed from previous years but remained high, at about 20%.
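The decade of growth described above can be summarized as a compound annual growth rate (CAGR). A small sketch using the counts from Laakso et al. (ref. 2 above); the `cagr` helper is ours, introduced just for this illustration:

```python
# Compound annual growth rate implied by the Laakso et al. counts (ref. 2).
def cagr(start, end, years):
    """Average annual growth rate that turns `start` into `end` over `years`."""
    return (end / start) ** (1 / years) - 1

articles_rate = cagr(19_500, 191_850, years=9)   # 2000 -> 2009
journals_rate = cagr(740, 4_769, years=9)

print(f"articles: {articles_rate:.1%}/yr, journals: {journals_rate:.1%}/yr")
```

The whole-decade article CAGR comes out near 30% per year, which squares with the narrative above: growth ran faster in the innovation years and then eased to roughly 20% during consolidation.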

The introduction of open access journals has sparked fierce and passionate debate among scientists. Proponents of open access believe scientific research should be available to everyone, anywhere in the world. Currently, subscription fees prevent many people from accessing the information they need. With open access, students and professors in low- and middle-income countries, health care professionals in resource-limited settings, and the general public would gain access to essential resources. For instance, Elizabeth Lowenthal, MD, at the Penn Center for AIDS Research, recently published a paper in PLoS One analyzing variables that influence adherence to antiretroviral drugs among HIV-positive adolescents living in Botswana. She chose to publish open access because “the article will be of most direct use to clinicians working in Botswana and I wanted to make sure that it would be easy for them to access it.” Open access also provides re-use rights and may facilitate a more rapid exchange of ideas and increased interaction among scientists generating new scientific information.

However, there may also be some downsides to increased access. Open access may increase the number of articles that people have to sift through to find important studies.3 Furthermore, people who do not know how to critically read scientific papers may be misled by articles with falsified data or flawed experiments. While these papers often get retracted later on, they may undermine the public’s confidence in scientists and medicine. Wakefield’s (retracted) article linking vaccines to autism, for example, may have contributed to the rise of the anti-vaccine movement in the US.4 Furthermore, many open access journals require authors to pay for their papers to be published to offset the cost of publication, and some people have taken advantage of this new payment system to make a profit through predatory journals (a list of predatory OA journals can be found here: http://scholarlyoa.com/publishers/). It is clear, though, that the expansion of open access from 1993 to the present suggests that open access can be a sustainable alternative to the traditional model of subscription-based academic publishing.

In addition to facilitating access to scientific articles, the Internet has also created opportunities to improve the peer review process. Peer review was designed to evaluate the technical merit of a paper and to select papers that make significant contributions to a field. Scientists supporting the traditional model of publishing argue that the peer review process in some open access journals may not be as rigorous, and this may lead to the emergence of a “Wild West” in academic publishing. Last year, reporter John Bohannon from Science magazine sent a flawed paper to 304 open access journals, and of the 255 journals that responded, 157 accepted the paper, suggesting little or no peer review process in these journals.5 However, even high-impact journals publish papers with flawed experiments.6 Michael Eisen, co-founder of PLoS, wrote, “While they pocket our billions, with elegant sleight of hand, they get us to ignore the fact that crappy papers routinely get into high-profile journals simply because they deal with sexy topics…. Every time they publish because it is sexy, and not because it is right, science is distorted. It distorts research. It distorts funding. And it often distorts public policy.”7 Nature, for example, published two articles last year about acid-bath stem cell induction, which were later retracted due to data manipulation. However, according to Randy Schekman, editor-in-chief of eLife, “these papers will generate thousands of citations for Nature, so they will profit from those papers even if they are retracted.”8
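The Bohannon counts above imply a strikingly high acceptance rate; a one-line check (counts from the Science article, ref. 5 above):

```python
# Response and acceptance rates in the Bohannon sting (counts from ref. 5).
submitted, responded, accepted = 304, 255, 157

response_rate = responded / submitted
acceptance_rate = accepted / responded
print(f"{response_rate:.0%} of journals responded; "
      f"{acceptance_rate:.0%} of those accepted the flawed paper")
```

That is, well over half of the responding journals accepted a paper designed to fail competent review.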

With digital communication, peer review of a manuscript could shift from a rigid gate controlled by 3 or 4 people, who might not even be active scientists, into a more dynamic, transparent, and ongoing process with feedback from thousands of scientists. Social media platforms with these capabilities already exist, including ResearchGate9 and PubMed Commons.10 Some open access journals are using different strategies to address these issues in peer review. eLife, for example, employs a fast, streamlined peer review process to decrease the time from submission to publication while maintaining high-quality science. PLoS One, one of the journals published by the Public Library of Science, judges articles on technical merit alone, not on novelty.

We polled a few recently published Penn scientists for their thoughts on open access and peer review. Most did not experience a difference between the peer review process at an open access journal and at a non-open access journal. The exception was eLife, where reviewers’ comments were prompt and the communication between reviewers and editors was “a step in the right direction,” according to Amita Sehgal, PhD. To improve peer review, some suggested a blind process to help eliminate potential bias toward well-known labs or against lesser-known ones.

The digital revolution is changing the culture of academic publishing, albeit slowly. In 2009, the NIH updated their Public Access Policy to require that any published research conducted with NIH grants be available on PubMed Central 12 months after publication.11 Just last month, the publisher Macmillan announced that all research papers in Nature and its sister journals will be made free to access online in a read-only format that can be annotated but not copied, printed or downloaded. However, only journal subscribers and some media outlets will be able to share links to the free full-text, read-only versions.12 Critics such as Michael Eisen13 and John Wilbanks14 have labeled this change merely a public relations ploy to appeal to demands without actually increasing access. It will be interesting to see if other publishers follow this trend.

Scientific communication has yet to reap the full benefits in efficiency made possible by the Internet. The current system is still less than ideal at furthering ideas and research with minimal waste of resources. But this generation of young researchers is more optimistic and may revolutionize scientific publishing as we know it. “I think [open access is] the future for all scientific publications,” says Bo Li, a postdoc at Penn. “I hope all research articles will be freely accessible to everyone in the world.”

A companion opinion article by Penn PhD student Brian S. Cole can be found here.

This article appeared in the Penn Science Policy January 2015 newsletter



Annie Chen
Michael Allegrezza
1. Ware M, Mabe M. (2012) The STM report. http://www.stm-assoc.org/2012_12_11_STM_Report_2012.pdf
2. Laakso M, Welling P, Bukvova H, et al. (2011) The development of open access journal publishing from 1993 to 2009. PLoS ONE.
3. Hannay T. (2014) Stop the deluge of scientific research. The Guardian: Higher Education Network Blog. http://www.theguardian.com/higher-education-network/blog/2014/aug/05/why-we-should-publish-less-scientific-research
4. Wakefield AJ, Murch SH, Anthony A, Linnell, et al. (1998) Ileal lymphoid nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children [retracted]. Lancet. 351:637-41.
5. Bohannon J. (2013) Who’s afraid of peer review? Science. 342(6154):60-65. DOI: 10.1126/science.342.6154.60
6. Wolfe-Simon F, Switzer Blum J, Kulp TR, et al. (2011) A bacterium that can grow by using arsenic instead of phosphorus. Science. 332(6034):1163-6. doi: 10.1126/science.1197258.
7. Eisen M. (2013) I confess, I wrote the Arsenic DNA paper to expose flaws in peer-review at subscription based journals. it is NOT junk. http://www.michaeleisen.org/blog/?p=1439
8. (2014) Episode 12. The eLife podcast. http://elifesciences.org/podcast/episode12
9. ResearchGate. http://www.researchgate.net/
10. PubMed Commons. http://www.ncbi.nlm.nih.gov/pubmedcommons/
11. NIH Public Access Policy Details. http://publicaccess.nih.gov/policy.htm
12. Baynes G, Hulme L, MacDonald S. (2014) Articles on nature.com to be made widely available to read and share to support collaborative research. Nature. http://www.nature.com/press_releases/share-nature-content.html
13. Eisen M. (2014) Is Nature’s “free to view” a magnanimous gesture or a cynical ploy? it is NOT junk. http://www.michaeleisen.org/blog/?p=1668
14. Van Noorden R. (2014) Nature promotes read-only sharing by subscribers. Nature. http://www.nature.com/news/nature-promotes-read-only-sharing-by-subscribers-1.16460

The Richest Return of Wisdom

 By Brian S. Cole

    The real lesson I’ve gleaned from my time in pursuit of a PhD in biomedical research hasn’t been the research itself; indeed, many of my colleagues and I came into the program already equipped with extensive bench experience. The real eye-opener has been how science is communicated.  When I was an undergraduate, assiduously repeating PCR after PCR that quietly and dutifully failed to put bands on a gel, I just assumed that experiments always worked in the well-funded, well-respected, well-published labs that wrote the papers we read in school.  As an undergraduate, I had implicit trust in scientific publications; at the end of the PhD, I have implicit skepticism.  It turns out I’m not alone.

    The open access movement has taken a new tone in the past year: increasing recognition of irreplicability1 and of the alarming prevalence of scientific misconduct2 in highly cited journals has led to questioning of the closed review process.  Such a process denies the public access to reviewers’ comments on the work, as well as to the editorial correspondence and decision process.  The reality of the publication industry is selling ads and subscriptions, and editors likely often override scientific input from peer reviewers that throws a sexy new manuscript into question.  The problem is that the public gets no access to the review process, and as far as accountability is concerned, closed peer review is tantamount to no peer review at all.

    For these reasons, our current scientific publication platform has two large-scale negative consequences: the first economic, and the second epistemic.  First, intellectual property rights for publicly funded research are routinely transferred to nonpublic entities that then use those rights for profit.  Second, the silo effect of proprietary journals leaves insufficient interactivity within the scientific community and with the public.  The open access revolution is gaining momentum from the gravity of these issues, but to date, open access journals and publishers have largely conformed to the existing model of journals and isolated manuscripts.  While open access journals have enabled public access to scientific publications, they fail to provide the direly needed interactivity that the internet enables.

    In the background of the open access revolution in science, a 70-year-old idea3 about a new system for disseminating scientific publications was realized two decades ago on a publicly licensed code stack4 that allows not just open review, but distributed and continuous open review with real-time version control and hypertext interlinking: not just citations, but links to the actual source.  Imagine being able to publish a paper that anybody can review, suggest edits to, add links to, and discuss publicly, with every step of that ongoing process versioned and stored.  If another researcher repeats your experiment, they can contribute their data.  If you extend or strengthen the message of your paper with a future experiment, that can also be appended.  Such a platform would utterly transform scientific publication from a series of soliloquies into an evolving cloud of interlinked ideas.  We’ve had that technology for an alarmingly long time given its lack of adoption by researchers, who continue to grant highly cited journals ownership over work the public has already paid for.

    I’ve kicked around the idea of a Wikiscience5 publication system for a long time with a lot of scientists, and the concerns that came up were cogent and constructive.  A testament to the tractability of a wiki replacement for our system of scientific publication is Wikipedia, one of the greatest gifts to humankind ever to grace the worldwide web.  The distributed review and discussion system that makes Wikipedia evolve does work, and most of us are old enough to remember a time when nobody thought it would.  But how can we assess impact and retain attribution in a distributed publication and review system such as a wiki?  Metrics such as journal impact factor and article-level metrics wouldn’t directly apply to a community-edited, community-reviewed scientific resource.  Attribution and impact assessment are important challenges for any system that aims to replace our journal-and-manuscript method of disseminating scientific information.  While a distributed scientific information system would not fit easily into the current metrics for publication impact that are an intimate part of funding, hiring, and promotion in academia, considering such a system presents an opportunity to explore innovative analyses of the relevance and impact of scientific research.  Indeed, rethinking the evaluation of scientists and their work6 is a pressing need even within the context of the current publication system.

    We should be thinking about the benefit of the networked consciousness of online collectivism, not the startling failures of our current publication system to put scientific communication into the hands of the public that enabled it, or even the challenges in preserving integrity and attribution in a commons-based peer production system.7  We are the generation that grew up with Napster and 4chan, the information generation, the click-on-it-and-it’s-mine generation, born into a world of unimaginable technological wealth.  Surely we can do better than paywalls, closed peer review, and for-profit publishers.  We owe it to everybody: as Emerson put it, “He who has put forth his total strength in fit actions, has the richest return of wisdom.”8


This article accompanies a feature piece about scientific publishing in the digital era and also appeared in the Penn Science Policy Group January 2015 newsletter.


1Ioannidis JPA. (2005) Why Most Published Research Findings Are False. PLoS Medicine 2(8): e124.
2Stern AM, Casadevall A, Steen RG, Fang FC. (2014) Research: Financial Costs and Personal Consequences of Research Misconduct Resulting in Retracted Publications. eLife 3.
3Bush V. (1945) As We May Think. The Atlantic, July 1945.
4MediaWiki 1.24. MediaWiki. http://www.mediawiki.org/wiki/MediaWiki_1.24
5WikiScience. Wikimedia Meta. http://meta.wikimedia.org/wiki/WikiScience
6San Francisco Declaration on Research Assessment. American Society for Cell Biology. http://www.ascb.org/dora/
7Benkler Y. The Wealth of Networks, Chapter 3: Peer Production and Sharing. http://cyber.law.harvard.edu/wealth_of_networks/3._Peer_Production_and_Sharing
8Emerson RW. The American Scholar. http://www.emersoncentral.com/amscholar.htm

Purdue professor Dr. Sanders responds to commentary about his Ebola interview with Fox News

Last month I analyzed the media coverage of Ebola in a post where I dissected an interview between Fox News reporters and Dr. David Sanders. I was recently contacted by Dr. Sanders, who wished to clarify a few issues that I raised in my article. The purpose of my post was to demonstrate how the media sometimes covers scientific issues in ways that exaggerate and oversimplify concepts, which can potentially mislead non-scientist citizens.

I stated that the way Dr. Sanders described his research sounded a little misleading. I intended to convey how I thought an average non-scientist listener might interpret the dialogue. However, Dr. Sanders points out that he was careful with his wording to avoid possible confusion. He explained, “as you have pointed out, one says one thing, and the media (and the Internet) render it as something else.  I would just like to point out that I carefully stated that Ebola can ENTER human lung from the airway side; I never said infect.  I also try to avoid the use of the term ‘airborne’ because of the confusion about its meaning.”

Also, he had several good scientific points about the validity of using pseudotyped viruses and the comparison to other viruses when considering the potential for a change in Ebola transmission.

“Pseudotyped viruses are used widely for studying viral entry, and I know of no examples where the conclusions on the cell biology of the entry of pseudotyped viruses have been contradicted by studies of entry of the intact virus despite such comparisons having been published numerous times.” 

“When we discovered that there was maternal-child transmission of HIV was that a new mode of transmission or merely a discovery of a previously unknown mode of transmission? How was Hepatitis C transmitted between humans before injections and blood transfusions? I don't know either. How is Ebola virus transmitted between fruit bats or from fruit bats to humans? Perhaps modes of transmission differ in their efficiency. The HIV comparison with Ebola ("HIV hasn't become airborne") is fallacious given the cell biology of entry for the two viruses.  The receptors for HIV (the CD4 attachment factor and the chemokine receptor) are present on blood cells and not on lung tissue.  The receptors for Ebola are present on a diverse set of cells including lung cells. In addition, Influenza A switches in real time from a gastrointestinal virus in birds to a respiratory virus in mammals--not that many mutations required.”

Additionally, he wisely pointed out that “precedent may be a valid argument in medical practice or the law, but it is not valid in science.” In fact, science seeks to uncover things that were previously unknown, and thus were without precedent.

I appreciate Dr. Sanders’s response to my article. I think that rational and in-depth discussions about science need to happen more frequently in the media. Short, simplified stories with shock-factor headlines only detract from the important conversations that are necessary to find practical solutions to challenges like Ebola.

-Mike Allegrezza

Fox News demonstrates both good and bad ways to cover Ebola

Some news outlets, including Fox, have been wildly spreading fears about Ebola. As an example of both good and bad ways the media covers science, let’s take a look at a recent clip from Fox News in which reporters interview Dr. David Sanders about the possibility of Ebola virus mutating to become airborne-transmissible (right now, it is spread only by direct contact).



Their story is titled "Purdue professor says Ebola 'primed' to go airborne." Here is a link to the video.

I’ll start off with the good things:

1) Dr. Sanders did a good job explaining that Ebola is not airborne right now, but there is a "non-zero" probability that Ebola might mutate to infect the lungs and become air transmissible. And this probability increases as more people are infected.
2) The newscasters did a good job of accurately recapping what he was explaining without blowing it out of proportion.

Now for some bad things:

1) Quite obviously, the scare-you-into-clicking-on-it title. First, it is misleading for the sole purpose of grabbing attention (it got me!). Second, it is false: I watched the clip three times, and Dr. Sanders never said "primed."
2) They did not include coverage of other scientists who argue that fears of airborne transmission are over-hyped because no virus that infects humans has ever been observed to change its route of transmission naturally. HIV and hepatitis are both good examples that have infected millions without changing how they spread.
3) The way Dr. Sanders describes his published research is a little misleading in the context of this story. It sounds as though the research demonstrated that Ebola virus can infect the lungs. In fact, the study showed that if you take some of the proteins from the surface of Ebola and engineer them into a completely different virus (in this case a feline lentivirus, similar to HIV), that virus can infect human airway epithelial cells grown in cell culture. So this research did not use the full Ebola virus, and it did not demonstrate infection in a live animal model. Link to study here: http://www.ncbi.nlm.nih.gov/pubmed/12719583

Some of these negative aspects might be a consequence of the brevity of this story. However, in an information-dense world, people get the news in short snippets, so the media needs to be careful not to compromise accuracy.

Interestingly, on the same network, Shep Smith reported on Ebola with commendable accuracy. He communicated the facts clearly and concisely while criticizing “hysterical” reporting as “irresponsible.” 

I hope future reports from Fox News and the rest of the media follow his tone.

*Update Nov 19, 2014: A follow up to this post detailing a thoughtful response from Dr. Sanders can be found here.


Welcome!

Welcome to the Penn Science Policy Group.  We are a group of scientists interested in the relationship between science and public policy, examining how both domains affect each other to shape our society.  

Our mission is:

1) To educate scientists about the process of science policy, namely how research and public policy can inform and guide each other.

2) To advocate for research and improve communication of science to the public.

3) To provide resources and training for scientists interested in developing a career in science policy.


We achieve these goals by discussing current issues in interactive monthly meetings, receiving career information from speakers and info sessions, and refining relevant skills through written and oral public communication.  

Whether you are planning a science policy career or simply looking to stay abreast of important issues, we are here to help you learn about and navigate the field of science policy. 

For more information, please email penn.science.policy@gmail.com.

Thanks,
PSPG