Event Recap: Dr. Sarah Rhodes, Health Science Policy Analyst

by Chris Yarosh

PSPG tries to hold as many events as its limited time and funding permit, but we cannot bring in enough speakers to cover the range of science policy careers out there. Luckily, other groups at Penn hold fantastic events, too, and this week’s Biomedical Postdoc Program Career Workshop was no exception. While all of the speakers provided great insights into their fields, this recap focuses on Dr. Sarah Rhodes, a Health Science Policy Analyst in the Office of Science Policy (OSP) at the National Institutes of Health (NIH).

First, some background: Sarah earned her Ph.D. in Neuroscience from Cardiff University in the U.K., and served as a postdoc there before moving across the pond and joining a lab at the NIH. To test the policy waters, Sarah took advantage of NIH’s intramural detail program, which allows scientists to do temporary stints in administrative offices. For her detail, Sarah worked as a Policy Analyst in the Office of Autism Research Coordination (OARC) at the National Institute of Mental Health (NIMH). That experience convinced her to pursue policy full time. Following some immigration-related delays, Sarah joined OARC as a contractor and later became a permanent NIH employee.

After outlining her career path, Sarah provided an overview of how science policy works in the U.S. federal government, breaking the field broadly into three categories: policy for science, science for policy, and science diplomacy. According to Sarah (and as originally promulgated by Dr. Diane Hannemann, another one of this event’s panelists), the focus of different agencies roughly breaks down as follows:
  • Policy for science – funding agencies (e.g., NIH and NSF)
  • Science for policy – Congress
  • A bit of both – the regulatory agencies
This makes a lot of sense. Funding agencies like NIH and NSF are mostly concerned with how science is done, Congress is concerned with general policymaking, and the regulatory agencies both conduct research and regulate activities under their purview. Even so, Sarah did note that all these agencies do a bit of each type of policy (e.g., science diplomacy at NIH’s Fogarty International Center). In addition, different components of each agency have different roles. For example, individual Institutes focus more on analyzing policy for their core mission (aging at NIA, cancer at NCI, etc.), while the OSP makes policies that influence all corners of the NIH.

Sarah then described her personal duties at OSP’s Office of Scientific Management and Reporting (OSMR):
  • Coordinating NIH’s response to a directive from the President’s Office of Science and Technology Policy related to scientific collections (think preserved specimens and the like)
  • Managing the placement of AAAS S&T Fellows at NIH
  • Supporting the Scientific Management Review Board, which advises the NIH Director
  • Preparing for NIH’s appropriations hearings and responding to Congressional follow-ups
  • “Whatever fires need to be put out”
If this sounds like the kind of job for you, Sarah recommends building a professional network and developing your communication skills ASAP (perhaps by blogging!?). This sentiment was shared by all of the panelists, and it echoes advice from our previous speakers. Sarah also strongly recommends volunteering for university or professional society committees. These bodies work as deliberative teams and therefore offer good preparation for the way government work gets done.

For more information, check out the OSP’s website and blog. If you’re interested in any of the other speakers from this panel, I refer you to the Biomedical Postdoc Program.

Event Recap: Dr. Sarah Martin, ASBMB Science Policy Fellow

by Ian McLaughlin

On February 11th, Dr. Sarah Martin, a Science Policy Fellow at the American Society for Biochemistry and Molecular Biology (ASBMB), visited Penn to chat about her experience working in science policy. As it turns out, her story is perhaps more circuitous than one might expect.

An avid equestrian, Sarah earned a bachelor’s degree in animal sciences and a master’s degree in animal nutrition at the University of Kentucky before embarking on a Ph.D. in Molecular and Cellular Biochemistry at UK’s College of Medicine. While pursuing her degrees, Sarah realized that the tenure track was not for her, and she began exploring career options using the Individual Development Plan (IDP) provided by AAAS Careers. At the top of the list: science policy.

With an exciting career option in mind, Sarah sought ways to build “translatable skills” during her Ph.D. to help her move toward science policy. She served as treasurer, and later Vice President, of UK’s Graduate Student Congress and developed her communication skills by starting her own blog and participating in Three Minute Thesis. Sarah stressed the importance of communicating with non-scientists, and she highlighted how her practice paid off during Kentucky’s first-ever State Capitol Hill Day, an event that showcases Kentucky-focused scientific research to that state’s legislators.

Sarah also shared how she got up to speed on science policy issues, becoming a “student of policy” by voraciously reading The Hill, Roll Call, Politico, ScienceInsider, and ASBMB’s own Policy Blotter. Additionally, she started to engage with peers, non-scientists, and legislators on Twitter, noting how it’s a useful tool to sample common opinions on issues related to science. Finally, she reached out to former ASBMB fellows for advice on how to pursue a career in science policy – and they were happy to help.

Sarah then described the typical responsibilities of an ASBMB fellow, breaking them down into four categories:
  1. Research – tracking new legislation and keeping up with a daily diet of articles on new developments in science and policy
  2. Meetings – with legislators on Capitol Hill, staff at the NIH, partner organizations such as the Federation of American Societies for Experimental Biology (FASEB), and others
  3. Writing – white papers, position statements, and blog posts on everything from ASBMB’s position on gene editing to the NIH Strategic Plan for FY 2016-2020
  4. Administration – organizing and preparing for meetings, composing executive summaries, and helping to plan and organize ASBMB’s Hill Day

Sarah also talked about her own independent project at ASBMB, a core component of each Fellowship experience. Sarah aims to update ASBMB’s Advocacy Toolkit in order to consolidate all of the resources a scientist might need to engage in successful science advocacy.

Comparing the ASBMB fellowship to similar programs, she noted one advantage: it has no fixed end date, which gives Fellows plenty of time to find permanent positions that match their interests. Sarah also noted that, compared to graduate students and postdocs, she enjoys an excellent work/life balance.

Ultimately, Sarah made it clear that she loves what she does. She closed by pointing anyone interested in applying for the ASBMB fellowship or pursuing a career in science policy to resources from ASBMB Science Policy Analyst Chris Pickett.

WHO says bacon causes cancer?

by Neha Pancholi

Note: Here at the PSPG blog, we like to feature writing from anyone in the Penn community interested in the science policy process or science for general interest. This is the first in a series of posts from new authors. Interested in writing for the blog? Contact us!

The daily meat consumption in the United States exceeds that of almost every other country [1]. While the majority of meat consumed in the United States is red meat [2], the consumption of certain red meats has decreased over the past few decades due to associated health concerns, such as heart disease and diabetes [1,2]. In October, the World Health Organization (WHO) highlighted another potential health concern for red meat: cancer.

The announcement concerned both red and processed meat. Red meat is defined as unprocessed muscle meat from mammals, such as beef and pork [3]. Processed meat – generally red meat – has been altered to improve flavor through processes such as curing or smoking [3]. Examples of processed meat include bacon and sausage. The WHO concluded that processed meat causes cancer and that red meat probably causes cancer. Given the prevalence of meat in the American diet, it was not surprising that the announcement dominated headlines and social media. So how exactly did the WHO decide that processed meat causes cancer?

The announcement by the WHO followed a report from the International Agency for Research on Cancer (IARC), which is responsible for identifying and assessing suspected causes of cancer. The IARC evaluates the typical level of exposure to a suspected agent, results from existing studies, and the mechanism by which the agent could cause cancer.

After a review of existing literature, the IARC classifies the strength of scientific evidence linking the suspected cancer-causing agent to cancer. Importantly, the IARC determines only whether there is sufficient evidence that something can cause cancer. The IARC does not evaluate risk, meaning that it does not evaluate how carcinogenic something is. The IARC classifies the suspected carcinogen into one of the following categories [4]:
  • Group 1 – There is convincing evidence linking the agent to cancer in humans. The agent is deemed carcinogenic.
  • Group 2A – There is sufficient evidence of cancer in animal models, and there is a positive association observed in humans. However, the evidence in humans does not exclude the possibility of bias, chance, or confounding variables. The agent is deemed a probable carcinogen.
  • Group 2B – There is a positive association in humans, but the possibility of bias, chance, or confounding variables cannot be excluded, and the evidence in animal models is inadequate. This category is also used when there is sufficient evidence of cancer in animal models but no association observed in humans. The agent is deemed a possible carcinogen.
  • Group 3 – There is inadequate evidence in humans and animals. The agent cannot be classified as carcinogenic or not carcinogenic.
  • Group 4 – There is sufficient evidence to conclude that the agent is not carcinogenic in humans or in animals.
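For readers who think in code, the grouping logic above can be boiled down to a small sketch. This is a toy model in Python: the function name, the evidence labels, and the simple lookup rules are illustrative inventions, and real IARC working groups weigh study quality and mechanistic detail that no lookup table captures.

```python
# Toy sketch of the IARC grouping logic described above. The labels and
# rules are illustrative only; actual IARC evaluations are far more nuanced.

def iarc_group(human_evidence: str, animal_evidence: str) -> str:
    """Map evidence levels to an IARC group.
    Levels: 'convincing', 'limited', 'inadequate', 'suggests_absence'."""
    if human_evidence == "convincing":
        return "Group 1: carcinogenic to humans"
    if human_evidence == "limited" and animal_evidence == "convincing":
        return "Group 2A: probably carcinogenic"
    if human_evidence == "limited" or animal_evidence == "convincing":
        return "Group 2B: possibly carcinogenic"
    if human_evidence == "suggests_absence" and animal_evidence == "suggests_absence":
        return "Group 4: not carcinogenic"
    return "Group 3: not classifiable"

print(iarc_group("convincing", "convincing"))  # processed meat -> Group 1
print(iarc_group("limited", "convincing"))     # red meat -> Group 2A
```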
The IARC reviewed over 800 studies that examined the correlation between consumption of processed or red meat and cancer occurrence in humans. These types of studies, which examine patterns of disease in different populations, are called epidemiological studies. The studies included observations from all over the world and included diverse ethnicities and diets. The greatest weight was given to studies that followed the same group of people over time and had an appropriate control group. Most of the available data examined the association between meat consumption and colorectal cancer, but some studies also assessed the effect on stomach, pancreatic, and prostate cancer. The majority of studies showed a higher occurrence of colorectal cancer in people whose diets included high consumption of red or processed meat compared to those with low consumption. By comparing results from several studies, the IARC determined that for every 100 grams of red meat consumed per day, there is a 17% increase in cancer occurrence. For every 50 grams of processed meat eaten per day, there is an 18% increase. The average red meat consumption for those who eat it is 50-100 grams per day [3].
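To put those percentages in concrete terms, here is a back-of-the-envelope calculation in Python. The multiplicative extrapolation across portion sizes is an assumption made purely for illustration, not the IARC’s actual methodology, and the figures describe relative (not absolute) occurrence.

```python
# Back-of-the-envelope reading of the IARC figures: +17% relative occurrence
# per 100 g/day of red meat, +18% per 50 g/day of processed meat.
# Compounding the percentages across portions is an illustrative assumption.

def relative_occurrence(red_g_per_day=0.0, processed_g_per_day=0.0):
    return (1.17 ** (red_g_per_day / 100.0)) * (1.18 ** (processed_g_per_day / 50.0))

# An average red-meat eater (75 g/day) who adds ~25 g/day of bacon:
print(f"{relative_occurrence(75, 25):.2f}x baseline")  # ~1.22x baseline
```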

The IARC also reviewed studies that examined how meat could cause cancer. They found strong evidence that consumption of red or processed meat leads to the formation of known carcinogens called N-nitroso compounds in the colon. Cooked meat also contains two types of compounds known to damage DNA, which can lead to cancer. However, there is not a direct link between eating meat containing these compounds and DNA damage in the body [3].

Based on the strong evidence demonstrating a positive association with consumption of processed meat and colorectal cancer, the IARC classified processed meat as a Group 1 agent [3]. This means that there is sufficient evidence that consumption of processed meat causes cancer.

There was a positive association between consumption of red meat and colorectal cancer in several epidemiological studies. However, the possibility of chance or bias could not be excluded from these studies. Furthermore, the best-designed epidemiological studies did not show any association between red meat consumption and cancer. Despite the limited epidemiological evidence, there was strong mechanistic evidence demonstrating that red meat consumption results in the production of known carcinogens in the colon. Therefore, red meat was classified as a probable carcinogen (Group 2A) [3].

It will be interesting to see how the WHO announcement affects red meat consumption in the United States and worldwide. But before swearing off processed and red meat forever, there are a few things to consider.

First, it is important to bear in mind that agents classified within the same group have varying carcinogenic potential. Processed meat was classified as a Group 1 agent, the same classification as tobacco smoke. However, estimates by the Global Burden of Disease Project attribute approximately 34,000 cancer deaths per year to consumption of processed meat [5]. In contrast, one million cancer deaths per year are due to tobacco smoke [5]. While the evidence linking processed meat to cancer is strong, the risk of cancer due to processed meat consumption appears to be much lower than that of other known carcinogens. Second, the IARC did not evaluate studies that compared vegetarian or poultry diets to red meat consumption [5]. Therefore, it is unknown whether vegetarian or poultry diets are associated with fewer cases of cancer. Finally, red meat is high in protein, iron, zinc, and vitamin B12 [3]. Thus, while high red meat consumption is associated with some diseases, there are also several health benefits of consuming red meat in moderation. Ultimately, it will be important to balance the risks and benefits of processed and red meat consumption.


1. http://www.npr.org/sections/thesalt/2012/06/27/155527365/visualizing-a-nation-of-meat-eaters
2. http://www.usda.gov/factbook/chapter2.pdf
3. Bouvard et al. Carcinogenicity of consumption of red and processed meat. The Lancet Oncology 2015;16(16):1599-1600.
4. http://www.iarc.fr/en/media-centre/iarcnews/pdf/Monographs-Q&A.pdf
5. http://www.who.int/features/qa/cancer-red-meat/en/

Ready to Adapt: Experts Discuss Philadelphia Epidemic Preparedness

by Jamie DeNizio and Hannah Shoenhard


In early November, public health experts from a variety of organizations gathered on Penn’s campus to discuss Philadelphia’s communication strategies and preparation efforts in the event of an epidemic outbreak. In light of recent crises, such as H1N1 and Ebola in the US, AAAS Emerging Leaders in Science and Society (ELISS) fellows and the Penn Science Policy Group (PSPG) hosted local experts at both a public panel discussion and a focus group meeting to understand the systems currently in place and develop ideas about what more can be done.
Are we prepared?: Communication with the public
Dr. Max King, moderator of the public forum, set the tone for both events with a Benjamin Franklin quote: “By failing to prepare, you are preparing to fail.” Measures taken before a crisis begins can make or break the success of a public health response. In particular, in the age of the sensationalized, 24-hour news cycle, the only way for public health professionals to get the correct message to the public is to establish themselves as trustworthy sources of information in the community ahead of time.
For reaching the general population, the advent of social media has been game-changing. As an example, James Garrow, Director of Digital Public Health for the Philadelphia Department of Public Health, described Philadelphia’s use of its Facebook page to rapidly disseminate information during the H1N1 flu outbreak. The city was able to provide detailed information while interacting with and answering questions directly from members of the public in real time, a considerable advantage over traditional TV or print news.

However, Garrow was quick to note that “mass media still draws a ton of eyeballs,” and that any public health outreach program would be remiss to neglect traditional media such as TV, radio, and newspapers. At this point, social media is a complement to, but not a replacement for, other forms of media engagement.
Furthermore, those typically at greater risk during an epidemic are often unable to interact with social media channels due to economic disadvantage, age, or a language barrier. In Philadelphia, 21.5% of the population speaks a language other than English at home. Meanwhile, 12.5% of the population is over the age of 65 (U.S. Census Bureau). The focus group meeting specifically discussed how to reach these underserved groups. Some suggestions included having “block captains” or registries. “Block captains” would be Philadelphia citizens from a particular block or neighborhood who would be responsible for communicating important information to residents in their designated section. In addition to these methods of reaching individuals, there was general agreement that there is a need for translation-friendly, culturally-relevant public health messages.

For example, during the open forum, Giang T. Nguyen, leader of the Penn Asian Health Initiative and Senior Fellow of the Penn Center for Public Health Initiatives, emphasized the importance of building ties with “ethnic media”: small publications or radio channels that primarily cater to immigrant communities in their own languages. He noted that, in the past, lack of direct contact between government public health organizations and non-English-speaking communities has led to the spread of misinformation in these communities.

On the other hand, Philadelphia has also successfully engaged immigrant communities in the recent past. For example, Garrow pointed to Philadelphia’s outreach in the Liberian immigrant community during the Ebola outbreak as a success story. When the outbreak began, the health department had already built strong ties with the Liberian community, to the point where the community actively asked the health department to hold a town hall meeting, rather than the reverse. This anecdote demonstrates the importance of establishing trust and building ties before a crisis emerges.
With regards to both general and community-targeted communication, the experts agreed that lack of funding is a major barrier to solving current problems. At the expert meeting, it was suggested that communication-specific grants, rather than larger grants with a certain percentage allotted for communication, might be one way of ameliorating this problem.
Are we prepared?: Communication between health organizations

The need for established communications networks extends beyond those for communicating directly with individuals. It is crucial for the local health department and healthcare system to have a strong relationship. Here in Philadelphia, the health department has a longstanding relationship with Penn Medicine, as well as other universities and major employers. In case of an emergency, these institutions are prepared to distribute vaccines or other medicines. Furthermore, mechanisms for distribution of vaccines already in place are “road-tested” every year during flu season. As an example, Penn vaccinated 2,500 students and faculty for the flu in eight hours during a recent vaccination drive, allowing personnel to sharpen their skills and identify any areas that need improvement.

In addition to the strong connections between major Philadelphia institutions, there is also a need for smaller health centers and community centers to be kept in the loop. These small providers serve as trusted intermediaries between large public health organizations and the public. According to the experts, these relationships are already in place. For example, during the recent Ebola crisis, the CDC set up a hotline for practitioners to call if one of their patients returned from an Ebola-stricken country with worrying symptoms. “You can’t expect everyone in the entire health system to know all they need to know [about treating a potential Ebola case],” said Nguyen, “but you can at least ensure that every practice manager and medical director knows the phone number to call.”

Can we adapt?
Ultimately, no crisis situation is fully predictable. Therefore, what matters most for responders is not merely having the proper protocols, resources, and avenues of communication in place, but also the ability to adjust their reaction to a crisis situation as it evolves. As Penn behavioral economics and health policy expert Mitesh Patel pointed out at the end of the open forum, “It’s not ‘are we ready?’; it’s ‘are we ready to adapt?’”
The topic of adaptability was also heavily discussed at the focus group meeting. A lack of a central communication source was identified as a potential barrier to adaptability. So was a slow response from agencies further up the chain of command, such as the CDC. However, experts also disagreed about the precise degree of control the CDC should have at a local level. For example, representatives from local government agencies, which are more directly accountable to the CDC, expressed a desire for the CDC to proactively implement strategies, instead of attempting to direct the local response once it has already begun. Many physicians and hospital representatives, on the other hand, were of the opinion that plans formulated by the people closest to the crisis may be superior due to their situational specificity and lack of red tape. Despite this point of contention, experts agreed that there is a need for some consensus and coordination between hospitals in a particular region on how to respond to a large-scale health event.

One gap in Philadelphia’s preparedness identified by the experts in the focus group is its ability to case manage novel diseases—a challenge, since the transmission route of a novel disease is often unknown. Some experts in the meeting also expressed doubt that Philadelphia is prepared for a direct biological attack. However, numerous epidemic-response frameworks already in place could potentially be repurposed for novel or deliberately-spread pathogens. In these cases, even more so than in “typical” epidemic situations, the experts identified adaptability as a key factor for success.

At the end of the open forum, the panelists affirmed the belief that Philadelphia is as prepared as it can be for an infectious disease crisis. Furthermore, it seemed they had also moved the opinions of the event’s attendees: before the forum, attendees rated Philadelphia’s readiness at an average of 3.1 on a scale of 0 (“not at all ready”) to 6 (“completely ready”), while afterwards, the same attendees rated Philadelphia’s readiness at an average of 3.9 on the same scale (p=0.07, paired t-test).
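For the statistically inclined, a paired comparison like this is simple to reproduce. The sketch below uses SciPy on invented ratings, since the individual responses were not published; only the before/after means of roughly 3.1 and 3.9 reflect the event’s actual numbers, so the computed p-value will not match the reported 0.07.

```python
# Hypothetical reconstruction of the before/after readiness comparison.
# The individual ratings are invented; only the means (~3.1 and ~3.9)
# mirror the event's numbers, so the p-value here differs from the real 0.07.
from scipy import stats

before = [3, 2, 4, 3, 4, 2, 3, 4, 3, 3]  # made-up 0-6 ratings, mean 3.1
after  = [4, 3, 5, 4, 4, 3, 4, 5, 3, 4]  # made-up 0-6 ratings, mean 3.9

t_stat, p_value = stats.ttest_rel(before, after)  # paired t-test
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```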

Reminder: Science does not happen in a vacuum

by Chris Yarosh

It is very easy to become wrapped up in day-to-day scientific life. There is always another experiment to do, or a paper to read, or a grant to submit. This result leads to that hypothesis, and that hypothesis needs to be tested, revised, re-tested, etc. Scientists literally study the inner workings of life, matter and the universe itself, yet science often seems set apart from other worldly concerns.

But it’s not.

The terrorist attacks in Paris and Beirut and the ongoing Syrian refugee crisis have drawn the world’s attention, and rightfully so. These are genuine catastrophes, and it is difficult to imagine the suffering of those who must face the aftermath of these bouts of shocking violence.

At the same time, 80 world leaders are preparing to gather in freshly scarred Paris for another round of global climate talks. In a perfect world, these talks would focus only on the sound science and overwhelming consensus supporting action on climate change, and they would lead to an agreement that sets us on a path toward healing our shared home.

But this is not a perfect world.

In addition to the ongoing political struggle and general inertia surrounding climate change, we now must throw the fallout from the Paris attacks into the mix. Because of this, the event schedule will be limited to core discussions, which will deprive some people of their chance to demonstrate and make their voices heard on a large stage. This is a shame, but at least the meeting will go on. If the situation is as dire as many scientists and policy experts say it is, this meeting may be our last chance to align the world’s priorities and roll back the damage being caused to our planet. It was never going to be easy, and the fearful specter of terrorism—and the attention and resources devoted to the fight against it—does nothing to improve the situation.

This is a direct example of world events driving science and science policy, but possible indirect effects abound as well. It is not outside the realm of possibility that political disagreement over refugee relocation may lead to budget fights or government shutdown, both of which could seriously derail research in the U.S. With Election 2016 rapidly approaching, it is also possible that events abroad can drive voter preferences at home, with unforeseen impacts on how research is funded, conducted, and disseminated.

What does this mean for science and science policy?

For one, events like this remind us once again that scientists must stay informed and be ready to adapt as sentiments and attention shift in real time. Climate change and terrorism may not have seemed linked until now (though there is good reason to think that this connection runs deep), but the dramatic juxtaposition of both in Paris changes that. Scientists can offer our voices to the discussion, but it is vital that we keep abreast of the shifting political landscapes that influence the conduct and application of science. Keeping this bird’s-eye view is critical, because while these terrorist attacks certainly demand attention and action, they do nothing to change the urgent need for action on the climate, on health, and on a whole host of issues that require scientific expertise.

While staying current and engaging in policymaking is always a good thing for science (feel free to contact your representatives at any time), situations like the Syrian refugee crisis offer a rare chance to lend a hand. Science is one of humanity’s greatest shared endeavors, an approach to understanding the world that capitalizes on the innate curiosity that all people share. This shared interest has always extended to displaced peoples, with the resulting collaborations providing a silver lining to the negative events that precipitated their migrations. Where feasible, it would be wise for universities across the globe to welcome Syrians with scientific backgrounds; doing so would provide continuity and support for the displaced while preventing a loss of human capital. Efforts to this effect are currently underway in Europe, though it is unclear how long these programs can survive the tensions now gripping that continent.

For good and ill, world events have always shaped science. The tragedies in France, Syria, and elsewhere have incurred great human costs, and they will serve as a test of our shared humanity. As practitioners of one of our great shared enterprises, scientists have a uniquely privileged place in society, and we should use our station to help people everywhere in any way possible.

Communicating about an Epidemic in the Digital Age - Live Stream of Forum


To watch this event in real time, please follow this link (from 5:30 to 7 p.m., 11/4)


How prepared are Philadelphia’s institutions to communicate with the public in the event of a future epidemic? What specific challenges were successfully or unsuccessfully addressed during the Ebola crisis that could provide learning points going forward? Are there successful models or case studies for handling communication during epidemics that are worth emulating?

These questions will be up for debate on Wednesday at the University of Pennsylvania in a forum open to the public. The event will be held in the Penn bookstore (3601 Walnut St.) upstairs meeting room from 5:30 to 7 p.m. on Wednesday, November 4.

To learn more about this event, please read our preview article.

New funding mechanism aims to bring balance to the biomedical research (work)force

by Chris Yarosh

This past March, the National Cancer Institute (NCI) announced a new funding mechanism designed to stabilize the biomedical research enterprise by creating new career paths for PhD-level scientists. That mechanism, called the NCI Research Specialist Award (R50), is now live. Applications (of which there will likely be many) for the R50 will be accepted beginning in January, with the first crop of directly-funded Research Specialists starting in October 2016. More details about the grant can be found in the newly released funding opportunity announcement (FOA).

Why is this a big deal? In recent years, there have been increased calls for reform of the biomedical enterprise. More people than ever hold PhDs, and professor positions (the traditional career goal of doctorate holders) are scarce. This leaves many young researchers trapped somewhere in the middle in postdoctoral positions, something we've talked about before on this blog. These positions are still considered to be training positions, and without professor openings (or funding for independent labs), these scientists often seek industry positions or leave the bench altogether for lack of academic employment.

On the flip side, modern academic labs are highly dependent on a constant stream of graduate students and postdocs to do the lion’s share of the research funded by principal investigator-level grants (R01s). This creates a situation where entire labs can turn over in relatively short periods of time, possibly diminishing the impact of crucial research programs.

But what if there was another way? That, in a nutshell, is the aim of the R50. By funding the salaries (but not the research costs) of PhD-level researchers, the R50 seeks to create opportunities for scientists to join established research programs or core facilities without having to obtain larger grants or academic appointments. This attempts to kill two birds with one stone: more jobs for PhDs, less turnover in labs already funded by other NCI grants.

This approach is not all roses, however. For one, this doesn’t change the fact that research funding has been flat or worse in recent years. Even with more stable staffing, the amount of research being completed will continue to atrophy. Moreover, the money for future R50s will need to come from somewhere, and it is possible that this will put additional strain on the NCI’s budget if overall R&D spending is not increased soon. Lastly, there are some concerns about how the R50 will work in practice. For example, Research Specialists will be able to move to other labs with NCI approval, but how will this actually play out? Will R50s really be pegged to their recipients, or will there be an implicit understanding that they are tied to the supporting labs/institutions?

It should be noted that this is only a trial period, and that full evaluation of the program will not be possible until awards are actually made. Still, this seems like a positive response to the forces currently influencing the biomedical research enterprise, and it will be interesting to see if and when the other NIH institutes give something like this a shot.

Communicating about an Epidemic in the Digital Age

**Link for live streaming of this event can be found here**

by Hannah Shoenhard, Jamie DeNizio, and Michael Allegrezza

Craig Spencer, a New York City doctor, tested positive for Ebola on October 23. The story broke online the same day, and by the next morning, tabloids were plastered with images of masked and gowned health workers with headlines such as “Bungle Fever” and “Ebola!” Late-night comedy, Twitter, local news: the story was inescapable, the hysteria palpable. All in all, only eleven Ebola patients were treated on U.S. soil. But the media’s reaction affected the lives of anyone who watched television or had an internet connection.

The Ebola epidemic in Africa has died down. Liberia is Ebola-free, while Sierra Leone and Guinea continue to report cases in the low single digits per week. Most promisingly, a new vaccine has been shown to be highly effective in a clinical trial. Given the vaccine, it seems that the likelihood of future epidemics on the scale of the one in 2014 is low. But especially during the early days of the epidemic, miscommunication and mistrust of international public health workers slowed the medical response and exacerbated the epidemic. And, as the reaction to the New York City case shows us, this problem is not unique to West African countries.

Even if the threat from Ebola in particular is under control, infectious disease is endemic to civilization. Because new epidemic threats can emerge at any time, important questions need to be considered.

How prepared are Philadelphia’s institutions to communicate with the public in the event of a future epidemic? What specific challenges were successfully or unsuccessfully addressed during the Ebola crisis that could provide learning points going forward? Are there successful models or case studies for handling communication during epidemics that are worth emulating?

These questions will be up for debate on Wednesday at the University of Pennsylvania in a forum open to the public. The event will be held in the Penn bookstore (3601 Walnut St.) upstairs meeting room from 5:30 to 7 p.m. on Wednesday, November 4.

The event is hosted by two graduate student groups at Penn, the Emerging Leaders in Science and Society (ELISS) Fellows and the Penn Science Policy Group, with the goal of fostering collaborative ideas to develop effective channels for managing trust, fear, and accurate communication during potential future epidemics.

On the panel for the forum will be three innovators in communicating public health issues: Mitesh Patel, MD, MBA, MS; James Garrow, MPH; and Giang T. Nguyen, MD, MPH, MSCE. Moderating the discussion will be Max King, Ph.D., Associate Vice Provost for Health and Academic Services at Penn.

Community members are encouraged to attend the forum with questions and comments. People can also watch a live stream of the event (check our Twitter page for the link) and submit questions via Twitter with #EpidemicPhilly.


Biographies of the panelists and moderator


Mitesh Patel
 
Mitesh S. Patel, MD, MBA, MS is a board-certified general internist, physician scientist, and entrepreneur. He is an Assistant Professor of Medicine and Health Care Management at the Perelman School of Medicine and The Wharton School at the University of Pennsylvania. His work focuses on leveraging behavioral economics, connected health approaches and data science to improve population health and increase value within the health care delivery system. 

As a physician-scientist, Mitesh studies how we can utilize innovative technology and connected health approaches to passively monitor activity and how we can use health incentives to motivate behavior change. His prior work has been published in NEJM, JAMA, and the Annals of Internal Medicine and featured in the New York Times, NPR, and CNN. Mitesh also co-founded Docphin, a startup that strives to improve the application of evidence-based medicine in clinical practice.

Mitesh holds undergraduate degrees in Economics and Biochemistry as well as a Medical Doctorate from the University of Michigan. He obtained an MBA in Health Care Management from The Wharton School and an MS in Health Policy Research from the Perelman School of Medicine at the University of Pennsylvania. Mitesh completed his internal medicine residency training and the Robert Wood Johnson Clinical Scholars Program at the University of Pennsylvania.

James Garrow

James Garrow, MPH is a nationally recognized proponent and advocate for the use of social media and digital tools in the execution of public health activities. His role as the Director of Digital Public Health in Philadelphia is among the first in the country charged with using new digital tools and techniques like social media, crowdsourcing, and big data utilization. He also provides media relations support for the Philadelphia Department of Public Health.

An accomplished public speaker and noted thought leader, Jim has been invited to and spoken at conferences across the US on social media use in public health and emergency response. He is an active social media user, maintaining two regularly scheduled Twitterchats and a blog on crisis and emergency risk communications.

Jim obtained a B.S. in Applied Sociology from Drexel in 2001 and a Master of Public Health from Temple in 2011.


Giang T. Nguyen

Dr. Nguyen, MD, MPH, MSCE, is an Assistant Professor in the School of Medicine, Department of Family Medicine and Community Health at Penn.  He is also Chair of the MD-MPH Advisory Committee and a member of the MPH Program Curriculum Committee.

Dr. Nguyen leads the Penn Asian Health Initiatives. His research focus is in Asian immigrant health with concentrations in cancer control, disease prevention, and community-based participatory research. His community engagement work has included outreach to Vietnamese and other Southeast Asian refugees, health fairs and immunization clinics, cancer education workshops, advocacy, HIV/AIDS, and LGBT issues. He serves on boards and advisory committees for several Asian serving organizations, including the Asian and Pacific Islander National Cancer Survivors Network.

Dr. Nguyen is also the Medical Director of Penn Family Care, the clinical practice of the University of Pennsylvania's Department of Family Medicine and Community Health. He provides direct care to adult and pediatric patients in the primary care setting and teaches medical students and family medicine residents. He also is a Senior Fellow of the Penn Center for Public Health Initiatives, where he is a core faculty member for the Penn MPH program.
Max King
Previously the coordinator of Penn State's University Scholars Program, Max King, Ph.D., MS, is now Associate Vice Provost for Health and Academic Services at The University of Pennsylvania.
Dr. King holds three degrees from Penn State: a B.S. in Biological Health, M.S. in Health Education, and Ph.D. from the Interdisciplinary Program in Educational Theory and Policy. His research focus is the multidimensional Methodology of Q-Analysis, or Polyhedral Dynamics, a higher-level structural analysis approach derived from algebraic topology.

Dr. King also held an appointment as an Affiliate Assistant Professor and a member of the graduate faculty in the Department of Administration, Policy, Foundations, and Comparative/International Education at Penn State. He taught educational foundations, comparative education, British education, research methods, and international education. He also has extensive experience in computer systems, developing mainframe and microcomputer research and thesis applications.

NIH to chimera researchers: Let's talk about this...

by Chris Yarosh

When we think about the role of the National Institutes of Health (NIH) in biomedical research, we often think only in terms of dollars and cents. The NIH is a funding agency, after all, and most researchers submit grants with this relationship in mind. However, because the NIH holds the power of the purse, it also plays a large role in dictating the scope of biomedical research conducted in the U.S. It is noteworthy, then, that the NIH recently delayed some high profile grant applications related to one type of research: chimeras.

Chimeras, named for a Greek mythological monster composed of several different animals, are organisms that feature cells that are genetically distinct. In the lab, this commonly refers to animals that contain cells from more than one species. Research into chimeras is not new; scientists have been successfully using animal/animal (e.g. sheep/goat) chimeras for over 30 years to learn about how animals develop. Human/animal chimeras are also a common research tool. For example, the transfer of cancerous human tissue into mice with weakened immune systems is standard practice in cancer biology research because it allows researchers to test chemotherapy drugs in a system that is more complex than a dish of cells before testing them in human subjects. These experiments are largely uncontroversial, save for individuals who fall into the anti-animal testing camp (and those who dispute the predictive power of mouse models in general). Why, then, has the NIH decided to pump the brakes on this line of research?

Like many things, the answer lies in the timing. The temporarily-stalled research involves injecting human pluripotent cells—undifferentiated cells that can develop into any number of different cell types—not into mature animals, but instead into animal embryos. Unlike the tumor-in-a-mouse research mentioned above, this kind of experiment is specifically trying to get normal human cells to develop as an animal matures and remain, well, normal human cells. One idea is that someday we could grow an organ (liver, pancreas, etc.) in an animal, such as a pig, that is still a human organ. This would lower the barrier for successful transplantation, meaning that somebody in serious need of a new liver could receive one from livestock instead of waiting for a human donor from a transplant list. Another thought is that chimeric animals will better model human physiology, making subsequent clinical trials more accurate.

If you read the last paragraph and felt a bit uneasy, you’re not alone. For some, this type of research crosses the invisible line that separates humans from animals, and is therefore unacceptable. Others find this research troubling from an animal welfare standpoint, and still others worry about unanticipated differentiation (e.g. “we wanted a liver, but we found some human cells in the pig’s nerves, too”) or unethical uses for this type of technology.

The NIH hears these concerns, and wants to talk about them before giving scientists the go-ahead to use public funds on this type of research. Some researchers have reacted negatively to this, fearing broader restrictions in the future, but I think this is an important part of the scientific process. We live (and for scientists, work) in an era of unprecedented ability to modify genomes and cell lineages, and human/animal chimeras are just one example of a type of research destined for more attention and oversight. It is important to get the guidelines right.

The NIH will convene a panel of scientists and bioethicists to discuss human/animal chimera research on November 6th, so keep an eye out for possible policy revisions after then. Given the promise of this type of research and the potential concerns over its use, this surely is only the beginning of the deliberative process.

UPDATE (11/05/2015): Scientists from Stanford University have posted an open letter in Science calling for a repeal of the current restrictions in this field. The full letter, found here, argues that there is little scientific justification for the NIH's stated concerns. Over at Gizmodo, the NIH has responded by claiming that the true purpose of the stop order and review is to "stay ahead" of current research and anticipate future work. This is consistent with the NIH's views as articulated on the Under the Poliscope blog. All things considered, the workshop tomorrow, and any guidelines resulting from it, should be very interesting for people who wish to develop and use these tools.

Kickoff Meeting!

Hey everyone,

Please join us for our kickoff meeting this Thursday, September 17th, in BRB 1412 at 6 PM. We will be previewing the year ahead, and VP Chris Yarosh will talk about advocacy and his experience with ASBMB Hill Day. Refreshments included!

Penn Science Spotlight: Learning how T cells manage the custom RNA business

by Chris Yarosh

This Science Spotlight focuses on the research I do here at Penn, the results of which are now in press at Nucleic Acids Research [1]. You can read the actual manuscript right now, if you would like, because NAR is “open access,” meaning all articles published there are available to anyone for free. We’ve talked about open access on this blog before, if you’re curious about how that works.

First, a note about this type of science. The experiments done for this paper fall into the category of “basic research,” which means they were not designed to achieve an immediate practical end. That type of work is known as “applied” research. Basic research, on the other hand, is curiosity-driven science that aims to increase our understanding of something. That something could be cells, supernovas, factors influencing subjective well-being in adolescence, or anything else, really. This isn’t to say that basic research doesn’t lead to advances that impact people’s lives; quite the opposite is true. In fact, no applied work is possible without foundational basic work being done first. Rather, the real difference between the two categories is timeline and focus: applied research looks to achieve a defined practical goal (such as creating a new Ebola vaccine) as soon as possible, while basic research seeks to add to human knowledge over time. If you’re an American, your tax dollars support basic research (thanks!), often through grants from the National Institutes of Health (NIH) or the National Science Foundation (NSF). This work, for example, was funded in part by two grants from the NIH: one to my PhD mentor, Dr. Kristen Lynch (R01 GM067719), and the second to me (F31 AG047022). More info on science funding can be found here.

Now that you've gotten your basic research primer, let's talk science. This paper is primarily focused on how T cells (immune system cells) control a process called alternative splicing to make custom-ordered proteins. While most people have heard of DNA, the molecule that contains your genes, not everyone is as familiar with RNA or proteins. I like to think of it this way: DNA is similar to the master blueprint for a building, specifying all of the necessary components needed for construction. This blueprint ultimately codes for proteins, the molecules in a cell that actually perform life’s work. RNA, which is “transcribed” from DNA and “translated” into protein, is a version of the master blueprint that can be edited as needed for different situations. Certain parts of RNA can be mixed and matched to generate custom orders of the same protein, just as you might change a building’s design based on location, regulations, etc. This mixing and matching process is called alternative splicing (AS), and though it sounds somewhat science-fictiony, AS naturally occurs across the range of human cell types.
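To make the mixing-and-matching concrete, here is a toy Python sketch that enumerates the isoforms available from a gene with two optional (“cassette”) exons. The exon names are invented, and real splicing is tightly regulated rather than a free combinatorial choice.

```python
# Toy illustration of alternative splicing: one pre-mRNA, several isoforms.
# Exon names are invented; real splicing is regulated, not free combination.
from itertools import combinations

cassette = ["E2", "E3"]  # optional ("alternative") exons between E1 and E4

isoforms = []
for k in range(len(cassette) + 1):
    for chosen in combinations(cassette, k):
        # Exons are kept in genomic order: E1, (E2), (E3), E4
        isoforms.append("-".join(["E1", *chosen, "E4"]))

print(isoforms)  # ['E1-E4', 'E1-E2-E4', 'E1-E3-E4', 'E1-E2-E3-E4']
```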



While we know AS happens, scientists haven’t yet unraveled the different strategies cells use to control it. Part of the reason for this is the sheer number of proteins involved in AS (hundreds), and part of it is a lack of understanding of the nuts and bolts of the proteins that do the managing. This paper focuses on the nuts and bolts stuff. Previous work [2] done in our lab has shown that a protein known as PSF manipulates AS to produce an alternate version of a different protein, CD45, critical for T cell response to antigens (bits of bacteria or viruses). PSF doesn’t do this, however, when a third protein, TRAP150, binds it, although we previously didn’t know why. This prompted us to ask two major questions: How do PSF and TRAP150 link up with one another, and how does TRAP150 change PSF’s function?

My research, as detailed in this NAR paper, answers these questions using the tools of biochemistry and molecular biology. In short, we found that TRAP150 actually prevents PSF from doing its job by binding in the same place RNA does. This makes intuitive sense: PSF can’t influence splicing of targets it can’t actually make contact with, and it can’t contact them if TRAP150 is gumming up the works. To reach this conclusion, we diced PSF and TRAP150 up into smaller pieces to see which parts fit together, and we also looked for which part of PSF binds RNA. These experiments helped us pinpoint all of the action in one region of PSF known as the RNA recognition motifs (RRMs), specifically RRM2. Finally, we wanted to know if PSF and TRAP150 regulate other RNA molecules in T cells, so we did a screen (the specific technique is called “RASL-Seq,” but that’s not critical to understanding the outcome) and found almost 40 other RNA molecules that appear to be controlled by this duo. In summary, we now know how TRAP150 acts to change PSF’s activity, and we have shown this interaction to be critical for regulating a bunch of RNAs in T cells.

So what are the implications of this research? For one, we now know that PSF and TRAP150 regulate the splicing of a range of RNAs in T cells, something noteworthy for researchers interested in AS or how T cells work. Second, we describe a mechanism for regulating proteins that might be applicable to some of those other hundreds of proteins responsible for regulating AS, too. Finally, PSF does a lot more than just manage AS in the cell. It actually seems to have a role in almost every step of the DNA-RNA-protein pathway. By isolating the part of PSF targeted by TRAP150, we can hypothesize about what PSF might do when TRAP150 binds it based on what other sections of the protein remain “uncovered.” It will take more experiments to figure it all out, but our data provide good clues for researchers who want to know more about all the things PSF does.

A map of the PSF protein. Figure adapted from Yarosh et al., WIREs RNA 2015, 6:351-367. doi: 10.1002/wrna.1280
Papers cited:
1. Yarosh CA, Tapescu I, Thompson MG, Qiu J, Mallory MJ, Fu XD, Lynch KW. TRAP150 interacts with the RNA-binding domain of PSF and antagonizes splicing of numerous PSF-target genes in T cells. Nucleic Acids Research 2015; doi: 10.1093/nar/gkv816

2. Heyd F, Lynch KW. Phosphorylation-dependent regulation of PSF by GSK3 controls CD45 alternative splicing. Mol Cell 2010;40:126-137.

Training the biomedical workforce - a discussion of postdoc inflation


By Ian McLaughlin


Earlier this month, postdocs and graduate students from several fields met to candidly discuss the challenges postdocs encounter while pursuing careers in academic research.  The meeting began with an enumeration of those challenges and the elements contributing to them: the scarcity of faculty positions and the ballooning number of rising postdocs, funding mechanisms and cuts, the sub-optimal relationship between publications and the quality of science, and the inaccurate conception of what exactly a postdoctoral position should entail.


(Figure from [15])

At a fundamental level, there’s a surplus of rising doctoral students whose progression outpaces the availability of faculty positions at institutions capable of hosting the research they intend to perform [10,15].  While 65% of PhDs attain postdocs, only 15-20% of postdocs attain tenure-track faculty positions [1].  This translates to significant extensions of postdoctoral positions, with the intention of bolstering credentials and generating more publications to increase appeal to hiring institutions.  Despite this increased time, postdocs often do not benefit from continued teaching experience, and are also unable to attend classes to cultivate professional development.
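Chaining those two figures together makes the bottleneck stark. A quick back-of-the-envelope calculation (simplistic by design, since the real pipeline has many more branches and timescales):

```python
# Rough pipeline arithmetic from the figures cited above.
phd_to_postdoc = 0.65                   # 65% of PhDs take a postdoc
postdoc_to_faculty = (0.15 + 0.20) / 2  # 15-20% of postdocs; use the midpoint

share = phd_to_postdoc * postdoc_to_faculty
print(f"~{share:.0%} of PhDs reach tenure-track faculty via a postdoc")
# -> ~11% under these simplistic assumptions
```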


(Figure from [10])
Additionally, there may never be an adequate position available. Instead of providing the training and mentorship necessary to generate exceptional scientists, postdoctoral positions have become “holding tanks” for many PhD holders unable to transition into permanent positions [5,11], resulting in considerably lower compensation relative to alternative careers 5 years after attaining a PhD.

(Figure from [13])

Perhaps this wouldn’t be quite so problematic if the compensation of the primary workhorse of basic biomedical research in the US were better.  In 2014, the US National Academies called for an increase in the starting postdoc salary from $42,840 to $50,000, as well as a 5-year limit on the length of postdocs [1].  While the salary increase would certainly help, institutions like NYU, the University of California system, and UNC Chapel Hill have explored term limits.  Unfortunately, a frequent outcome of term limits was the promotion of postdocs to superficial positions that simply confer a new title but are effectively extended postdocs.

Given the time commitment required to attain a PhD, and the expanding durations of postdocs, several of the meeting’s attendees identified a particularly painful interference with their ability to start a family.  Despite excelling in challenging academic fields at top institutions, and dedicating professionally productive years to their work, several postdocs stated that they don’t foresee the financial capacity to start a family before fertility challenges render the effort prohibitively difficult.

However, administrators of the NIH have suggested this apparent disparity between the number of rising postdocs and available positions is not a significant problem, despite having no data to back up their position. As Polka et al. wrote earlier this year, NIH administrators don’t have data quantifying the total number of postdocs in the country at their disposal, calling into question whether they are prepared to address this fundamental problem [5].

A possible approach to mitigate this lack of opportunity would be to integrate permanent “superdoc” positions for talented postdocs who don’t have ambitions to start their own labs but have the technical skills needed to advance basic research.  The National Cancer Institute (NCI) has proposed a grant program to cover salaries of $75,000-$100,000 for 50-60 such positions [1,2], which might be expanded to cover the salaries of more scientists.  Additionally, a majority of the postdocs attending the meeting voiced their desire for more comprehensive career guidance.  In particular, while they are aware that PhD holders are viable candidates for jobs outside of academia, the career trajectory out of academia remains opaque to them.

This situation stands in stark contrast to the misconception that the US suffers from a shortage of STEM graduates.  While the careers of postdocs stall due to a scarcity of faculty positions, the President’s Council of Advisors on Science and Technology announced a goal of one million STEM trainees in 2012 [3], despite the fact that only 11% of students graduating with bachelor’s degrees in science end up in fields related to science [4], due in part, perhaps, to an inflated sense of job security.  While the numbers of grad students and postdocs have increased almost two-fold, the proliferation of permanent research positions hasn’t been commensurate [5]. So, while making science a priority is certainly prudent, the point of tension is not necessarily a shortage of students engaging the fields, but rather a paucity of research positions available to them once they’ve attained graduate degrees.

Suggested Solutions

Ultimately, if the career prospects for academic researchers in the US don't change, increasing numbers of PhD students will leave basic science research in favor of alternatives that offer better compensation and career trajectories, or leave the country for international opportunities.  At the heart of the problem are a fundamental imbalance between the funding available for basic academic research and the growing community of scientists in the U.S. [9,14], and a dysfunctional career pipeline in biomedical research [9].  Strategies suggested to confront this problem included the following.

Federal grant-awarding agencies need to collect accurate data on the yearly numbers of postdoctoral positions available.  This way, career counselors, potential students, rising PhD students, and the institutions themselves will have a better grasp of the apparent scarcity of academic research opportunities.

As the US National Academies have suggested, the postdoc salary ought to be increased.  One possible strategy would be to increase the prevalence of “superdoc”-type positions, creating a viable career alternative for talented researchers who wish to support a family but not to secure the funding needed to open their own labs.  Additionally, if institutions at which postdocs receive federal funding were to consider them employees with all associated benefits, rather than trainees, rising scientists might better avoid career stagnation and an inability to support families [11].

Because the number of rising PhDs currently outpaces the availability of permanent faculty positions, one strategy may be to limit the number of PhD slots at each institution, preventing continued growth in the number of postdocs who have no viable faculty positions to apply for. One attendee noted that this could immediately halt the growth of a cohort of PhDs with bleak career prospects.

Several attendees brought up the problems many postdocs encounter in particularly large labs, which tend to receive disproportionately high grant funding. Postdocs in such labs feel pressure to generate useful data so they can compete with their peers, while neglecting other elements of their professional development and personal lives. Moreover, the current system funnels funding to labs that can all but guarantee positive results, favoring conservative rather than potentially paradigm-shifting proposals and translating to reduced funding for new investigators [9]. Grant-awarding agencies might weigh lab size when evaluating proposals, with the goal of fostering progress in smaller labs. Additionally, preprint platforms like Cold Spring Harbor Laboratory’s bioRxiv might be more widely used to publicize research projects early, so that postdocs are aware of their peers’ efforts and can focus on innovation where appropriate.

While increased funding for basic science research would help to avoid the loss of talented scientists, and private sources may help to compensate for fickle federal funds [6], some attendees suggested restructuring the mechanisms by which facilities and administrative costs are funded. These costs, also called “indirect costs,” cover the expenditures associated with running research facilities rather than specific projects; restructuring them might prevent more than 50 cents of every federally allocated dollar from going to the institution itself rather than to the researchers on the projects that grants fund [7,8]. This dynamic has been said to foster the growth of institutions rather than investment in researchers, and optimizing this component of research funding might reveal opportunities to better support the careers of rising scientists [9,12].

Additionally, if federal funding were more predictable, dramatic fluctuations in the numbers of faculty positions and rising scientists might not produce such disparities [9]. For example, if appropriations legislation consistently adhered to five-year funding plans, the biomedical research enterprise might avoid unexpected shortfalls in opportunities.


(Figure from Polka et al. [5])

Career counselors ought to give their students an accurate picture of how competitive the search for permanent faculty positions can be, so that students don't enter the field with a false sense of security. Quotes from a survey conducted by Polka et al. reveal a substantial gap between expectations and outcomes in academic careers; adequate guidance might help students avoid such disappointment.

As shown in the NSF’s Indicators report from 2014 [10], the most rapidly growing reason postdocs give for taking their positions is “other employment not available,” suggesting that a PhD in the biomedical sciences currently translates to limited opportunities. Even successful scientists and talented postdocs have become progressively more pessimistic about their career prospects. Accordingly, while there are several possible approaches to this problem, if some remedial action isn’t taken, biomedical research in the U.S. may stagnate and suffer in the coming years.

Citations
 1.    Alberts B, Kirschner MW, Tilghman S, Varmus H. Rescuing US biomedical research from its systemic flaws. Proc Natl Acad Sci U S A. 2014 Apr 22;111(16):5773-7. doi: 10.1073/pnas.1404402111. Epub 2014 Apr 14.
 2.    http://news.sciencemag.org/biology/2015/03/cancer-institute-plans-new-award-staff-scientists
 3.    https://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-engage-to-excel-final_2-25-12.pdf
 4.    http://www.nationalreview.com/article/378334/what-stem-shortage-steven-camarota
 5.    Polka JK, Krukenberg KA, McDowell GS. A call for transparency in tracking student and postdoc career outcomes. Mol Biol Cell. 2015 Apr 15;26(8):1413-5. doi: 10.1091/mbc.E14-10-1432.
 6.    http://sciencephilanthropyalliance.org/about.html
 7.    http://datahound.scientopia.org/2014/05/10/indirect-cost-rate-survey/
 8.    Ledford H. Indirect costs: keeping the lights on. Nature. 2014 Nov 20;515(7527):326-9. doi: 10.1038/515326a. Erratum in: Nature. 2015 Jan 8;517(7533):131
 9.    Alberts B, Kirschner MW, Tilghman S, Varmus H. Rescuing US biomedical research from its systemic flaws. Proc Natl Acad Sci U S A. 2014 Apr 22;111(16):5773-7. doi: 10.1073/pnas.1404402111. Epub 2014 Apr 14.
 10.    National Science Foundation (2014) National Science and Engineering Indicators (National Science Foundation, Washington, DC).
 11.    Bourne HR. A fair deal for PhD students and postdocs. Elife. 2013 Oct 1;2:e01139. doi: 10.7554/eLife.01139.
 12.    Bourne HR. The writing on the wall. Elife. 2013 Mar 26;2:e00642. doi: 10.7554/eLife.00642.
 13.    Powell K. The future of the postdoc. Nature. 2015 Apr 9;520(7546):144-7. doi: 10.1038/520144a.
 14.    Fix the PhD. Nature. 2011 Apr 21;472(7343):259-60. doi: 10.1038/472259b.
 15.   Schillebeeckx M, Maricque B, Lewis C. The missing piece to changing the university culture. Nat Biotechnol. 2013 Oct;31(10):938-41. doi: 10.1038/nbt.2706.

AAAS Forum Take #2

Another point of view on the AAAS Forum, by Matthew Facciani:

I have provided scientific testimony and met with some of my local legislators, but I had never had any formal exposure to science policy. I was really excited to attend the AAAS Science & Technology Policy Forum to learn more about how scientists can impact policy. The information I absorbed at the conference was overwhelming, but incredibly stimulating. Some of the lectures discussed budget cuts and the discouraging barriers to shaping science policy. Even so, there was a definite atmosphere of optimism at the conference, focused on how we can create positive change.

One of my favorite aspects of the conference was the discussion of how to effectively communicate science to non-scientists. Before we can even have discussions about funding, the general public needs to understand how science works and why basic science is so important. For example, science never proves anything with 100% certainty, but it can sound weak when politicians hear only that science “suggests” rather than “proves.” One creative way to circumvent this problem is to use comparisons. Instead of saying “science suggests GMOs are safe,” we could say “scientists are as sure that GMOs are safe as they are that smoking is bad for your health.” The conference was rife with these kinds of effective tactics, and I left with a sense of confidence that we can collectively make a difference in science policy.


Matthew Facciani is a sociology PhD student at The University of South Carolina. He is also a gender equality activist and science communicator. Learn more at www.matthewfacciani.com, and follow him at @MatthewFacciani.

2015 AAAS Science and Technology Policy Forum Summary


I recently had the opportunity to attend the 2015 AAAS Science and Technology Policy Forum in Washington, D.C. This annual meeting brings together a range of academics and professionals to discuss the broad S&T policy landscape. Below are some of my takeaways from the meeting. I hope to have additional comments from other National Science Policy Group members up soon.

By Chris Yarosh

The talks and panels at the Forum covered a huge range of topics, from the federal budget and the appropriations outlook to manufacturing policy and, of course, shrimp treadmills. My reading of the themes tying this gamut together is just that—my opinion—and should be taken as such. That said, the threads I picked up on in many of the talks can be summarized by three C’s: cooperation, communication, and citizenship.

First up, cooperation. Although sequestration’s most jarring impacts have faded, AAAS’s budget guru Matthew Hourihan warns that fiscal year 2016 could see a return of…let’s call it enhanced frugality, with cuts falling disproportionately on social science, clean energy, and geoscience programs. With the possibility of more cuts to come, many speakers suggested that increased cooperation could maximize value: between science agencies and private organizations, as mentioned by White House Office of Science and Technology Policy Director John Holdren, and between federal agencies and state and local governments, as highlighted by NSF Director France Córdova. Cooperation across directorates and agencies will also be a major focus of big interdisciplinary science and of efforts to improve STEM education. Whatever the form, the name of the game will be recognizing fiscal limitations and fostering cooperation to make the most of what is available.

The next “C” is communication. Dr. Córdova listed communication among the top challenges facing the NSF, and talks given by Drs. Patricia Brennan (of duck penis fame) and David Scholnick (the aforementioned shrimp) reinforced the scale of this challenge. As these two researchers reminded us, information on the Web and in the media can easily be taken out of context for political or other purposes in the absence of the correct scientific framing. To combat this, many speakers made it clear that basic science researchers must engage a wider audience, including elected officials, or risk their research being misconstrued, distorted, or deemed unnecessary. As Dr. Brennan said, it is important to remind the public that while not every basic research project develops into something applied, “every application derives from basic science.”

The last “C” is citizenship. Several speakers discussed the culture of science and the interconnections between scientists and non-scientists; collectively, their presentations described what I’ll call good science citizenship. For one, good science citizenship means that scientists will increasingly need to recognize our role in the wider innovation ecosystem if major new programs are ever going to move forward. For example, a panel on new initiatives in biomedical research focused on 21st Century Cures and President Obama’s Precision Medicine Initiative. Both of these proposals are massive undertakings: the former will involve the NIH and FDA collaborating to speed the development and introduction of new drugs to the market, while the latter will require buy-in from a spectrum of stakeholders including funders, patient groups, bioethicists, and civil liberty organizations. Scientists are critical to these endeavors, obviously, but we will need to work seamlessly across disciplines and with other stakeholders to ensure the data collected from these programs are interpreted and applied responsibly.

Good science citizenship will also require critical evaluation of the scientific enterprise and the separation of the scientific process from scientific values, a duality discussed during the William D. Carey lecture given by Dr. William Press. This means that scientists must actively protect the integrity of the research enterprise by supporting all branches of science, including the social sciences (a topic highlighted throughout the event), and by rigorously weeding out misconduct and fraud. Scientists must also do a better job of making our rationalist approach work with different value systems, recognizing that people will need to come together to address major challenges like climate change. Part of this will be better communication with the public, but part will also be learning how different value systems influence judgment of complicated scientific issues (the subject of another great panel, on Public Opinion and Policy Making). Good science citizenship, cultivated through professionalism and respectful engagement of non-scientists, will ultimately be critical to maintaining broad support for science in the U.S.

Asking for a Small Piece of the Nation’s Pie

By Rosalind Mott, PhD


This article was originally published in the Penn Biomed Postdoctoral Council Newsletter (Spring 2015).

Historically, the NIH has enjoyed straightforward bipartisan support; in particular, the doubling of the NIH budget from FY98 to FY03 led to rapid growth in university-based research. Unfortunately, ever since 2003, inflation has been slowly eating away at that doubling (Figure 1), with little recovery beyond the brief boost provided in 2009 by the American Recovery and Reinvestment Act (ARRA). Making matters worse, Congress now has an abysmal record of moving policy forward as partisan fighting dominates the Hill.

Fig 1: The slow erosion of the NIH budget over the past decade
(figure adapted from: http://fas.org/sgp/crs/misc/R43341.pdf)
Currently, support directed to the NIH is a mere 0.79% of total federal spending. The bulk of this funding goes directly to extramural research, providing salaries for over 300,000 scientists across 2,500 universities. Because the majority of biomedical researchers rely on government funding, it behooves these unique constituents to rally for sustainable support from Congress. Along with other scientists across the country who are becoming more politically involved, the Penn Science Policy Group arranged a Congressional Visit Day (CVD) in which a small group of postdoctoral researchers and graduate students visited Capitol Hill on March 18th to remind the House and Senate that scientific research is a cornerstone of the US economy and to alert them to the impact of this funding erosion on young researchers.
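As a quick consistency check (my own arithmetic; the federal outlay total is an assumption, not a figure from this article), a 0.79% share lines up with the NIH budget of roughly $30 billion discussed below:

    # Back-of-the-envelope consistency check; the outlay total is an assumption
    total_outlays = 3.7e12   # approx. FY2015 federal spending (assumed)
    nih_share = 0.0079       # 0.79%, as stated above
    nih_budget = total_outlays * nih_share
    print(f"Implied NIH budget: ${nih_budget / 1e9:.1f}B")
    # -> about $29B, in line with the $31-32B funding target mentioned below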

Led by postdocs Shaun O’Brien and Caleph Wilson, the group partnered with the National Science Policy Group (NSPG), a coalition of young scientists across the nation, to make over 60 visits to Congressional staff. NSPG leaders from other parts of the country, Alison Leaf (UCSF) and Sam Brinton (Third Way, Washington, DC), arranged a productive experience in which newcomers to the Hill were trained for their meetings. The Science Coalition (TSC) provided advice on how to communicate effectively with politicians: keep the message clear and simple, provide evidence of how science positively impacts society and the economy, and tell personal stories of how budget cuts are affecting your research. TSC also pointed out that face-to-face meetings with Congress are the most effective way to communicate our needs as scientists. With President Obama’s FY16 budget request announced in February and the House and Senate in the midst of appropriations season, there was no better time to remind them of just how important the funding mechanism is.

Meetings with the offices of Pennsylvania senators Pat Toomey and Bob Casey and representatives Glenn Thompson and Chaka Fattah were key goals, but the group also reached out to the states where its young scientists were born and raised, everywhere from Delaware to California. Each meeting was fifteen to twenty minutes of rapid discussion of the importance of federally funded basic research. At the end of the day, bipartisan support for the NIH clearly exists at the government’s core, but the hotly debated question of how to fund the system has stalled its growth.
Shaun O’Brien recounts a disappointing experience making a simple request of Senator Toomey’s office. Sen. Toomey has slowly shifted his stance to be more supportive of the NIH, so meeting with his office was an important step in reaching Republicans:

We mentioned the "Dear Colleague" letter by Sen. Bob Casey (D-PA) and Sen. Richard Burr (R-NC) that asks budget appropriators to "give strong financial support for the NIH in the FY2016 budget". Sen. Toomey didn't sign on last year, in part because that letter asked for an increase in NIH funding to $31-32 billion and would have violated the sequester caps, which Sen. Toomey paints as a necessary evil to keep Washington spending in check. I asked the staffer for his thoughts on this year's letter, especially as it names no specific dollar figure and Sen. Toomey has stated his support for basic science research. The staffer said he would pass the letter along to Sen. Toomey and let him know about it.

Unfortunately, three weeks later, Sen. Toomey missed an opportunity to show his "newfound" support for science research as he declined to sign a letter that essentially supports the mission of the NIH.  I plan to call his office and see if I can get an explanation for why he failed to support this letter, especially as I thought it wouldn't have any political liability for him to sign.

Working with Congressman Chaka Fattah balanced the disappointment from Toomey’s office with a spark of optimism. Rep. Fattah, a strong science supporter and member of the House Appropriations Committee, encourages scientists to use Twitter (tweet @chakafattah) to keep him posted on recent success stories and breakthroughs; these bits of information are useful tools when arguing the importance of basic research to other politicians.

Keeping those lines of communication strong is the most valuable role we can play away from the lab. Walking through the Russell Senate Office Building, we caught a glimpse of John McCain waiting for an elevator, which made the day feel surreal, far removed from the normalcy of another day at the bench. The reality, though, is that our future as productive scientists depends heavily on public opinion and, in turn, government support. Outreach to the public and to politicians is a duty shared by all scientists, whether through trips to the Hill or simple dinner conversations with our non-scientist friends.


Participants represented their professional societies and/or the National Science Policy Group, independent of their university affiliations. Support for the training and experience was provided by the American Academy of Arts & Sciences (Cambridge, MA) and the American Association for the Advancement of Science (AAAS, Washington, DC).

Dr. Sarah Cavanaugh discusses biomedical research in her talk, "Homo sapiens: the ideal animal model"

Biology and preclinical medicine rely heavily upon research in animal models such as rodents, dogs, and chimps. But how translatable are the findings from these animal models to humans? And what alternative systems are being developed to provide more applicable results while reducing the number of research animals?
Image courtesy of PCRM


Last Thursday, PSPG invited Dr. Sarah Cavanaugh from the Physicians Committee for Responsible Medicine to discuss these issues. In her talk, entitled “Homo sapiens: the ideal animal model,” she emphasized that we are not particularly good at translating results from animal models to human patients. FDA data indicate that 90% of drugs that perform well in animal studies fail when tested in clinical trials. It may seem obvious, but it is important to point out that the biology of mice is not identical to that of humans. Scientific publications have demonstrated important dissimilarities in the pathology of inflammation, diabetes, cancer, Alzheimer’s, and heart disease.

All scientists understand that model systems have limitations, yet such systems have played an integral role in shaping our understanding of biology. But is it possible to avoid experimental models entirely and study human biology directly?

The ethics of studying biology in people differ from those of studying biology in animals. The “do no harm” code of medical ethics dictates that we cannot perform experiments that have no conceivable benefit for the patient, so unnecessarily invasive procedures cannot be undertaken just to obtain data. This limitation restricts how much we can learn about human biology relative to animal biology. Nevertheless, medical researchers do uncover important findings from human populations. Dr. Cavanaugh pointed out that studies of risk factors (both genetic and environmental) and biomarkers are important for understanding diseases, and that non-invasive brain imaging has increased our understanding of neurodegenerative diseases like Alzheimer’s.

Yet these are all correlative measures: they show that factor X correlates with a higher risk of a certain disease. To develop effective therapies, we need to understand cause-and-effect relationships, in other words, the mechanism. To uncover mechanisms, researchers need to perturb the system and measure physiological changes or observe how a disease progresses. Performing such studies in humans is often impractical, impossible, or unethical. For that reason, researchers turn to model systems, where experimental variables can be properly controlled. We have learned a great deal about biology from animal models, but moving forward, can we develop models that better reflect human biology and pathology?

Using human post-mortem samples and stem cell lines is one way to avoid species differences between animals and humans, but studying isolated cells in culture does not capture the complex, systems-level biology of a living organism. To tackle this problem, researchers have started designing ways to model 3D human organs in vitro, such as brain-on-a-chip systems. Researchers have also envisioned chips that model a functioning body using ten interconnected tissues representing organs such as the heart, lungs, skin, kidneys, and liver.
Image from: http://nanoscience.ucf.edu/hickman/bodyonachip.php

Dr. Cavanaugh explained that toxicology is currently a field where chip-based screening shows promise. It makes sense that organs-on-a-chip technology could be useful for screening drug compounds before testing in animals. Chip-screening could filter out many molecules with toxic effects, thus reducing the number of compounds that are tested in animals before being investigated clinically.

A major counterpoint raised during the discussion was whether replacing animal models with human organs on a chip would simply swap one imperfect, contrived model for another. Every model has limitations, so short of testing therapeutics directly in humans, we are unlikely to create a system that perfectly reflects the biological response in patients. The question then becomes: which models are more accurate? While ample data show the limitations of animal models, very little data show that animal-free alternatives perform better than existing animal models. Dr. Cavanaugh argues, however, that this is an opportunity to develop such models instead of continuing to pursue research in flawed animal models: “I don’t advocate that we end all animal research right now, rather that we invest in finding alternatives to replace the use of animals with technologies that are more relevant to human biology.”

This topic can ignite a passionate debate within the medical research community. Animal models are the status quo in research, and they are the gatekeepers in bench-to-bedside translation of scientific discoveries into therapeutics. In the absence of any shift in ethical standards for research, replacing animal models with alternatives will require mountains of strong data demonstrating better predictive performance. The incentives exist, though. Drug companies spend roughly $2.6 billion to gain market approval for a new prescription drug. Taking a drug into human trials and watching it fail is a huge waste of money. If researchers could develop new models for testing drugs that were more reliable than animal models at predicting efficacy in humans, it’s safe to say that Big Pharma would be interested. Very interested.


-Mike Allegrezza

"Wistar rat" by Janet Stephens via Wikimedia Commons 

Publish or Perish: an old system adapting to the digital era

    By Annie Chen and Michael Allegrezza
      
When modern scientific publishing developed in the 19th century, it was designed to overcome barriers that prevented scientists from disseminating their research findings efficiently. It was not feasible for scientists to arrange typesetting, peer review, printing, and shipping of their results to every researcher in their field. As payment for these services, researchers transferred exclusive copyright in the material to the publisher, who would then charge subscribers access fees. To limit the printing costs of this system, journals published only the articles with the most significant findings. Now, nearly 200 years later, we have computers, word processors, and the Internet. Information sharing has become easier than ever before, and it is nearly instantaneous. But the prevailing model of subscription-based publishing remains tethered to its pre-digital origins, and for the most part publishers have used the Internet within this model rather than as a tool to create a new and better system for sharing research.

Figure 1. Trend lines show an annual increase of 6.7% for serials expenditures vs. 2.9% for the Consumer Price Index over the period 1986-2010, relative to 1986 prices.
In theory, digitization should have decreased the costs of communicating science: authors can perform many of the typesetting functions, articles can be uploaded online instead of printed and shipped, and so on. In practice, however, journal prices have kept climbing. Statistics from the Association of Research Libraries show that spending on serials increased 6.7% per year between 1986 and 2010, while inflation as measured by the US Consumer Price Index rose only 2.9% per year over the same period (Figure 1).1 Shawn Martin, a Penn Scholarly Communication Librarian, explained that “Penn pays at least twice for one article, but can pay up to 7 or more times for the same content,” in the process of hiring the researchers who create the content, buying subscriptions from journals, and paying for reuse rights. To be fair, the transition from print to digital media has been costly for publishers, who have had to invest in infrastructure for digital availability while still producing print journals. Many publishers also argue that while journal prices may have increased, the price per reader has actually decreased because far more readers now access articles online.
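To see what those growth rates imply, here is a quick compound-interest calculation (my own arithmetic, using only the rates reported above):

    # Compound growth of serials prices vs. general inflation, 1986-2010
    years = 2010 - 1986          # 24 years
    serials = 1.067 ** years     # serials expenditures, relative to 1986
    cpi = 1.029 ** years         # Consumer Price Index, relative to 1986
    print(f"Serials: {serials:.1f}x 1986 prices; CPI: {cpi:.1f}x")
    # -> roughly 4.7x vs. 2.0x: serials outpaced inflation by a factor of ~2.4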

Regardless of whether the price increases were justified, a new model for academic publishing emerged in opposition during the 1990s: open access (OA). There are two routes to open access: Gold OA, in which the publisher makes the article freely accessible, and Green OA, in which the author self-archives it. A few years ago, Laakso et al. conducted a quantitative analysis of the annual publication volumes of direct OA journals from 1993 to 2009 and found that the development of open access could be described in three phases: Pioneering (1993-1999), Innovation (2000-2004), and Consolidation (2005-2009).2 During the pioneering years, year-to-year growth in open access articles and journals was high, but the total numbers were still relatively small. OA publishing then bloomed considerably, growing from 740 journals publishing 19,500 articles in 2000 to 4,769 journals publishing 191,850 articles in 2009. During the innovation years, new business models emerged: BioMed Central, later purchased by Springer in 2008, introduced the author charge, and in 2004 some subscription-based journals began using a hybrid model, such as Springer’s Open Choice program, which gives authors the option of paying a fee to make their article openly available. During the consolidation phase, year-to-year growth in articles slowed from previous years but remained high, at about 20%.

The introduction of open access journals has sparked fierce and passionate debate among scientists. Proponents of open access believe scientific research should be available to everyone, anywhere in the world. Currently, subscription fees prevent many people from accessing the information they need. With open access, students and professors in low- and middle-income countries, health care professionals in resource-limited settings, and the general public would gain access to essential resources. For instance, Elizabeth Lowenthal, MD, of the Penn Center for AIDS Research, recently published a paper in PLOS ONE analyzing variables that influence adherence to antiretroviral drugs in HIV-positive adolescents living in Botswana. She chose to publish open access because “the article will be of most direct use to clinicians working in Botswana and I wanted to make sure that it would be easy for them to access it.” Open access also provides re-use rights and may facilitate a more rapid exchange of ideas and increased interaction among scientists.

However, there may also be downsides to increased access. Open access may increase the number of articles that people have to sift through to find important studies.3 Furthermore, readers who do not know how to critically evaluate scientific papers may be misled by articles with falsified data or flawed experiments. While such papers often get retracted later, they can undermine the public’s confidence in scientists and medicine; Wakefield’s (retracted) article linking vaccines to autism, for example, may have contributed to the rise of the anti-vaccine movement in the US.4 In addition, many open access journals charge authors a publication fee to offset costs, and some operators have exploited this payment system for profit through predatory journals (a list of predatory OA publishers can be found here: http://scholarlyoa.com/publishers/). Still, the expansion of open access from 1993 to the present suggests that it can be a sustainable alternative to the traditional model of subscription-based academic publishing.

In addition to facilitating access to scientific articles, the Internet has created opportunities to improve the peer review process. Peer review was designed to evaluate the technical merit of a paper and to select papers that make significant contributions to a field. Scientists supporting the traditional model of publishing argue that the peer review process at some open access journals may not be as rigorous, which could lead to a “Wild West” in academic publishing. Last year, reporter John Bohannon of Science magazine sent a flawed paper to 304 open access journals; of the 255 journals that responded, 157 accepted the paper, suggesting little or no peer review at those journals.5 However, even high-impact journals publish papers with flawed experiments.6 Michael Eisen, co-founder of PLOS, wrote, “While they pocket our billions, with elegant sleight of hand, they get us to ignore the fact that crappy papers routinely get into high-profile journals simply because they deal with sexy topics…. Every time they publish because it is sexy, and not because it is right, science is distorted. It distorts research. It distorts funding. And it often distorts public policy.”7 Nature, for example, published two articles last year about acid-bath stem cell induction that were later retracted due to data manipulation. According to Randy Schekman, editor-in-chief of eLife, “these papers will generate thousands of citations for Nature, so they will profit from those papers even if they are retracted.”8

With digital communication, peer review could shift from a rigid gate controlled by three or four people, who might not even be active scientists, into a more dynamic, transparent, and ongoing process with feedback from thousands of scientists. Platforms with these capabilities already exist, including ResearchGate9 and PubMed Commons.10 Some open access journals are using different strategies to address these issues. eLife, for example, employs a fast, streamlined peer review process to decrease the time from submission to publication while maintaining high-quality science. PLOS ONE, one of the journals published by the Public Library of Science, judges articles on technical merit alone, not on novelty.

We polled a few scientists at Penn who had recently published for their thoughts on open access and peer review. Most saw no difference in the peer review process at open access journals compared with subscription journals. The exception was eLife, where reviewers’ comments were prompt and the communication between reviewers and editors was “a step in the right direction,” according to Amita Sehgal, PhD. To improve peer review, some suggested a blinded process to help eliminate potential bias toward well-known labs or against lesser-known ones.

The digital revolution is changing the culture of academic publishing, albeit slowly. In 2009, the NIH updated its Public Access Policy to require that any published research conducted with NIH grant support be made available on PubMed Central within 12 months of publication.11 Just last month, the publisher Macmillan announced that all research papers in Nature and its sister journals will be made free to access online in a read-only format that can be annotated but not copied, printed, or downloaded. However, only journal subscribers and some media outlets will be able to share links to the free full-text, read-only versions.12 Critics such as Michael Eisen13 and John Wilbanks14 have labeled this change a public relations ploy that appeals to demands for access without actually increasing it. It will be interesting to see whether other publishers follow this trend.

Scientific communication has yet to reap the full efficiency benefits made possible by the Internet, and the current system still wastes resources in advancing ideas and research. But this generation of young researchers is optimistic and may yet revolutionize scientific publishing as we know it. “I think [open access is] the future for all scientific publications,” says Bo Li, a postdoc at Penn. “I hope all research articles will be freely accessible to everyone in the world.”

A companion opinion article by Penn PhD student Brian S. Cole can be found here.

This article appeared in the Penn Science Policy January 2015 newsletter



Annie Chen
Michael Allegrezza

1Ware M, Mabe M. (2012) The stm report. http://www.stm-assoc.org/2012_12_11_STM_Report_2012.pdf
2Laakso M, Welling P, Bukvova H, et al. (2011) The development of open access journal publishing from 1993 to 2009. PLoS ONE.
3Hannay, T. (2014) Stop the deluge of scientific research. The Guardian: Higher Education Network Blog. http://www.theguardian.com/higher-education-network/blog/2014/aug/05/why-we-should-publish-less-scientific-research.
4Wakefield AJ, Murch SH, Anthony A, Linnell, et al. (1998) Ileal lymphoid nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children [retracted]. Lancet. 351:637-41.
5Bohannon, J. (2013) Who’s afraid of peer review? Science. 342(6154): 60-65. DOI: 10.1126/science.342.6154.60
6Wolfe-Simon F, Switzer Blum J, Kulp TR, et al. (2011) A bacterium that can grow by using arsenic instead of phosphorus. Science. 332(6034) 1163-6. doi: 10.1126/science.1197258.
7Eisen M. (2013) I confess, I wrote the Arsenic DNA paper to expose flaws in peer-review at subscription based journals. it is NOT junk. http://www.michaeleisen.org/blog/?p=1439.
8(2014) Episode 12. The eLIFE podcast. http://elifesciences.org/podcast/episode12
9ResearchGate. http://www.researchgate.net/
10PubMed Commons. http://www.ncbi.nlm.nih.gov/pubmedcommons/
11NIH Public Access Policy Details. http://publicaccess.nih.gov/policy.htm
12Baynes G, Hulme L, MacDonald S. (2014) Articles on nature.com to be made widely available to read and share to support collaborative research. Nature. http://www.nature.com/press_releases/share-nature-content.html
13Eisen M. (2014) Is Nature’s “free to view” a magnanimous gesture or a cynical ploy?. it is NOT junk. http://www.michaeleisen.org/blog/?p=1668
14Van Noorden R. (2014) Nature promotes read-only sharing by subscribers. Nature. http://www.nature.com/news/nature-promotes-read-only-sharing-by-subscribers-1.16460

The Richest Return of Wisdom

 By Brian S. Cole

The real lesson I’ve gleaned from my time in pursuit of a PhD in biomedical research hasn’t been the research itself; indeed, many of my colleagues and I came into the program already equipped with extensive bench experience. The real eye-opener has been how science is communicated. When I was an undergraduate, assiduously repeating PCR after PCR that quietly and dutifully failed to put bands on a gel, I just assumed that experiments always worked in the well-funded, well-respected, well-published labs that wrote the papers we read in school. As an undergraduate, I had implicit trust in scientific publications; at the end of the PhD, I have implicit skepticism. It turns out I’m not alone.

The open access movement has taken a new tone in the past year: growing recognition of the irreproducibility1 and alarming prevalence of scientific misconduct2 in highly cited journals has led to questioning of the closed review process. Such a process denies the public access to reviewers’ comments on the work, as well as to the editorial correspondence and decision process. The reality of the publication industry is selling ads and subscriptions, and it is likely that editors often override scientific input from peer reviewers that calls a sexy new manuscript into question. The problem is that the public gets no access to the review process, and as far as accountability is concerned, closed peer review is tantamount to no peer review at all.

For these reasons, our current scientific publication platform has two large-scale negative consequences: the first economic, the second epistemic. First, intellectual property rights for publicly funded research are routinely transferred to private entities that then use those rights for profit. Second, the silo effect of proprietary journals leaves insufficient interactivity within the scientific community and with the public. The open access revolution is gaining momentum on the gravity of these issues, but to date open access journals and publishers have largely conformed to the existing model of journals and isolated manuscripts. While open access journals have enabled public access to scientific publications, they fail to provide the sorely needed interactivity that the Internet enables.

In the background of the open access revolution in science, a 70-year-old idea3 about a new system for disseminating scientific publications was realized two decades ago on a publicly licensed code stack4 that allows not just open review, but distributed and continuous open review with real-time version control and hypertext interlinking: not just citations, but links to the actual source. Imagine being able to publish a paper that anybody can review, suggest edits to, add links to, and discuss publicly, with every step of that ongoing process versioned and stored. If another researcher repeats your experiment, they can contribute their data. If you extend or strengthen the message of your paper with a future experiment, that too can be appended. Such a platform would transform scientific publication from a series of soliloquies into an evolving cloud of interlinked ideas. We’ve had that technology for an alarmingly long time, given its lack of adoption by researchers who continue to grant highly cited journals ownership over work the public has already paid for.
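As a toy illustration only (not the API of MediaWiki or any real platform; every name here is hypothetical), the core data structure such a system implies is an append-only, versioned record with hyperlinks out to its sources:

    # Minimal sketch of a versioned, interlinked publication record
    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import List

    @dataclass(frozen=True)
    class Revision:
        author: str
        summary: str      # e.g., "open review comment", "replication data added"
        body: str         # full snapshot of the article at this revision
        timestamp: datetime

    @dataclass
    class WikiArticle:
        title: str
        links: List[str] = field(default_factory=list)      # links to actual sources
        history: List[Revision] = field(default_factory=list)  # nothing is deleted

        def contribute(self, author: str, body: str, summary: str) -> None:
            """Append-only update: every prior version remains in the history."""
            self.history.append(
                Revision(author, summary, body, datetime.now(timezone.utc)))

        def current(self) -> str:
            return self.history[-1].body if self.history else ""

    paper = WikiArticle("Gene X regulates pathway Y")
    paper.contribute("original_lab", "Initial manuscript...", "first submission")
    paper.contribute("reviewer_1", "Initial manuscript... (methods clarified)",
                     "open review: clarified methods")
    paper.contribute("replicating_lab",
                     paper.current() + "\nIndependent replication data...",
                     "appended replication data")

The point of the sketch is that review, replication, and extension all become revisions of one public record rather than correspondence hidden behind an editor's desk.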

I’ve kicked around the idea of a WikiScience5 publication system for a long time with a lot of scientists, and the concerns that came up were cogent and constructive. Wikipedia, one of the greatest gifts to humankind ever to grace the worldwide web, stands as testament to the tractability of a wiki replacement for our system of scientific publication. The distributed review and discussion system that makes Wikipedia evolve does work, and most of us are old enough to remember a time when nobody thought it would. But how can we assess impact and retain attribution in a distributed publication and review system such as a wiki? Metrics such as journal impact factor and article-level metrics would not directly apply to a community-edited, community-reviewed scientific resource. Attribution and impact assessment are important challenges for any system that aims to replace our journal-and-manuscript method of disseminating scientific information. While a distributed scientific information system would not fit easily into the current metrics of publication impact that are an intimate part of funding, hiring, and promotion in academia, considering such a system presents an opportunity to explore innovative analyses of the relevance and impact of scientific research. Indeed, rethinking the evaluation of scientists and their work6 is a pressing need even within the current publication system.

    We should be thinking about the benefit of the networked consciousness of online collectivism, not the startling failures of our current publication system to put scientific communication into the hands of the public that enabled it, or even the challenges in preserving integrity and attribution in a commons-based peer production system.7  We are the generation that grew up with Napster and 4chan, the information generation, the click-on-it-and-it’s-mine generation, born into a world of unimaginable technological wealth.  Surely we can do better than paywalls, closed peer review, and for-profit publishers.  We owe it to everybody: as Emerson put it, “He who has put forth his total strength in fit actions, has the richest return of wisdom.” 8


This article accompanies a feature piece about scientific publishing in the digital era and also appeared in the Penn Science Policy Group January 2015 newsletter

Brian S. Cole

1Ioannidis, John P. A. "Why Most Published Research Findings Are False."PLoS Medicine 2.8 (2005): E124.
2Stern, Andrew M., Arturo Casadevall, R. Grant Steen, and Ferric C. Fang. "Research: Financial Costs and Personal Consequences of Research Misconduct Resulting in Retracted Publications." ELife 3 (2014)
3Bush, Vannevar. "As We May Think." The Atlantic. Atlantic Media Company, 01 July 1945.
4"MediaWiki 1.24." - MediaWiki. <http://www.mediawiki.org/wiki/MediaWiki_1.24>.
5"WikiScience." - Meta. <http://meta.wikimedia.org/wiki/WikiScience>.
6"San Francisco Declaration on Research Assessment." American Society for Cell Biology. <http://www.ascb.org/dora/>.
7"3. Peer Production and Sharing." <http://cyber.law.harvard.edu/wealth_of_networks/3._Peer_Production_and_Sharing>.
8Emerson, Ralph W. "The American Scholar." The American Scholar. Web. <http://www.emersoncentral.com/amscholar.htm>.

Penn researchers interview HIV-positive adolescents in Botswana to better understand the factors affecting adherence to antiretroviral treatments

Of the more than three million children infected with HIV, 90% live in Africa. As HIV-positive children become adolescents, it is important that antiretroviral treatments are maintained to protect their own health, as well as to safeguard the adolescents from developing resistant strains of HIV and to prevent infection of other individuals.

HIV-positive adolescents’ adherence to these treatments has been identified as a public health challenge for Botswana. However, the assessment tools for the psychosocial factors likely associated with poor adherence were developed in Western countries, and their constructs may not be relevant in African contexts. A new study published in PLOS ONE by Penn researchers Elizabeth Lowenthal and Karen Glanz describes the cultural adaptation of these assessment tools for Botswana.

The psychosocial assessments investigate factors that may affect adolescents’ adherence to antiretroviral treatments. As Lowenthal summarized, “one of the key reasons why adolescents with HIV have higher rates of death compared with people with HIV in other age groups is that they have trouble taking their medications regularly.”

The researchers examined the following factors by testing seven separate assessment scales, developed with Western cohorts, for their applicability to Botswanan adolescents.
  • Psychological reactance – an aversion to regulations that impinge on freedom and autonomy
  • Perceived stigma
  • Outcome expectancy – whether treatments were expected to improve health
  • Consideration of future consequences – the extent to which adolescents plan for their futures rather than focusing on immediate gains
  • Socio-emotional support – whether adolescents receive the social and emotional support they need

The researchers conducted in-depth interviews with 34 HIV-positive Botswanan adolescents, subgrouped by age so that the factors could be discussed in ways participants could understand.

The study confirmed the construct validity of some assessment tools but highlighted several areas in which the tools did not resonate with participants:
  • Socio-emotional support for the adolescents mostly came from parents rather than peers.
  • Denial of being HIV infected was more common than expected.
  • Participants were surprisingly ambivalent about taking their medicine.

Some of the tools (psychological reactance, consideration of future consequences) required major modifications to achieve construct validity for adolescents with HIV in Botswana. The assessment tools were modified over the course of the study based on participant feedback. Future research will test the association between these modified assessment tools and HIV treatment outcomes, providing insight into how best to support HIV-infected adolescents.

First author Lowenthal suggested that the study could inform studies of adolescent adherence to other treatments as well, stating that “questions that we are able to answer in our large cohort of HIV-positive adolescents will likely be generalizable to other groups of adolescents with chronic diseases.”

-Barbara McNutt