Penn Science Spotlight: Learning how T cells manage the custom RNA business

Chris Yarosh

This Science Spotlight focuses on the research I do here at Penn, the results of which are now in press at Nucleic Acids Research [1]. You can read the actual manuscript right now, if you would like, because NAR is “open access,” meaning all articles published there are available to anyone for free. We’ve talked about open access on this blog before, if you’re curious about how that works.

First, a note about this type of science. The experiments done for this paper fall into the category of “basic research,” which means they were not designed to achieve an immediate practical end. Work designed with such an end in mind is known as “applied” research. Basic research, on the other hand, is curiosity-driven science that aims to increase our understanding of something. That something could be cells, supernovas, factors influencing subjective well-being in adolescence, or anything else, really. This isn’t to say that basic research doesn’t lead to advances that impact people’s lives; quite the opposite is true. In fact, no applied work is possible without foundational basic work being done first. Rather, the real difference between the two categories is timeline and focus: applied research looks to achieve a defined practical goal (such as creating a new Ebola vaccine) as soon as possible, while basic research seeks to add to human knowledge over time. If you’re an American, your tax dollars support basic research (thanks!), often through grants from the National Institutes of Health (NIH) or the National Science Foundation (NSF). This work, for example, was funded in part by two grants from the NIH: one to my PhD mentor, Dr. Kristen Lynch (R01 GM067719), and the second to me (F31 AG047022). More info on science funding can be found here.

Now that you've gotten your basic research primer, let's talk science. This paper is primarily focused on how T cells (immune system cells) control a process called alternative splicing to make custom-ordered proteins. While most people have heard of DNA, the molecule that contains your genes, not everyone is as familiar with RNA or proteins. I like to think of it this way: DNA is similar to the master blueprint for a building, specifying all of the necessary components needed for construction. This blueprint ultimately codes for proteins, the molecules in a cell that actually perform life’s work. RNA, which is “transcribed” from DNA and “translated” into protein, is a version of the master blueprint that can be edited as needed for different situations. Certain parts of RNA can be mixed and matched to generate custom orders of the same protein, just as you might change a building’s design based on location, regulations, etc. This mixing and matching process is called alternative splicing (AS), and though it sounds somewhat science-fictiony, AS naturally occurs across the range of human cell types.



While we know AS happens, scientists haven’t yet unraveled the different strategies cells use to control it. Part of the reason for this is the sheer number of proteins involved in AS (hundreds), and part of it is a lack of understanding of the nuts and bolts of the proteins that do the managing. This paper focuses on the nuts and bolts stuff. Previous work [2] done in our lab has shown that a protein known as PSF manipulates AS to produce an alternate version of a different protein, CD45, which is critical for T cell response to antigens (bits of bacteria or viruses). PSF doesn’t do this, however, when a third protein, TRAP150, binds it, although we previously didn’t know why. This prompted us to ask two major questions: How do PSF and TRAP150 link up with one another, and how does TRAP150 change PSF’s function?

My research, as detailed in this NAR paper, answers these questions using the tools of biochemistry and molecular biology. In short, we found that TRAP150 actually prevents PSF from doing its job by binding in the same place RNA does. This makes intuitive sense: PSF can’t influence splicing of targets it can’t actually make contact with, and it can't contact them if TRAP150 is gumming up the works. To reach this conclusion, we diced PSF and TRAP150 up into smaller pieces to see which parts fit together, and we also looked for which part of PSF binds RNA. These experiments helped us pinpoint all of the action in one region of PSF known as the RNA recognition motifs (RRMs), specifically RRM2. Finally, we wanted to know if PSF and TRAP150 regulate other RNA molecules in T cells, so we did a screen (the specific technique is called “RASL-Seq,” but that’s not critical to understanding the outcome) and found almost 40 other RNA molecules that appear to be controlled by this duo. In summary, we now know how TRAP150 acts to change PSF’s activity, and we have shown this interaction to be critical for regulating a bunch of RNAs in T cells.

So what are the implications of this research? For one, we now know that PSF and TRAP150 regulate the splicing of a range of RNAs in T cells, something noteworthy for researchers interested in AS or how T cells work. Second, we describe a mechanism for regulating proteins that might be applicable to some of those other hundreds of proteins responsible for regulating AS, too. Finally, PSF does a lot more than just manage AS in the cell. It actually seems to have a role in almost every step of the DNA-RNA-protein pathway. By isolating the part of PSF targeted by TRAP150, we can hypothesize about what PSF might do when TRAP150 binds it based on what other sections of the protein remain “uncovered.” It will take more experiments to figure it all out, but our data provide good clues for researchers who want to know more about all the things PSF does.

A map of the PSF protein. Figure adapted from Yarosh et al. WIREs RNA 2015, 6:351-367. doi: 10.1002/wrna.1280
Papers cited:
1.) Yarosh CA, Tapescu I, Thompson MG, Qiu J, Mallory MJ, Fu XD, Lynch KW. TRAP150 interacts with the RNA-binding domain of PSF and antagonizes splicing of numerous PSF-target genes in T cells. Nucleic Acids Res 2015; doi: 10.1093/nar/gkv816.

2.) Heyd F, Lynch KW. Phosphorylation-dependent regulation of PSF by GSK3 controls CD45 alternative splicing. Mol Cell 2010;40:126–137.

Training the biomedical workforce - a discussion of postdoc inflation


By Ian McLaughlin


Earlier this month, postdocs and graduate students from several fields met to candidly discuss the challenges postdocs encounter while pursuing careers in academic research.  The meeting began by enumerating these challenges and the factors behind them: the scarcity of faculty positions and the ballooning number of rising postdocs, funding mechanisms and cuts, the sub-optimal relationship between publications and the quality of science, and an inaccurate conception of what exactly a postdoctoral position should entail.


From [15]

At a fundamental level, there’s a surplus of rising doctoral students whose progression outpaces the availability of faculty positions at institutions capable of hosting the research they intend to perform [10,15].  While 65% of PhDs attain postdocs, only 15-20% of postdocs attain tenure-track faculty positions [1].  As a result, postdocs often extend their positions significantly, hoping to bolster their credentials and generate more publications that will increase their appeal to hiring institutions.  Despite this added time, postdocs often do not benefit from continued teaching experience, and are also unable to attend classes to cultivate professional development.
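Chaining those two published rates gives a rough sense of the overall pipeline. This is my own back-of-envelope arithmetic based on the figures cited above, not a number reported at the meeting:

```python
# Back-of-envelope estimate (illustrative only): combine the two cited rates
# to approximate the fraction of PhDs who ultimately reach the tenure track.
phd_to_postdoc = 0.65       # share of PhDs who take a postdoc [1]
postdoc_to_tt_low = 0.15    # low estimate of postdocs reaching tenure track [1]
postdoc_to_tt_high = 0.20   # high estimate [1]

low = phd_to_postdoc * postdoc_to_tt_low    # ~0.10
high = phd_to_postdoc * postdoc_to_tt_high  # ~0.13

print(f"Roughly {low:.0%}-{high:.0%} of PhDs reach a tenure-track faculty position")
```

In other words, only about one in eight to one in ten PhDs can expect a tenure-track outcome under these rates, which underlines why postdoc terms keep stretching.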


From [10]
Additionally, there may never be an adequate position available. Instead of providing the training and mentorship necessary to generate exceptional scientists, postdoctoral positions have become “holding tanks” for many PhD holders unable to transition into permanent positions [5,11], resulting in considerably lower compensation relative to alternative careers 5 years after attaining a PhD.

From [13]

Perhaps this wouldn’t be quite so problematic if the compensation of the primary workhorse of basic biomedical research in the US were better.  In 2014, the US National Academies called for an increase in the starting postdoc salary from $42,840 to $50,000, as well as a 5-year limit on the length of postdocs [1].  While a salary increase would certainly help, institutions like NYU, the University of California system, and UNC Chapel Hill have already explored term limits.  Unfortunately, a frequent outcome of term limits was the promotion of postdocs to superficial positions that simply confer a new title but are effectively extended postdocs.

Given the time commitment required to attain a PhD, and the expanding durations of postdocs, several of the meeting’s attendees identified a particularly painful interference with their ability to start a family.  Despite excelling in challenging academic fields at top institutions, and dedicating professionally productive years to their work, several postdocs stated that they don’t foresee the financial capacity to start a family before fertility challenges render the effort prohibitively difficult.

However, administrators of the NIH have suggested this apparent disparity between the number of rising postdocs and available positions is not a significant problem, despite having no apparent data to back up their position. As Polka et al. wrote earlier this year, NIH administrators don’t have data quantifying the total numbers of postdocs in the country at their disposal – calling into question whether they are prepared to address this fundamental problem [5].

A possible approach to mitigate this lack of opportunity would be to create permanent “superdoc” positions for talented postdocs who don’t have ambitions to start their own labs but have the technical skills needed to advance basic research.  The National Cancer Institute (NCI) has proposed a grant program to cover salaries of $75,000-$100,000 for 50-60 such positions [1,2], which might be expanded to cover the salaries of more scientists.  Additionally, a majority of the postdocs attending the meeting voiced their desire for more comprehensive career guidance.  In particular, while they are aware that PhD holders are viable candidates for jobs outside of academia, the career trajectory out of academia remains opaque to them.

This situation stands in stark contrast to the misconception that the US suffers from a shortage of STEM graduates.  While the careers of postdocs stall due to a scarcity of faculty positions, the President’s Council of Advisors on Science and Technology announced a goal of one million STEM trainees in 2012 [3], despite the fact that only 11% of students graduating with bachelor’s degrees in science end up in fields related to science [4], due in part, perhaps, to an inflated sense of job security.  While the numbers of grad students and postdocs have nearly doubled, the number of permanent research positions hasn’t grown commensurately [5]. So, while making science a priority is certainly prudent, the point of tension is not necessarily a shortage of students entering these fields, but rather a paucity of research positions available to them once they’ve attained graduate degrees.

Suggested Solutions

Ultimately, if the career prospects for academic researchers in the US don't change, increasing numbers of PhD students will leave basic science research in favor of alternatives that offer better compensation and career trajectories, or leave the country for international opportunities.  At the heart of the problem are a fundamental imbalance between the funding available for basic academic research and the growing community of scientists in the U.S. [9,14], and a dysfunctional career pipeline in biomedical research [9].  Attendees suggested several strategies to confront this problem.

Federal grant-awarding agencies need to collect accurate data on the yearly numbers of postdoctoral positions available.  This way, career counselors, potential students, rising PhD students, and the institutions themselves will have a better grasp of the apparent scarcity of academic research opportunities.

As the US National Academies have suggested, the postdoc salary ought to be increased.  One possible strategy would be to increase the prevalence of “superdoc”-type positions, creating a viable career path for talented researchers who wish to support a family but do not wish to secure the funding needed to open their own labs.  Additionally, if the institutions at which postdocs receive federal funding were to consider them employees with all associated benefits, rather than trainees, rising scientists might better avoid career stagnation and an inability to support their families [11].

As the number of rising PhDs currently outpaces the availability of permanent faculty positions, one strategy may be to limit the number of PhD positions available at each institution, preventing a continued escalation in the number of postdocs without viable faculty positions to apply for.  One attendee noted that this could immediately halt the growth in the number of PhDs with bleak career prospects.

Several attendees brought up the problems many postdocs encounter in particularly large labs, which tend to receive disproportionately high grant funding.  Postdocs in such labs feel pressure to generate useful data to ensure they can compete with their peers, while neglecting other elements of their professional development and personal life. As well, the current system funnels funding to labs that can guarantee positive results, favoring conservative rather than potentially paradigm-shifting proposals, which translates to reduced funding for new investigators [9]. Grant-awarding agencies’ evaluations of proposals might take lab size into consideration, with the goal of fostering progress in smaller labs. Additionally, efforts like Cold Spring Harbor Laboratory’s bioRxiv might be more widely used to pre-register research projects so that postdocs are aware of the efforts of their peers, enabling them to focus on innovation when appropriate.

While increased funding for basic science research would help to avoid the loss of talented scientists, and private sources may help to compensate for fickle federal funds [6], some attendees of the meeting suggested that the current mechanisms by which facilities and administrative costs are funded might be restructured. These costs, also called “indirect costs” - which cover expenditures associated with running research facilities, rather than specific projects - currently claim over 50 cents of every federally allocated dollar for the institution itself, rather than for the researchers of the projects that grants fund [7,8].  This dynamic has been suggested to foster the growth of institutions rather than investment in researchers, and optimizing this component of research funding might reveal opportunities to better support the careers of rising scientists [9,12].

Additionally, if the state of federal funding were more predictable, dramatic fluctuations in the numbers of faculty positions and rising scientists might not produce such disparities [9].  For example, if appropriations legislation consistently adhered to 5-year funding plans, biomedical research might avoid unexpected deficits of opportunity.


From [5]

Career counselors ought to provide accurate descriptions of how competitive a search for permanent faculty positions can be to their students, so they don’t enter a field with a misconceived sense of security.  Quotes from a survey conducted by Polka et al. reveal a substantial disparity between expectations and outcomes in academic careers, and adequate guidance might help avoid such circumstances.

As shown in the NSF’s Indicators report from 2014, the most rapidly growing reason postdocs cite for beginning their positions is “other employment not available,” suggesting that a PhD in fields associated with the biomedical sciences currently translates to limited opportunities. Even successful scientists and talented postdocs have become progressively more pessimistic about their career prospects.  Accordingly, while there are several possible solutions to this problem, if some remedial action isn’t taken, biomedical research in the U.S. may stagnate and suffer in coming years.

Citations
 1.    Alberts B, Kirschner MW, Tilghman S, Varmus H. Rescuing US biomedical research from its systemic flaws. Proc Natl Acad Sci U S A. 2014 Apr 22;111(16):5773-7. doi: 10.1073/pnas.1404402111. Epub 2014 Apr 14.
 2.    http://news.sciencemag.org/biology/2015/03/cancer-institute-plans-new-award-staff-scientists
 3.    https://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-engage-to-excel-final_2-25-12.pdf
 4.    http://www.nationalreview.com/article/378334/what-stem-shortage-steven-camarota
 5.    Polka JK, Krukenberg KA, McDowell GS. A call for transparency in tracking student and postdoc career outcomes. Mol Biol Cell. 2015 Apr 15;26(8):1413-5. doi: 10.1091/mbc.E14-10-1432.
 6.    http://sciencephilanthropyalliance.org/about.html
 7.    http://datahound.scientopia.org/2014/05/10/indirect-cost-rate-survey/
 8.    Ledford H. Indirect costs: keeping the lights on. Nature. 2014 Nov 20;515(7527):326-9. doi: 10.1038/515326a. Erratum in: Nature. 2015 Jan 8;517(7533):131
 9.    Alberts B, Kirschner MW, Tilghman S, Varmus H. Rescuing US biomedical research from its systemic flaws. Proc Natl Acad Sci U S A. 2014 Apr 22;111(16):5773-7. doi: 10.1073/pnas.1404402111. Epub 2014 Apr 14.
 10.    National Science Foundation (2014) National Science and Engineering Indicators (National Science Foundation, Washington, DC).
 11.    Bourne HR. A fair deal for PhD students and postdocs. Elife. 2013 Oct 1;2:e01139. doi: 10.7554/eLife.01139.
 12.    Bourne HR. The writing on the wall. Elife. 2013 Mar 26;2:e00642. doi: 10.7554/eLife.00642.
 13.    Powell K. The future of the postdoc. Nature. 2015 Apr 9;520(7546):144-7. doi: 10.1038/520144a.
 14.    Fix the PhD. Nature. 2011 Apr 21;472(7343):259-60. doi: 10.1038/472259b.
 15.   Schillebeeckx M, Maricque B, Lewis C. The missing piece to changing the university culture. Nat Biotechnol. 2013 Oct;31(10):938-41. doi: 10.1038/nbt.2706.

AAAS Forum Take #2

Another point of view of the AAAS Forum by Matthew Facciani:

I have provided scientific testimony and met with some of my local legislators, but I’ve never had any formal exposure to science policy. I was really excited to attend the AAAS Science & Technology Policy Forum to learn more about how scientists can impact policy. The information I absorbed at the conference was overwhelming but incredibly stimulating. Some of the lectures discussed budget cuts and the discouraging barriers to influencing science policy. Even so, there was a definite atmosphere of optimism at the conference, focused on how we can create positive change.

One of my favorite aspects of the conference was the discussion of how to effectively communicate science to non-scientists. Before we can even have discussions about funding, the general public needs to understand how science works and why basic science is so important. For example, science never proves anything with 100% certainty, but it may sound weak if politicians only say that science “suggests” instead of “proves.” One creative way to circumvent this problem is to use comparisons. Instead of saying “science suggests GMOs are safe,” we could say “scientists are as sure that GMOs are safe as they are sure that smoking is bad for your health.” The conference was rife with these kinds of effective tactics, and I left with a sense of confidence that we can collectively make a difference in science policy.


Matthew Facciani is a sociology PhD student at The University of South Carolina. He is also a gender equality activist and science communicator. Learn more at www.matthewfacciani.com, and follow him at @MatthewFacciani.

2015 AAAS Science and Technology Policy Forum Summary


I recently had the opportunity to attend the 2015 AAAS Science and Technology Policy Forum in Washington, D.C. This annual meeting brings together a range of academics and professionals to discuss the broad S&T policy landscape. Below are some of my takeaways from the meeting. I hope to have additional comments from other National Science Policy Group members up soon.

By Chris Yarosh

The talks and panels at the Forum encompassed a huge range of topics, from the federal budget and the appropriations outlook to manufacturing policy and, of course, shrimp treadmills. My opinion of the uniting themes tying this gamut together is just that—my opinion—and should only be taken as such. That being said, the threads I picked up on in many of the talks can be summarized by three C’s: cooperation, communication, and citizenship.

First up, cooperation. Although sequestration’s most jarring impacts have faded, AAAS’s budget guru Matthew Hourihan warns that fiscal year 2016 could see a return of…let’s call it enhanced frugality. These cuts will fall disproportionately on social science, clean energy, and geoscience programs. With the possibility of more cuts to come, many speakers suggested that increased cooperation between entities could maximize value. This means increased partnership between science agencies and private organizations, as mentioned by White House Office of Science and Technology Policy Director John Holdren, and between federal agencies and state and local governments, as highlighted by NSF Director France Córdova. Cooperation across directorates and agencies will also be a major focus of big interdisciplinary science and efforts to improve STEM education. Whatever the form, the name of the game will be recognizing fiscal limitations and fostering cooperation to make the most of what is available.

The next “C” is communication. Dr. Córdova made a point of listing communication among the top challenges facing the NSF, and talks given by Drs. Patricia Brennan (of duck penis fame) and David Scholnick (the aforementioned shrimp) reinforced the scale of this challenge. As these two researchers reminded us so clearly, information on the Web and in the media can easily be misconstrued for political or other purposes in the absence of the correct scientific context. To combat this, many speakers made it clear that basic science researchers must engage a wider audience, including elected officials, or risk our research being misconstrued, distorted, or deemed unnecessary. As Dr. Brennan said, it is important to remind the public that while not every basic research project develops into something applied, “every application derives from basic science.”

The last “C” is citizenship. Several of the speakers discussed the culture of science and the interconnections between scientists and non-scientists. I think that these presentations collectively described what I’ll call good science citizenship.  For one, good science citizenship means that scientists will increasingly need to recognize our role in the wider innovation ecosystem if major new programs are ever going to move forward. For example, a panel on new initiatives in biomedical research focused on 21st Century Cures and President Obama’s Precision Medicine Initiative. Both of these proposals are going to be massive undertakings; the former will involve the NIH and FDA collaborating to speed the development and introduction of new drugs to the market, while the latter is going to require buy-in from a spectrum of stakeholders including funders, patient groups, bioethicists, and civil liberty organizations. Scientists are critical to these endeavors, obviously, but we will need to work seamlessly across disciplines and with other stakeholders to ensure the data collected from these programs are interpreted and applied responsibly.

Good science citizenship will also require critical evaluation of the scientific enterprise and the separation of the scientific process from scientific values, a duality discussed during the William D. Carey lecture given by Dr. William Press. This means that scientists must actively protect the integrity of the research enterprise by supporting all branches of science, including the social sciences (a topic highlighted throughout the event), and by rigorously weeding out misconduct and fraud. Scientists must also do a better job of making our rationalist approach work with different value systems, recognizing that people will need to come together to address major challenges like climate change.  Part of this will be better communication to the public, but part of it will also be learning how different value systems influence judgment of complicated scientific issues (a subject of another great panel about Public Opinion and Policy Making). Good science citizenship, cultivated through professionalism and respectful engagement of non-scientists, will ultimately be critical to maintaining broad support for science in the U.S.

Asking for a Small Piece of the Nation’s Pie

By Rosalind Mott, PhD


This article was originally published in the Penn Biomed Postdoctoral Council Newsletter (Spring 2015).

Historically, the NIH has received straightforward bipartisan support; in particular, the doubling of the NIH budget from FY98-03 led to a rapid growth in university-based research. Unfortunately, ever since 2003, inflation has been slowly eating away at the doubling effort (Figure 1), and there seems little hope for recovery other than the brief restoration in 2009 by the American Recovery and Reinvestment Act (ARRA). Making matters worse, Congress now has an abysmal record of moving policy through as partisan fighting dominates the Hill.

Fig 1: The slow erosion of the NIH budget over the past decade
(figure adapted from: http://fas.org/sgp/crs/misc/R43341.pdf)
Currently, support directed to the NIH is a mere 0.79% of federal discretionary spending. The bulk of this funding goes directly to extramural research, providing salaries for over 300,000 scientists across 2,500 universities.  As the majority of biomedical researchers rely on government funding, it behooves these unique constituents to rally for sustainable support from Congress. Along with other scientists across the country who are becoming more politically involved, the Penn Science Policy Group arranged a Congressional Visit Day (CVD) in which a small group of postdoctoral researchers and graduate students visited Capitol Hill on March 18th to remind the House and Senate that scientific research is a cornerstone of the US economy and to alert them to the erosion’s impact on young researchers.

Led by postdocs Shaun O’Brien and Caleph Wilson, the group partnered with the National Science Policy Group (NSPG), a coalition of young scientists across the nation, to make over 60 visits to Congressional staff. NSPG leaders from other parts of the country, Alison Leaf (UCSF) and Sam Brinton (Third Way, Wash. DC), arranged a productive experience in which newcomers to the Hill trained for their meetings.  The Science Coalition (TSC) provided advice on how to effectively communicate with politicians: keep the message clear and simple, provide them with evidence of how science positively impacts society and the economy, and tell personal stories of how budget cuts are affecting your research. TSC also pointed out the undeniable fact that face-to-face meetings with Congress are the most effective way to communicate our needs as scientists. With President Obama’s FY16 budget request announced in February, the House and Senate were in the midst of appropriations season, so there was no better time to remind them of just how important the funding mechanism is.

Meetings with the offices of Pennsylvania senators Pat Toomey and Bob Casey and representatives Glenn Thompson and Chaka Fattah were key goals, but the visits extended to the states where the young scientists were born and raised – everywhere from Delaware to California. Each meeting was fifteen to twenty minutes of rapid discussion of the importance of federally funded basic research. At the end of the day, bipartisan support for the NIH was found to exist at the government’s core, but the hotly debated question of how to fund the system has stalled its growth.
Shaun O’Brien recaps a disappointing experience making a basic request of Senator Toomey’s office. Sen. Toomey has slowly shifted his stance to be more supportive of the NIH, so meeting with his office was an important step in reaching Republicans:

We mentioned the "Dear Colleague" letter by Sen. Bob Casey (D-PA) and Sen. Richard Burr (R-NC) that is asking budget appropriators to "give strong financial support for the NIH in the FY2016 budget". Sen. Toomey didn't sign onto it last year, especially as that letter asked for an increase in NIH funding to $31-32 billion and would have violated the sequester caps, which Sen. Toomey paints as a necessary evil to keep Washington spending in check. I asked the staffer for his thoughts on this year's letter, especially as it has no specific dollar figure and Sen. Toomey has stated his support for basic science research. The staffer said he would pass the letter along to Sen. Toomey.

Unfortunately, three weeks later, Sen. Toomey missed an opportunity to show his "newfound" support for science research as he declined to sign a letter that essentially supports the mission of the NIH.  I plan to call his office and see if I can get an explanation for why he failed to support this letter, especially as I thought it wouldn't have any political liability for him to sign.

Working with Congressman Chaka Fattah balanced the disappointment from Toomey with a spark of optimism. Rep. Fattah, a strong science supporter and member of the House Appropriations Committee, encourages scientists to use Twitter (tweet @chakafattah) to keep him posted on recent success stories and breakthroughs; these bits of information are useful tools in arguing the importance of basic research to other politicians.

Keeping those lines of communication strong is the most valuable role that we can play away from the lab.  Walking through the Russell Senate Office Building, a glimpse of John McCain waiting for the elevator made the day surreal, removed from the normalcy of another day at the bench. The reality, though, is that our future as productive scientists depends gravely on public opinion and, in turn, government support. The simple act of outreach to the public and politicians is a common duty for all scientists, whether through trips to the Hill or simple dinner conversations with our non-scientist friends.


Participants represented either their professional society and/or the National Science Policy Group, independent from their university affiliations. Support for the training and experience was provided by both the American Academy of Arts & Sciences (Cambridge, MA) and the American Association for the Advancement of Science (AAAS of Washington, DC).

Dr. Sarah Cavanaugh discusses biomedical research in her talk, "Homo sapiens: the ideal animal model"

Biology and preclinical medicine rely heavily upon research in animal models such as rodents, dogs, and chimps. But how translatable are the findings from these animal models to humans? And what alternative systems are being developed to provide more applicable results while reducing the number of research animals?
Image courtesy of PCRM


Last Thursday, PSPG invited Dr. Sarah Cavanaugh from the Physicians Committee for Responsible Medicine to discuss these issues. In her talk, entitled “Homo sapiens: the ideal animal model,” she emphasized that we are not particularly good at translating results from animal models to human patients. Data from the FDA indicate that 90% of drugs that perform well in animal studies fail when tested in clinical trials.  It may seem obvious, but it is important to point out that the biology of mice is not identical to human biology. Scientific publications have demonstrated important dissimilarities in the pathology of inflammation, diabetes, cancer, Alzheimer’s, and heart disease.

All scientists understand that model systems have limitations, yet they have played an integral role in shaping our understanding of biology. But is it possible to avoid using experimental models entirely and just study human biology?

The ethics of studying biology in people differ from those of studying biology in animals. The “do no harm” code of medical ethics dictates that we cannot perform experiments with no conceivable benefit to the patient, so unnecessarily invasive procedures cannot be undertaken just to obtain data. This limitation restricts how much information we can obtain about human biology compared to animal biology. Nevertheless, medical researchers do uncover important findings from human populations. Dr. Cavanaugh pointed out that studies of risk factors (both genetic and environmental) and biomarkers are important for understanding diseases, and that non-invasive brain imaging has increased our understanding of neurodegenerative diseases like Alzheimer’s.

Yet these are all correlative measures: they show that factor X is associated with a higher risk of a certain disease. To develop effective therapies, however, we need to understand cause-and-effect relationships, in other words, the mechanism. To uncover mechanisms, researchers need to perturb the system and measure physiological changes or observe how a disease progresses. Performing these studies in humans is often difficult, unethical, or outright impossible. For that reason, researchers turn to model systems, where experimental variables can be properly controlled to reveal biological mechanisms. We have learned a great deal about biology from animal models, but moving forward, can we develop models that better reflect human biology and pathology?

Using human post-mortem samples and stem cell lines is one way to avoid species differences between animals and humans, but studying isolated cells in culture does not reflect the complex systems-level biology of a living organism. To tackle this problem, researchers have started designing ways to model 3D human organs in vitro, such as the brain-on-a-chip system. Researchers also have envisioned using chips to model a functioning body using 10 interconnected tissues representing organs such as the heart, lungs, skin, kidneys, and liver.
Image from: http://nanoscience.ucf.edu/hickman/bodyonachip.php

Dr. Cavanaugh explained that toxicology is currently a field where chip-based screening shows promise. It makes sense that organs-on-a-chip technology could be useful for screening drug compounds before testing in animals. Chip-screening could filter out many molecules with toxic effects, thus reducing the number of compounds that are tested in animals before being investigated clinically.

A major counterpoint raised during the discussion was whether replacing animal models with human organs on a chip would simply swap one imperfect, contrived model for another. Every model has limitations, so short of directly testing therapeutics in humans, it is unlikely that we will ever create a system that perfectly reflects the biological response in patients. The question then becomes: which models are more accurate? While ample data show the limitations of animal models, very little data show that animal-free alternatives perform better. Dr. Cavanaugh argues, however, that this is an opportunity to develop such models instead of continuing to pursue research in flawed animal models. “I don’t advocate that we end all animal research right now, rather that we invest in finding alternatives to replace the use of animals with technologies that are more relevant to human biology.”

This topic can ignite a passionate debate within the medical research community. Animal models are the status quo in research, and they are the gatekeepers in bench-to-bedside translation of scientific discoveries into therapeutics. In the absence of any shift in ethical standards for research, replacing animal models with alternatives will require mountains of strong data demonstrating better predictive performance. The incentives exist, though. Drug companies spend roughly $2.6 billion to gain market approval for a new prescription drug. Taking a drug into human trials and watching it fail is a huge waste of money. If researchers could develop new models for testing drugs that were more reliable than animal models at predicting efficacy in humans, it’s safe to say that Big Pharma would be interested. Very interested.


-Mike Allegrezza

"Wistar rat" by Janet Stephens via Wikimedia Commons 

Publish or Perish: an old system adapting to the digital era

    By Annie Chen and Michael Allegrezza
      
           When scientific publishing was developed in the 19th century, it was designed to overcome barriers that prevented scientists from disseminating their research findings efficiently. It was not feasible for scientists to arrange for typesetting, peer review, printing, and shipping of their results to every researcher in their field. As payment for these services offered by publishers, the researchers would transfer the exclusive copyrights for this material to the publisher, who would then charge subscribers access fees. To limit the printing costs associated with this system, journals only published articles with the most significant findings. Now, nearly 200 years later, we have computers, word processors, and the Internet. Information sharing has become easier than ever before, and it is nearly instantaneous. But the prevailing model of subscription-based publishing remains tethered to its pre-digital origins, and for the most part these publishers have used the Internet within this model, rather than as a tool to create a new and better system for sharing research.

Figure 1. Trend lines show an annual increase of 6.7% for serials expenditures vs. 2.9% for the Consumer Price Index over the period 1986-2010, relative to 1986 prices.

In theory, digitization should have decreased the costs of communicating science: authors can perform many of the typesetting functions themselves, articles can be uploaded online instead of printed and shipped, and so on. In practice, however, digitization has actually increased the price of journals. Statistics from the Association of Research Libraries show that spending on serials increased 6.7% per year between 1986 and 2011, while inflation as measured by the US Consumer Price Index rose only 2.9% per year over the same period (Figure 1).1 Shawn Martin, a Penn Scholarly Communication Librarian, explained, “Penn pays at least twice for one article, but can pay up to 7 or more times for the same content,” in the process of hiring researchers to create the content, buying subscriptions from journals, and paying for reuse rights. To be fair, the transition from print to digital media has been costly for publishers, who have had to invest in infrastructure for digital availability while still producing print journals. Many publishers argue that while journal prices may have increased, the price per reader has actually decreased because readers can access articles online far more easily.

Regardless of whether these price increases were justified, a new model for academic publishing emerged in opposition in the 1990s: open access (OA). There are two routes to open access: Gold OA, in which the publisher makes the article freely accessible, and Green OA, in which the author self-archives the work. A few years ago, Laakso et al. conducted a quantitative analysis of the annual publication volumes of direct OA journals from 1993 to 2009 and found that the development of open access could be described in three phases: Pioneering (1993-1999), Innovation (2000-2004), and Consolidation (2005-2009).2 During the pioneering years, year-to-year growth in open access articles and journals was high, but the total numbers were still relatively small. OA publishing then bloomed considerably, growing from 19,500 articles in 740 journals in 2000 to 191,850 articles in 4,769 journals in 2009. During the innovation years, new business models emerged. For example, BioMedCentral, later purchased by Springer in 2008, pioneered charging authors a publication fee. In 2004, some subscription-based journals began using a hybrid model, such as Springer’s Open Choice program, which gave authors the option of paying a fee to make their article openly available. During the consolidation phase, year-to-year growth in articles slowed from previous years but remained high, at about 20%.

The introduction of open access journals has sparked fierce and passionate debates among scientists. Proponents of open access believe scientific research should be available to everyone, anywhere in the world. Currently, subscription fees prevent many people from accessing the information they need. With open access, students and professors in low- and middle-income countries, health care professionals in resource-limited settings, and the general public would gain access to essential resources. For instance, Elizabeth Lowenthal, MD, of the Penn Center for AIDS Research, recently published a paper in PLoS One analyzing variables that influence adherence to antiretroviral drugs in HIV-positive adolescents living in Botswana. She chose to publish open access because “the article will be of most direct use to clinicians working in Botswana and I wanted to make sure that it would be easy for them to access it.” Open access also provides re-use rights and may facilitate a more rapid exchange of ideas and increased interaction among scientists, generating new scientific information.

However, there may also be downsides to increased access. Open access may increase the number of articles people have to sift through to find important studies.3 Furthermore, readers who do not know how to critically evaluate scientific papers may be misled by articles with falsified data or flawed experiments. While such papers often get retracted later, they can undermine the public’s confidence in science and medicine. Wakefield’s (retracted) article linking vaccines to autism, for example, may have contributed to the rise of the anti-vaccine movement in the US.4 In addition, many open access journals charge authors a fee to offset the cost of publication, and some have exploited this payment system to turn a profit through predatory journals (a list of predatory OA publishers can be found here: http://scholarlyoa.com/publishers/). It is clear, though, that the expansion of open access from 1993 to the present suggests it can be a sustainable alternative to the traditional model of subscription-based academic publishing.

In addition to facilitating access to scientific articles, the Internet has created opportunities to improve the peer review process. Peer review was designed to evaluate the technical merit of a paper and to select papers that make significant contributions to a field. Scientists supporting the traditional model of publishing argue that peer review in some open access journals may not be as rigorous, which could lead to a “Wild West” in academic publishing. Last year, reporter John Bohannon of Science magazine sent a flawed paper to 304 open access journals; of the 255 journals that responded, 157 accepted the paper, suggesting little or no peer review at these journals.5 However, even high-impact journals publish papers with flawed experiments.6 Michael Eisen, co-founder of PLoS, wrote, “While they pocket our billions, with elegant sleight of hand, they get us to ignore the fact that crappy papers routinely get into high-profile journals simply because they deal with sexy topics…. Every time they publish because it is sexy, and not because it is right, science is distorted. It distorts research. It distorts funding. And it often distorts public policy.”7 Nature, for example, published two articles last year about acid-bath stem cell induction that were later retracted due to data manipulation. However, according to Randy Schekman, editor-in-chief of eLife, “these papers will generate thousands of citations for Nature, so they will profit from those papers even if they are retracted.”8

With digital communication, peer review for a manuscript could shift from a rigid gate controlled by three or four people, who might not even be active scientists, into a more dynamic, transparent, and ongoing process with feedback from thousands of scientists. Various social media platforms with these capabilities already exist, including ResearchGate9 and PubMed Commons.10 Some open access journals are using different strategies to address these issues in peer review. eLife, for example, employs a fast, streamlined peer review process to decrease the time from submission to publication while maintaining high-quality science. PLoS One, one of the journals published by the Public Library of Science, judges articles on technical merit alone, not novelty.

We polled a few Penn scientists who had recently published for their thoughts on open access and peer review. Most did not experience a difference in peer review at an open access journal compared with a non-open access journal. The exception was eLife, where reviewers’ comments were prompt and the communication between reviewers and editors was “a step in the right direction,” according to Amita Sehgal, PhD. To improve peer review, some suggested a blind process to help eliminate potential bias toward well-known labs or against lesser-known ones.

The digital revolution is changing the culture of academic publishing, albeit slowly. In 2009, the NIH updated their Public Access Policy to require that any published research conducted with NIH grants be available on PubMed Central 12 months after publication.11 Just last month, the publisher Macmillan announced that all research papers in Nature and its sister journals will be made free to access online in a read-only format that can be annotated but not copied, printed or downloaded. However, only journal subscribers and some media outlets will be able to share links to the free full-text, read-only versions.12 Critics such as Michael Eisen13 and John Wilbanks14 have labeled this change merely a public relations ploy to appeal to demands without actually increasing access. It will be interesting to see if other publishers follow this trend.

Scientific communication has yet to reap the full benefits in efficiency made possible by the Internet. The current system is still less than ideal at furthering ideas and research with minimal waste of resources. But this generation of young researchers is more optimistic and may revolutionize scientific publishing as we know it. “I think [open access is] the future for all scientific publications,” says Bo Li, a postdoc at Penn. “I hope all research articles will be freely accessible to everyone in the world.”

A companion opinion article by Penn PhD student Brian S. Cole can be found here.

This article appeared in the Penn Science Policy January 2015 newsletter



Annie Chen
Michael Allegrezza

1. Ware M, Mabe M. (2012) The STM report. http://www.stm-assoc.org/2012_12_11_STM_Report_2012.pdf
2. Laakso M, Welling P, Bukvova H, et al. (2011) The development of open access journal publishing from 1993 to 2009. PLoS ONE.
3. Hannay T. (2014) Stop the deluge of scientific research. The Guardian: Higher Education Network Blog. http://www.theguardian.com/higher-education-network/blog/2014/aug/05/why-we-should-publish-less-scientific-research.
4. Wakefield AJ, Murch SH, Anthony A, Linnell, et al. (1998) Ileal lymphoid nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children [retracted]. Lancet. 351:637-41.
5. Bohannon J. (2013) Who’s afraid of peer review? Science. 342(6154):60-65. DOI: 10.1126/science.342.6154.60
6. Wolfe-Simon F, Switzer Blum J, Kulp TR, et al. (2011) A bacterium that can grow by using arsenic instead of phosphorus. Science. 332(6034):1163-6. doi: 10.1126/science.1197258.
7. Eisen M. (2013) I confess, I wrote the Arsenic DNA paper to expose flaws in peer-review at subscription based journals. it is NOT junk. http://www.michaeleisen.org/blog/?p=1439.
8. (2014) Episode 12. The eLife podcast. http://elifesciences.org/podcast/episode12
9. ResearchGate. http://www.researchgate.net/
10. PubMed Commons. http://www.ncbi.nlm.nih.gov/pubmedcommons/
11. NIH Public Access Policy Details. http://publicaccess.nih.gov/policy.htm
12. Baynes G, Hulme L, MacDonald S. (2014) Articles on nature.com to be made widely available to read and share to support collaborative research. Nature. http://www.nature.com/press_releases/share-nature-content.html
13. Eisen M. (2014) Is Nature’s “free to view” a magnanimous gesture or a cynical ploy? it is NOT junk. http://www.michaeleisen.org/blog/?p=1668
14. Van Noorden R. (2014) Nature promotes read-only sharing by subscribers. Nature. http://www.nature.com/news/nature-promotes-read-only-sharing-by-subscribers-1.16460

The Richest Return of Wisdom

 By Brian S. Cole

    The real lesson I’ve gleaned from my time in pursuit of a PhD in biomedical research hasn’t been the research itself; indeed, many of my colleagues and I came into the program already equipped with extensive bench experience. The real eye-opener has been how science is communicated.  When I was an undergraduate, assiduously repeating PCR after PCR that quietly and dutifully failed to put bands on a gel, I just assumed that experiments always worked in the well-funded, well-respected, well-published labs that wrote the papers we read in school.  As an undergraduate, I had implicit trust in scientific publications; at the end of the PhD, I have implicit skepticism.  It turns out I’m not alone.

    The open access movement has taken a new tone in the past year: increasing recognition of irreproducibility1 and the alarming prevalence of scientific misconduct2 in highly cited journals has led to questioning of the closed review process.  Such a process denies the public access to reviewers’ comments on the work, as well as to the editorial correspondence and decision process.  The reality of the publication industry is selling ads and subscriptions, and it is likely that editors often override scientific input from peer reviewers that throws a sexy new manuscript into question.  The problem is that the public doesn’t get access to the review process, and closed peer review is tantamount to no peer review at all as far as accountability is concerned.

    For these reasons, our current scientific publication platform has two large-scale negative consequences: the first economic, and the second epistemic.  First, intellectual property rights for publicly funded research are routinely transferred to nonpublic entities that then use these rights for profit.  Second, there is insufficient interactivity within the scientific community and with the public as a result of the silo effect of proprietary journals.  The open access revolution is gaining momentum on the gravity of these issues, but to date, open access journals and publishers have largely conformed to the existing model of journals and isolated manuscripts, and while open access journals have enabled public access to scientific publications, they fail to provide the direly needed interactivity that the internet enables.

    In the background of the open access revolution in science, a 70-year-old idea3 about a new system for disseminating scientific publications was realized two decades ago on a publicly licensed code stack4 that allows not just open review, but distributed and continuous open review with real-time version control and hypertext interlinking: not just citations, but links to the actual source.  Imagine being able to publish a paper that anybody can review, suggest edits to, add links to, and discuss publicly, with every step of that ongoing process versioned and stored.  If another researcher repeats your experiment, they can contribute their data.  If you extend or strengthen the message of your paper with a future experiment, that can also be appended.  Such a platform would utterly transform scientific publication from a series of soliloquies into an evolving cloud of interlinked ideas.  We’ve had that technology for an alarmingly long time given its lack of adoption by researchers, who continue to grant highly cited journals ownership over work the public has already paid for.

    I’ve kicked around the idea of a Wikiscience5 publication system for a long time with a lot of scientists, and the concerns that came up were cogent and constructive.  In testament to the tractability of a wiki replacement for our system of scientific publication is Wikipedia, one of the greatest gifts to humankind ever to grace the worldwide web.  The distributed review and discussion system that makes Wikipedia evolve does work, and most of us are old enough to remember a time when nobody thought it would.  But how can we assess impact and retain attribution in a distributed publication and review system such as a wiki?  Metrics such as journal impact factor and article-level metrics wouldn’t directly apply to a community-edited, community-reviewed scientific resource.  Attribution and impact assessment are important challenges to any system that aims to replace our journal and manuscript method for disseminating scientific information.  While a distributed scientific information system would not easily fit into the context of the current metrics for publication impact that are an intimate part of the funding, hiring, and promotion processes in academia, the consideration of such a system presents an opportunity to explore innovative analyses of the relevance and impact of scientific research.  Indeed, rethinking the evaluation of scientists and their work6 is a pressing need even within the context of the current publication system.

    We should be thinking about the benefit of the networked consciousness of online collectivism, not the startling failures of our current publication system to put scientific communication into the hands of the public that enabled it, or even the challenges in preserving integrity and attribution in a commons-based peer production system.7  We are the generation that grew up with Napster and 4chan, the information generation, the click-on-it-and-it’s-mine generation, born into a world of unimaginable technological wealth.  Surely we can do better than paywalls, closed peer review, and for-profit publishers.  We owe it to everybody: as Emerson put it, “He who has put forth his total strength in fit actions, has the richest return of wisdom.” 8


This article accompanies a feature piece about scientific publishing in the digital era and also appeared in the Penn Science Policy Group January 2015 newsletter

Brian S. Cole

1. Ioannidis, John P. A. "Why Most Published Research Findings Are False." PLoS Medicine 2.8 (2005): E124.
2. Stern, Andrew M., Arturo Casadevall, R. Grant Steen, and Ferric C. Fang. "Research: Financial Costs and Personal Consequences of Research Misconduct Resulting in Retracted Publications." eLife 3 (2014).
3. Bush, Vannevar. "As We May Think." The Atlantic. Atlantic Media Company, 01 July 1945.
4. "MediaWiki 1.24." MediaWiki. <http://www.mediawiki.org/wiki/MediaWiki_1.24>.
5. "WikiScience." Meta. <http://meta.wikimedia.org/wiki/WikiScience>.
6. "San Francisco Declaration on Research Assessment." American Society for Cell Biology. <http://www.ascb.org/dora/>.
7. "3. Peer Production and Sharing." <http://cyber.law.harvard.edu/wealth_of_networks/3._Peer_Production_and_Sharing>.
8. Emerson, Ralph W. "The American Scholar." <http://www.emersoncentral.com/amscholar.htm>.

Penn researchers interview HIV-positive adolescents in Botswana to better understand the factors affecting adherence to antiretroviral treatments

Of the more than three million children infected with HIV, 90% live in Africa. As HIV-positive children become adolescents, it is important that antiretroviral treatments are maintained to protect their own health, as well as to safeguard the adolescents from developing resistant strains of HIV and to prevent infection of other individuals.

HIV-positive adolescents’ adherence to these treatments has been identified as a public health challenge for Botswana. However, the assessment tools testing psychosocial factors that are likely associated with poor adherence have been developed in Western countries and their constructs may not be relevant to African contexts. A new study published in PLOS ONE by Penn researchers Elizabeth Lowenthal and Karen Glanz described the cultural adaptation of these assessment tools for Botswana.

The psychosocial assessments investigate factors that may affect adolescents’ adherence to antiretroviral treatments. As Lowenthal summarized, “one of the key reasons why adolescents with HIV have higher rates of death compared with people with HIV in other age groups is that they have trouble taking their medications regularly.”

Researchers looked at the following factors by testing 7 separate assessment scales developed with Western cohorts for their applicability to Botswanan adolescents.
  • Psychological reactance: an aversion to abiding by regulations that impose upon freedom and autonomy
  • Perceived stigma
  • Outcome expectancy: whether treatments were expected to improve health
  • Consideration of future consequences: the extent to which adolescents plan for their futures rather than focusing on immediate gains
  • Socio-emotional support: how adolescents receive the social and emotional support they need

The researchers conducted in-depth interviews with 34 HIV-positive Botswanan adolescents, subgrouped by age so that the factors could be discussed in ways participants could understand.

The study confirmed the construct validity of some assessment tools, but highlighted several areas that caused tools to not relate to participants:
  • Socio-emotional support for the adolescents mostly came from parents rather than peers.
  • Denial of being HIV infected was more common than expected.
  • Participants were surprisingly ambivalent about taking their medicine.

Some of the tools (psychological reactance, consideration of future consequences) required major modifications to obtain construct validity for adolescents with HIV in Botswana. The assessment tools were modified during the course of the study based on participant feedback. Future research will test the association between these modified assessment tools and HIV treatment outcomes in order to provide insight into how best to support HIV-infected adolescents.

First author Lowenthal suggested that the study could inform studies of adolescent adherence to other treatments as well, stating that “questions that we are able to answer in our large cohort of HIV-positive adolescents will likely be generalizable to other groups of adolescents with chronic diseases.”

-Barbara McNutt 

Penn researchers identify novel therapeutic target for kidney cancer


Kidney cancer, also known as renal cancer, is one of the ten most common cancers in both men and women. The American Cancer Society’s most recent estimates predict 63,920 new cases of kidney cancer this year, with roughly 20% of patients dying from the disease. By far the most common type of kidney cancer is renal cell carcinoma (RCC). The majority of RCCs are clear cell RCCs (ccRCCs), a subtype characterized by metabolic alterations, specifically increased carbohydrate and fat storage. More than 90% of ccRCCs have been found to have mutations in the von Hippel-Lindau (VHL) tumor suppressor gene; however, kidney-specific VHL deletion in mice does not induce tumorigenesis or cause metabolic changes similar to those seen in ccRCC tumors. So what additional factors are needed for ccRCC tumor formation and progression? A recent study by Penn researchers published in the journal Nature identified the rate-limiting gluconeogenesis enzyme fructose-1,6-bisphosphatase (FBP1) as a key regulator of ccRCC progression.

To better understand ccRCC progression, the study’s first author, Bo Li, a post-doctoral researcher in the lab of Dr. Celeste Simon, performed metabolic profiling on human ccRCC tumors while also analyzing ccRCC metabolic gene expression profiles. Compared to the adjacent normal kidney tissue, ccRCC tumors had increased amounts of metabolites involved in sugar metabolism and significantly lower expression of carbohydrate storage genes, including FBP1. Further investigation revealed FBP1 expression was reduced in almost all tumor samples tested (>600) and reduced FBP1 expression strongly correlated with advanced tumor stage and poor patient survival. Thus, understanding the role of FBP1 in ccRCCs could significantly impact the treatment of this disease.

How do reduced levels of FBP1 promote ccRCC tumor progression? The authors found that FBP1 depletion in ccRCC cells stimulates growth and relieves inhibition of sugar breakdown (glycolysis), which provides energy for the growing cancer cells. In addition, VHL mutations associated with ccRCCs prevent the degradation of a transcription factor that responds to decreases in oxygen, known as hypoxia-inducible factor α (HIFα), thus stabilizing it. Stabilized HIFα does not cause FBP1 depletion, but its activity is tightly regulated by FBP1. This study emphasized the importance of the interaction between HIFα and FBP1, particularly when glucose and oxygen levels are low, for the formation and progression of the ccRCC.

Why is this work so important? Little is known about how changes in cell metabolism contribute to the formation and progression of ccRCC tumors. As stated by Li, “elucidating how FBP1 impacts the altered metabolic and genetic programs of ccRCC improves our knowledge of the molecular details accompanying ccRCC progression, and identifies novel therapeutic targets for this common malignancy.” Future work may focus on identifying how FBP1 is suppressed and whether reversing FBP1 suppression could improve patient outcomes. 

-Renske Erion

Purdue professor Dr. Sanders responds to commentary about his Ebola interview with Fox News

Last month I analyzed the media coverage of Ebola in a post where I dissected an interview between Fox News reporters and Dr. David Sanders. I was recently contacted by Dr. Sanders, who wished to clarify a few issues that I raised in my article. The purpose of my post was to demonstrate how the media sometimes covers scientific issues in ways that exaggerate and oversimplify concepts, which can potentially mislead non-scientist citizens.

I stated that the way Dr. Sanders described his research sounded a little misleading. I intended to convey how I thought an average non-scientist listener might interpret the dialogue. However, Dr. Sanders points out that he was careful with his wording to avoid possible confusion. He explained, “as you have pointed out, one says one thing, and the media (and the Internet) render it as something else.  I would just like to point out that I carefully stated that Ebola can ENTER human lung from the airway side; I never said infect.  I also try to avoid the use of the term ‘airborne’ because of the confusion about its meaning.”

He also made several good scientific points about the validity of using pseudotyped viruses and about comparisons to other viruses when considering the potential for a change in Ebola transmission.

“Pseudotyped viruses are used widely for studying viral entry, and I know of no examples where the conclusions on the cell biology of the entry of pseudotyped viruses have been contradicted by studies of entry of the intact virus despite such comparisons having been published numerous times.” 

“When we discovered that there was maternal-child transmission of HIV was that a new mode of transmission or merely a discovery of a previously unknown mode of transmission? How was Hepatitis C transmitted between humans before injections and blood transfusions? I don't know either. How is Ebola virus transmitted between fruit bats or from fruit bats to humans? Perhaps modes of transmission differ in their efficiency. The HIV comparison with Ebola ("HIV hasn't become airborne") is fallacious given the cell biology of entry for the two viruses.  The receptors for HIV (the CD4 attachment factor and the chemokine receptor) are present on blood cells and not on lung tissue.  The receptors for Ebola are present on a diverse set of cells including lung cells. In addition, Influenza A switches in real time from a gastrointestinal virus in birds to a respiratory virus in mammals--not that many mutations required.”

Additionally, he wisely pointed out that “precedent may be a valid argument in medical practice or the law, but it is not valid in science.” In fact, science seeks to uncover things that were previously unknown, and thus were without precedent.

I appreciate Dr. Sanders' response to my article. I think that rational and in-depth discussions about science need to happen more frequently in the media. Short, simplified stories with shock-factor headlines only detract from the important conversations that are necessary to find practical solutions to challenges like Ebola.

-Mike Allegrezza

Penn researchers identify neurons that link circadian rhythms with behavioral outcomes

Our bodies evolved to alternate rhythmically through sleep and wake periods with the 24-hr cycle of the day. These “circadian rhythms” are controlled by specific neurons in the brain that act as molecular clocks. The experience of jet lag when we change time zones is the out-of-sync period before the brain’s internal clock re-aligns with the external environment.

How does this molecular clock work in the brain? Decades of research have uncovered that environmental signals, such as light, are integrated into a circadian clock by specific neurons in the brain. However, little is understood about how these circadian clock cells drive biological effects such as sleep, locomotion, and metabolism. A study by Penn researchers published earlier this year in Cell has discovered critical neural circuits linking the circadian clock neurons to behavioral outputs.

The researchers used the fruit fly Drosophila as a model organism because, like humans, flies have circadian rhythms, yet they are very easy to manipulate genetically, and many powerful tools exist to study the 150 circadian clock neurons in their brains. The study found that a crucial part of the circadian output network exists in the pars intercerebralis (PI), the functional equivalent of the human hypothalamus.

“Flies are normally active during the day and quiescent at night, but when I activate or ablate subsets of PI neurons, they distribute their activity randomly across the day,” describes the study’s first author, Daniel Cavanaugh, PhD, a post-doc working in the lab of Amita Sehgal, PhD. Importantly, the research showed that modulating the PI neurons led to behavioral changes without affecting the molecular oscillations in central circadian clock neurons, indicating that the PI neurons link signals from the circadian clock neurons to behavioral outputs.

The study also showed that the PI neurons are anatomically connected to core clock neurons using a technique involving the fluorescent protein GFP. Cavanaugh explains, “The GFP molecule is split into two components, which are expressed in two different neuronal [cell] populations. If those populations come into close synaptic contact with one another, the split GFP components are able to reach across the synaptic space to reconstitute a fluorescent GFP molecule, which can be visualized with fluorescence microscopy.”

Additionally, their experiments showed that a peptide called DH44, a homolog to the mammalian corticotropin-releasing hormone, is expressed in PI neurons and important for maintaining circadian-driven behavioral rhythms.

While these new data are interesting for understanding general mechanisms of biology, they also have implications for human health and disease.

“People exposed to chronic circadian misalignment, such as occurs during shift work, show increased rates of heart disease, diabetes, obesity, cancer, and gastrointestinal disorders,” says Cavanaugh. “In order to understand the connection between circadian disruption and these diseases, we have to understand how the circadian system works to control the physiological outputs that underlie these disease processes.”

-Mike Allegrezza

Fox News demonstrates both good and bad ways to cover Ebola

Some news outlets, including Fox, have been wildly spreading fears about Ebola. As an example of both good and bad ways the media covers science, let’s take a look at a recent clip from Fox News in which reporters interview Dr. David Sanders about the possibility of Ebola virus mutating to become airborne-transmissible (right now it is spread only by direct contact!).



Their story is titled "Purdue professor says Ebola 'primed' to go airborne." Here is a link to the video.

I’ll start off with the good things:

1) Dr. Sanders did a good job explaining that Ebola is not airborne right now, but there is a "non-zero" probability that Ebola might mutate to infect the lungs and become air transmissible. And this probability increases as more people are infected.
2) The newscasters did a good job of accurately recapping what he was explaining without blowing it out of proportion.

Now for some bad things:

1) Quite obviously, the scare-you-into-clicking-on-it title. First of all, it's completely misleading for the sole purpose of grabbing attention (it got me!). Second of all, it's completely false. I watched it three times and Dr. Sanders never said "primed." So it is blatantly incorrect.
2) They did not include coverage of other scientists that claim the fears of airborne transmission are over-hyped because there are no instances of that ever happening naturally for a virus that infects humans. HIV and hepatitis are both good examples that have infected millions without changing their route of transmission.
3) The way Dr. Sanders describes his published research is a little misleading in the context of this story. He makes it sound as though the research demonstrated that Ebola virus can infect the lungs. In fact, the actual study showed that if you take some of the proteins from the surface of Ebola and put them on a completely different virus (in this case a feline lentivirus, similar to HIV), that virus can infect human airway epithelial cells grown in cell culture. So this research did not use the full Ebola virus, and it did not demonstrate infection in a live animal model. Link to the study here: http://www.ncbi.nlm.nih.gov/pubmed/12719583

Some of these negative aspects might be a consequence of the brevity of this story. However, in an information-dense world, people get the news in short snippets, so the media needs to be careful not to compromise accuracy.

Interestingly, on the same network, Shep Smith reported on Ebola with commendable accuracy. He communicated the facts clearly and concisely while criticizing “hysterical” reporting as “irresponsible.” 

I hope future reports from Fox News and the rest of the media follow his tone.

*Update Nov 19, 2014: A follow up to this post detailing a thoughtful response from Dr. Sanders can be found here.


At the interface of science and society - a career fostering public interest in science at The Franklin Institute

Credit: The Franklin Institute
Everybody loves science museums. Their fun and interactive way of presenting science reconnects you with your childhood self, when you were curious, when you wondered, and when you were so amazed that you could only manage to say, “Wow!” But what is it like to work at a science museum?

On Wednesday, we hosted Jayatri Das, PhD, to describe her career engaging the public with science as the Chief Bioscientist at The Franklin Institute. As you would expect, her transition from the lab into the museum was cultivated by a strong interest in outreach and teaching. After receiving her PhD from Princeton, she gained experience as a Christine Mirzayan Science and Technology Policy Fellow developing programs for the Marian Koshland Science Museum in Washington, DC. Following a short post-doctoral appointment, she landed a position with The Franklin Institute, an opportunity that she partly ascribes to fortuitous timing, as PhD level positions at museums are rare.

In her job she embraces a new paradigm for how science should interact with society. The goal is no longer public understanding of science. Rather, she urges, we should strive for public engagement with science. “We want to communicate to our visitors that they are part of the conversation on how we use science and technology,” she says.

Science and technology do not exist in a void. Jayatri laid out three principles:

1) Values shape technology.
2) Technology affects social relationships.
3) Technologies work because they are part of systems.

As an example, consider nanotechnology. This field has opened new possibilities for quantum computing, high-tech military clothing, flexible inexpensive solar panels, clean energy, simple water filters, and new cancer treatments; even invisibility cloaks and elevators into space have been envisioned. But which of these technologies are developed will depend on the values of those funding the research and the circumstances driving market demand for them. Priorities would be different for a wealthy businesswoman in Japan, a US-trained Iraqi soldier, a European who lost a spouse to cancer, and a cotton farmer in India.

As she points out, “Investments [in R&D] are being made by people with values different than most of the world’s population.” Therefore, it is important to challenge people to think globally.

Why are science museums a great place for these conversations? First, they provide trusted and stimulating information. Second, they are a place where people can reflect on science, technology, and the world. And third, they are a place for conversation because many visitors attend in groups.

Part of her job involves designing the many ways that The Franklin Institute engages the public with science, which in addition to interactive exhibits includes public programs, digital media, and partnerships with schools and communities. For instance, she recently led a public discussion about concussions in sports. The all-ages audience was presented with the neuroscience of head trauma and testimony from former Eagles’ linebacker Jeremiah Trotter, and then they discussed what age kids should be allowed to play tackle football.

Because science and technology are so integrated into our lives now, conversations like these are crucial. In order for breakthroughs to be beneficial for society, they have to interact with public attitudes and values. This communication between science and society occurs naturally at science museums, so they offer fulfilling positions for people like Jayatri who are motivated to connect the frontiers of science with casual visitors. 


Interested in volunteering? You can find information here

Bioethics/Policy Discussion - Storage and Weaponization of Biological Agents (Biosafety)

This summer has seen a surge in discussion over biosafety. Should we still be storing smallpox? Is the risk of bioterrorism greater now in the post-genomic era? Should we artificially increase virulence in the lab to be prepared for it possibly occurring in the environment?

On Tuesday the Penn Science Policy Group discussed the issue of biosafety as it relates to potential uses of biological weapons and risks of accidental release of pathogens from research labs.

The idea of using biological weapons existed long before cells, viruses, and bacteria were discovered. Around 1500 B.C.E., the Hittites of Asia Minor deliberately sent diseased victims into enemy lands. In 1972, an international treaty known as the Biological Weapons Convention officially banned the possession and development of biological weapons, but that has not ended bioterrorist attacks. In 1984, a cult in Oregon tried to rig a local election by poisoning voters with salmonella. Anthrax has been released multiple times: in Tokyo in 1993 by a religious group, and in 2001 it was mailed to US congressional members. And recently, an Italian police investigation alleged the existence of a major criminal organization run by scientists, veterinarians, and government officials that attempted to spread avian influenza to create a market for a vaccine they illegally produced and sold.
Graphic by Rebecca Rivard

Possibilities for bioterrorism are now being compounded by advances in biological knowledge and the ease of digital information sharing, which raises the question: should we regulate dual-use research, defined as research that could be used for beneficial or malicious ends? Only in the last few years have funding agencies officially screened proposals for potential dual-use research. After two research groups reported studies in 2012 that enhanced the transmissibility of H5N1 influenza viruses, the US Department of Health and Human Services created a policy for screening dual-use research proposals. These proposals have special requirements, including that researchers submit manuscripts for review prior to publication.

We debated whether censorship of publication was the appropriate measure for dual-use researchers. Some people wondered how the inability to publish research findings would affect researchers’ careers. Ideas were proposed that regulations on dual-use research be set far in advance of publication to avoid a huge waste of time and resources. For instance, scientists wishing to work on research deemed too dangerous to publish should be given the chance to consent to censorship before being funded for the study.

In addition to concerns about bioterrorism, public warnings have been issued over the accidental escape of pathogens from research labs, fueled by recent incidents this past summer involving smallpox, anthrax, and influenza. Some caution that the risks of laboratory escape outweigh the benefits gained from the research. Scientists Marc Lipsitch and Alison Galvani calculate that ten US labs working with dangerous pathogens for ten years would run a 20% chance of a laboratory-acquired infection, a situation that could possibly spark an outbreak. On July 14, a group of academics called the Cambridge Working Group released a consensus statement that “experiments involving the creation of potential pandemic pathogens should be curtailed until there has been a quantitative, objective and credible assessment of the risks, potential benefits, and opportunities for risk mitigation, as well as comparison against safer experimental approaches.”
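The arithmetic behind an estimate like that 20% figure can be sketched as a simple compounding-risk calculation. The 0.2% per-lab-year risk used below is an assumed illustrative rate for the sketch, not a number taken from Lipsitch and Galvani's analysis:

```python
# Compounding risk of at least one laboratory-acquired infection across
# many independent lab-years. The 0.2% per-lab-year rate is an assumed
# illustrative value, not a figure from the published analysis.

def cumulative_infection_risk(per_lab_year_risk, labs, years):
    """P(at least one infection) = 1 - P(no infection in any lab-year)."""
    lab_years = labs * years
    return 1.0 - (1.0 - per_lab_year_risk) ** lab_years

# Ten labs working for ten years at an assumed 0.2% risk per lab-year:
risk = cumulative_infection_risk(0.002, labs=10, years=10)
print(f"{risk:.0%}")  # about 18%, in the neighborhood of the quoted 20%
```

The point of the exercise is that even a small per-year risk compounds quickly once many labs run for many years.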

In defense of the research is the group Scientists for Science, which contends, “biomedical research on potentially dangerous pathogens can be performed safely and is essential for a comprehensive understanding of microbial disease pathogenesis, prevention and treatment.”

Our discussion of this research also demonstrated the divide. Some pointed out that science is inherently unpredictable, so estimating the possible benefits of research is difficult; by that logic, the best way to learn about highly pathogenic viruses is to study them directly. Another person mentioned the argument that studying these viruses in ferrets (as was done with the controversial influenza experiments in 2012) is safe because those virus strains don’t infect humans. However, Nicholas Evans, a Penn bioethicist and member of the Cambridge Working Group, argued in a recent paper that claiming ferret research is safer because the virus doesn’t infect humans also implies that the work has limited scientific merit, because it is then not relevant to human disease.

There seems to be agreement that research on potentially dangerous and dual-use agents should be looked at more closely than it has been. The debate really centers on how much oversight and what restrictions are placed on research and publications. With both Scientists for Science and the Cambridge Working Group accruing signatures on their statements, it is clear that the middle ground has yet to be found.


PSPG hits the streets to explain how genes make us who we are

           With help from the University of Pennsylvania and GAPSA, PSPG was able to run a volunteer booth at the Philly Science Carnival on May 3rd. The carnival was part of the annual 9-day Philly Science Festival which provides informal science educational experiences throughout Philadelphia’s many neighborhoods. The title of the PSPG exhibit was “Who owns your genes?” and featured activities for children and adults alike to educate visitors about how genes make us who we are, what we can and cannot learn from personalized genomics services like 23andMe, and how several biotech companies have attempted to patent specific genes.
           Kids learned how genes act as the instructions for building an organism by drawing alleles for different traits out of a hat and using the genotype to decide how to put together a “monster.” In so doing, they were exposed to the basic principles of genetics (dominant vs. recessive alleles, complete vs. incomplete dominance, and codominance), and they got to leave with a cute pipe-cleaner monster too.
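The rule the kids applied, draw one allele from each parent and let the dominant allele win, can be sketched as a short simulation. The trait and allele names below are invented for illustration, not the ones actually used at the booth:

```python
import random

# A toy version of the allele-drawing activity: pick one allele from
# each parent "hat" and apply complete dominance. "H" = dominant pointy
# horns, "h" = recessive round horns (an invented example trait).

def draw_genotype(rng):
    """Draw one allele from each parent, e.g. 'Hh'."""
    return rng.choice("Hh") + rng.choice("Hh")

def phenotype(genotype):
    """Complete dominance: a single 'H' is enough for pointy horns."""
    return "pointy horns" if "H" in genotype else "round horns"

rng = random.Random()
monster = draw_genotype(rng)
print(monster, "->", phenotype(monster))
```

Incomplete dominance or codominance would just swap in a different `phenotype` rule that treats the heterozygote as its own outcome.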
           For our older visitors we presented actual results from a 23andMe single nucleotide polymorphism (SNP) report generously provided by one of our own members (advocacy coordinator Mike Convente). This part of the exhibit walked visitors through the process of genotyping SNPs, single nucleotide bases that vary between individuals and can offer hints about ancestry, physical traits, and possibly disease risk, as well as the implications of bringing these types of tests to the general public. Right now the Food and Drug Administration is trying to figure out how to regulate services like these, which provide genetic information directly to consumers without a qualified middle-man (such as a doctor or geneticist) to explain the complicated results.
           Our exhibit also featured a section entitled “How Myriad Genetics Almost Owned Your Genes” which highlighted the recent Supreme Court case brought against a biotech company that wished to patent two genes (BRCA1 and BRCA2) involved in the development of breast cancer. The genes were discovered at the University of Utah in a lab run by Mark Skolnick, who subsequently founded Myriad Genetics. Myriad went on to develop a high-throughput sequencing assay to test patients for breast cancer susceptibility and eventually obtained patents for both genes. This was controversial for several reasons: 1. These genes exist in nature in every human being and are not an invention; 2. The genes were originally discovered with public funding; and 3. Myriad had a monopoly on testing for BRCA mutations and prevented universities and hospitals from offering the tests. Last year, in Association for Molecular Pathology v. Myriad Genetics, several medical associations, doctors and patients sued Myriad to challenge the patents, and the Supreme Court ruled that naturally occurring genes are products of nature and therefore not patentable (however, synthetically-made complementary DNA is still eligible for patenting). It is likely that the patenting of DNA sequences will continue to be an issue in the future considering recent advances in the field of synthetic biology.

Genetically-modified food is not going to give you cancer



Last week PSPG and the Penn Biotech Group hosted Dr. Val Giddings, President and CEO of the consulting firm PrometheusAB and Senior Science Policy Fellow at the Information Technology and Innovation Foundation. Dr. Giddings specializes in issues concerning genetically-modified organisms (GMOs) or, as he prefers to call them, “biotech-improved” organisms, which have been genetically engineered to have certain beneficial traits. This usually means that a gene from one organism is inserted into the genome of a different organism to alter its properties or behavior in some beneficial way. GMO crops are frequently altered to improve tolerance to herbicides (think RoundUp) and resistance to insects and pathogens. They can also be modified to change their agronomic qualities (how/when they grow), which helps farmers be more productive. Crops can also be modified to improve their quality: for example, Golden Rice has been engineered to produce beta-carotene, the precursor to vitamin A, an essential nutrient that many children in developing countries don’t get enough of1,2. GMO crops are quite prevalent in US agriculture, with over 90% of soybeans, 80% of cotton and 75% of corn crops in the US being genetically modified in some way3. Outside of the US, GMO crops are grown in 27 countries by 18 million farmers, most of whom are smallholders in developing countries4. So what are the consequences of all these genetic modifications in our food supply?


The Pros: GMO crops with improved agronomic properties have allowed farmers to increase yield on less land, reducing CO2 output and allowing more wild habitats to remain untouched. GMOs have also decreased the need for pesticides because insect-resistant plants fight off pests on their own, which is good for the environment and good for you. GMO crops that have been modified to increase yield and produce essential nutrients could be a boon for developing countries where hunger and vitamin deficiencies are a serious problem.

The Cons: GMOs could lead to the overuse of herbicides like RoundUp because herbicide-tolerant plants can be sprayed more often with more chemicals. However, Dr. Giddings argued that herbicide-tolerant crops would have to be treated less often because the weeds could be killed off quickly in one fell swoop so there may be a trade-off there. I think the most serious concerns about GMO crops primarily relate to unintended ecological consequences. GMO crops, if they somehow escaped the farm and started growing wild, might out-compete other plants and reduce overall biodiversity. They could also seriously disrupt the food chain, especially considering that they can kill off insect species which are undoubtedly a food source for other animals. And then there’s the question of whether GMOs are safe to eat. There are concerns that GMO crops which produce foreign proteins (such as those that kill off insects) might trigger allergic reactions in some individuals; however there have never been any legitimate reports of this happening. There are also concerns that GM foods could cause cancer; however rigorous, peer-reviewed scientific studies have effectively ruled out this scenario. In fact the most prominent study to claim a link between GMOs and cancer had to be retracted because the sample size was too small to draw any conclusions and the authors used a rat strain which was known to have a high frequency of cancer to begin with5. The bottom line is that there is no evidence that GMOs are bad for you and the Food and Drug Administration has concluded that GMOs are safe to eat6.

Existing federal law requires food labels to be accurate, informative and not misleading. Nutrition labels must contain material information relating to health, safety and nutrition. The fact of the matter is that GMOs are considered safe so there is no reason for the FDA to force companies to identify their products as GMO. Basically the FDA decided that genetically modified foods are subject to the same labeling rules as any other food. Here are the highlights from FDA’s recommendations on how to label GM food7:


  • If a bioengineered food is significantly different from its traditional counterpart such that the common or usual name no longer adequately describes the new food, the name must be changed to describe the difference.

  • If a bioengineered food has a significantly different nutritional property, its label must reflect the difference.

  • If a new food includes an allergen that consumers would not expect to be present based on the name of the food, the presence of that allergen must be disclosed on the label.


However this doesn’t mean consumers are completely in the dark about what they’re buying. If you wish to avoid GM foods you can buy food labeled “USDA Organic” or “Non-GMO certified.” Otherwise it’s probably safe to assume a product includes some kind of bioengineered ingredient.
So why the big fuss over GMOs? It’s pretty clear that bioengineered foods are safe to eat and GMOs are probably more helpful than hurtful to the environment. Dr. Giddings offered a few explanations for the widespread resistance to GMOs in the Western world. Firstly, the organic/health food industry reaps big profits by distinguishing itself as a healthy, safe alternative to Big Ag. Secondly, and more importantly, food is a huge part of every human being’s life, and nobody likes the idea of it being messed with in ways they might not understand. It’s especially disconcerting when huge tentacle-y corporations are responsible for these changes. So with all these considerations in mind, it’s up to you, the consumer, to decide whether genetically modified foods are worth the risks. Feel free to comment if there are any issues I omitted that you think are worth noting.

-Nicole Aiello 


1. Ye X, Al-Babili S, Klöti A, Zhang J, Lucca P, Beyer P, Potrykus I (2000) Engineering the provitamin A (β-carotene) biosynthetic pathway into (carotenoid-free) rice endosperm. Science 287:303-305.
2. Grune T, Lietz G, Palou A, Ross AC, Stahl W, Tang G, Thurnham D, Yin S, Biesalski HK (2010) β-Carotene is an important vitamin A source for humans. Journal of Nutrition doi: 10.3945/jn.109.119024.
3. US Department of Agriculture (USDA). Economic Research Service (ERS) 2013. Adoption of Genetically Engineered Crops in the US data product.
4. Clive James, ISAAA Brief 46.
5. Séralini, Gilles-Eric; Clair, Emilie; Mesnage, Robin; Gress, Steeve; Defarge, Nicolas; Malatesta, Manuela; Hennequin, Didier; De Vendômois, Joël Spiroux (2012). "Long term toxicity of a Roundup herbicide and a Roundup-tolerant genetically modified maize". Food and Chemical Toxicology 50 (11): 4221–31.

The unintended impact of impact factors



Dr. Mickey Marks of UPenn stopped by PSPG yesterday to discuss the San Francisco Declaration on Research Assessment (DORA), which calls for new metrics to determine the value of scientific contributions. The metric in question is Thomson Reuters’ Impact Factor (IF), which was developed in the 1970s to help libraries decide which journals to curate. Since then, IF has taken on an inflated level of importance that can even influence promotion and hiring decisions. But can a single number really summarize the value of a scientific publication?
IF is calculated by dividing the number of citations a journal’s articles received in a given year by the number of citable articles the journal published over the previous two years. One reason Dr. Marks became involved in DORA is that he is co-editor of a journal whose IF had been steadily dropping over the last few years, a trend experienced by numerous other cell biology journals. This led many in the field to question whether IF is really accurate and useful. As you might imagine, there are many factors that can skew IF one way or another: for example, in some fields papers are slower to catch on and might not start accumulating citations until well past the two-year window over which IF is calculated. Journal editors can also game the system by reducing the number of “citable” articles they publish: citable articles must be a certain length, so if a journal publishes many short articles it can shrink the denominator and inflate its IF. So how reliable is the IF system? Are journals with a high IF really presenting the best science? A few years ago the editors at one journal (Infection and Immunity) set out to address that very question, and the answer may (or may not) surprise you. The editors found a strong correlation between IF and retractions (see graph).

Infect. Immun. October 2011 vol. 79 no. 10 3855-3859
          Why are these high-impact journals forced to retract at such a high rate? It might be because their editors are looking for sexy science (because that’s what sells) and may be willing to overlook sloppy research conduct to print an exciting story. Another reason may be that researchers are under extreme pressure to publish in these journals and are willing to omit inconsistent data and let mistakes, and even misconduct, slip through to keep the story neat. And this brings us to the real problem presented by IF: in some circles, individual researchers’ scientific contributions are judged almost entirely on which journals they publish in. Scientists learn very early in their careers that if you want a faculty job you need to publish in Science, Nature and Cell. This is because it is faster and easier to draw conclusions based on a journal’s reputation than to actually read an applicant’s publications.
There are a few alternatives to IF, including Eigenfactor and SCImago, which account for the influence of the citing journal using algorithms similar to Google’s PageRank. These alternatives generally produce rankings similar to IF, however. The real issue isn’t the rankings themselves but how we as scientists use them. If the system is going to change, it will have to start with us: scientists must decide together to de-emphasize impact factors and publication rankings when making decisions about promotions, hiring and grants.
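The formula at the center of this debate reduces to a single division, which also makes the denominator trick described earlier easy to see. All counts below are invented for illustration and describe no real journal:

```python
# Impact factor as a single division. The counts are invented
# illustrative numbers, not data for any real journal.

def impact_factor(citations_this_year, citable_items_prev_two_years):
    """Citations received this year to articles from the previous two
    years, divided by the citable articles published in those years."""
    return citations_this_year / citable_items_prev_two_years

# A hypothetical journal: 1,200 citations in 2014 to its 2012-2013
# articles, of which 300 counted as "citable."
print(impact_factor(1200, 300))  # 4.0

# The gaming trick: publish short pieces that attract citations but
# don't count as citable, shrinking the denominator only.
print(impact_factor(1200, 250))  # 4.8
```

Shaving 50 articles out of the denominator raises the hypothetical journal's IF by 20% without a single extra citation.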

Nicole Aiello
PSPG Communications