High tech living — is there high tech dying too?

Every once in a while I read something that plunges me into deep philosophical mode and I have sympathy for the poor souls who must deal with my sometimes week-long, sometimes month-long tendency to shift all conversation to questions about how others view issues related to either a. the distribution of wealth, b. dying, or c. both of the above. Somehow these topics tend to fall into the categories of things you don’t discuss at the dinner table along with religion and politics (maybe because they are related to both).

Blogs are probably like the dinner table. But sometimes, I think the avoidance of difficult topics only heightens the barrier to understanding them. This week, I read an interview titled “The Long Goodbye: Katy Butler on How Modern Medicine Decreases Our Chance of A Good Death.”

The interview stands in contrast to much of the positive media surrounding advances in medical technology, and indeed technology in general. There is a tendency to emphasize how simple technology makes human living, and a reluctance to discuss how difficult technology makes human dying. But has dying become more difficult?

I agree that there are many ways in which technology has made 21st century living easier. Credit goes out to microwave ovens, smart phones, cars that park themselves and the FM radio toaster. We also have technology and trade that allow Pennsylvanians to eat bananas in March, farmers to extend the growing season and teachers to expand the classroom, reaching students on many different continents (shout out to CIDD’s MOOC). In an age of hyper-connectivity, increased access to affordable technology and sophisticated means of communication, we have changed many facets of human living.

In some ways we have gone from simple to simpler. In others, we have gone from simple to unforeseen complexity. We turned creating fire into a light switch. Simple. We turned talking to each other into waiting on phone lines, constant refreshing of email, Facebook poking. Kind of complex. In modern medicine, we have turned most infectious diseases into outpatient cases with simple pills. But the same science has also turned our experience of suffering from something human into something that we fear, heighten and prolong.

In the interview mentioned above and published this month in The Sun magazine, Katy Butler, journalist and author of Knocking on Heaven’s Door: The Path to a Better Way of Death, advocates for the Slow Medicine movement and expresses her thoughts on how our medical industry and culture have altered the experience of death in modern times. In the interview, Butler says, “Death used to be a spiritual ordeal; now it’s a technological flailing. We’ve taken a domestic and religious event, in which the most important factor was the dying person’s state of mind, and moved it into the hospital and mechanized it, putting patients, families, doctors, and nurses at the mercy of technology.” Butler’s perspective comes from her experience with her father, who, after suffering a stroke and undergoing surgery to implant a pacemaker, lived an extended life that she felt only prolonged suffering.

The interview calls to mind the story of Vivian Bearing, the central character in one of my favorite plays, W;t by Margaret Edson (the film adaptation is on YouTube). In the play, Vivian Bearing is diagnosed with metastatic ovarian cancer. As an English professor, she seeks comfort in the poetry of John Donne, and slowly encounters a new understanding of death, which was once a motif she studied only in the classroom. “We are discussing life and death, and not in the abstract, either; we are discussing my life and my death […]” Surrounded by medical staff focused on the outcome of her disease from a research perspective, she becomes frustrated by how unromantic, how brutal and terrifying aging has become in contrast to its sometimes beautiful portrayal in literature. She is inundated with medical jargon and unfamiliar terms stating how rapidly she is declining. The irony here is that even an English scholar has trouble understanding the words we use to describe medical technology and medical decisions in modern hospitals. All her life she has enjoyed language games, but facing death, Bearing says, “Now is not the time for verbal swordplay […] / And nothing could be worse than a detailed scholarly analysis. Erudition. Interpretation. Complication. / Now is a time for simplicity. Now is a time for, dare I say it, kindness” (p. 69).

Technology has made many things simpler. Healthcare does not always seem to be one of them. However, the complex system that has developed in many industrialized nations stands in unusual contrast to medicine in many other areas of the world. People like Vivian Bearing or Katy Butler are facing problems that seem tragic, but I wonder if such tragedies come as a cost of the luxury of having access to any medical care. There are places in our world that have not yet achieved the simple. Many regions, in America but especially abroad, lack healthcare and disease prevention measures we take for granted, such as adequate access to vaccines, contraceptives and clean water. Can we find a balance between using medical technology to live better while also using it to die “better”? Can we distribute our medical resources and knowledge to maximize our potential when living, while also understanding, recognizing and appreciating the fact that the beauty of living is in part due to its ephemeral quality?

It seems like Butler and Bearing are exceptions. The diseases they have encountered (cancer, stroke, dementia and old age) are unlike the deaths that the majority of people in our world face. In most places, isn’t technology making both life and death better? Consider malaria: a disease largely eliminated in America, it is still wreaking havoc on the global scene. However, technology has allowed us to change this. With the use of antimalarial drugs and prophylaxis, we have caused a decline in the malaria death rate. Surely here modern medicine has increased our chance of “a good death”?

Malaria discussions in lab raise many interesting problems related to drug resistance and host-parasite evolution, but we less frequently discuss how these problems interact with poverty, infrastructure and how we die. Poverty and infrastructure seem like they could influence drug compliance and the degree of drug pressure, and also, as Steve Lindsay discussed in his CIDD seminar, they influence exposure to the vectors that spread the disease. Are these things too simple or too complex to add to our focus?

I think technology has shifted our attention to the things we have made complicated, while the things we have simplified seem to be easily forgotten. Perhaps this is what happens to all things: complexity consumes more thought by its nature, and simplicity is what allows things to drift from attention. In the American discussion of medical technology, we often lament its complexity — but is this because its simplicity (availability of pills, easy access to care, decline in infections compared to other nations) is easy to forget? Medical technology has changed how humans die, but I am not sure if it has made dying better or worse, more complicated or simpler. In malaria, unlike in discussions of cancer, diabetes or other diseases of the industrialized world, technology seems like a force that enhances and improves quality of life rather than something that hurts our chances for a “good death.” Butler’s argument is a good one to consider in the modern American clinic, but one that needs to include an appreciation for what technology has done for global deaths due to preventable infections.

(On a side note, if you want to also read how technology has affected our ability to cook and my other thoughts on Butler’s interview, I blogged more extensively on the topic here).

Dispatches from the field: Part I


I am at the Ifakara Health Institute (IHI) in Tanzania for the next three weeks to do some experiments, and hopefully to get a handle on what it’s like to do fieldwork in Tanzania.

The last two days have mostly been filled with sorting out space and facilities, and several trips to the local town for supplies – so far I’ve bought 13 bed nets for a total of 80,000 Tanzanian shillings (a little bit less than $50).

Today we decided to venture out to a more remote field site where some colleagues wanted to set out mosquito traps and I needed to pick up equipment. About 30 minutes from IHI we turned off the paved road onto a small dirt road cutting through corn fields that quickly gave way to rice fields. It’s the rainy season in Tanzania right now, and last night it rained heavily and constantly. As a result, the fields were pretty flooded and the road had deep holes and ruts covered by water.

After a couple of close calls, things really went awry for the Land Cruiser traveling in front of us, which got stuck. Two and a half hours later, it was free, but the sun had set and any hope of making it to the field was long gone.

Better luck on Monday, when we attempt the trek on foot or maybe in some cattle-drawn carts? Like many things here, the exact plan remains nebulous but always hopeful.

Putting my feet up for science

Dropped off in the middle of a Windows background, this guy does some thinking.

I prefer to read with my feet up. I’ve long held the theory that this increases blood flow to my brain, making what I’m reading easier to understand and more meaningful.

There’s no evidence that good posture (sitting upright at your desk) leads to better retention of information, despite the misleading title of this Lifehacker blog post. However, there is plenty of evidence that sitting all day on the job is bad for your body, which has led to a whole culture of people standing while working and loads of contraptions (some of which look like torture devices) to make this easier. But any indication of whether it’s better to do your thinking with your feet up was hard to find. Another blogger also looked into this topic and likewise decided there wasn’t much evidence for or against learning lying down or reclined.

The only thing I found was Amy Cuddy’s excellent TED talk on power poses – one pose was sitting back with feet up – suggesting this posture might improve confidence and success. Good enough for me. Back to reading.

Down with PowerPoint

Physicists working on the Large Hadron Collider recently banned the use of PowerPoint slides in all meetings.  It is a trend slowly growing in the corporate world, government and even academia. The technology we have developed to enhance education and communication has not always been effective in relaying messages. More and more people are realizing the value of traditional approaches to learning and the “old school” setup for forums and meetings: talking, no supplies needed.

In an age of pervasive smart phones, tablets, free wi-fi, and real-time communication, it is difficult to separate a technological distraction from a technological enhancement. The exponential increase in product availability and the constant reinforcement of technology use in workplace, academic and social settings make our gadgets both addictive and socially encouraged. How do we break free from a culture enthused by the advent of high-tech everything? Maybe by starting with PowerPoint bans.

One of my favorite TED talks discusses the benefits of using interpretive dance rather than slides. I am convinced that I could learn far more by the dancing approach to learning. New lab meeting policy? (If you want to practice turning science into choreography, check out the “Dance your PhD” contest.)

The physicists working on the Collider are not alone in their efforts. The U.S. military has also redesigned its meetings to avoid unproductive hours of staring at slides. To illustrate the world’s current frustration with our reliance on boring slide shows for meetings, General James N. Mattis of the U.S. Marine Corps has said: “PowerPoint makes us stupid.”

*In order to prevent disappointment, I forewarn that I will be using PowerPoint for lab meeting on Wednesday. I’ll save the dance moves for TBD.

GMO-free free

I have personally always wondered why anyone would be against genetically modified organisms (GMOs).  GMOs, to me, have always seemed like the inevitable next tier in the Green Revolution, and so I often go out of my way not to buy products that advertise “GMO-free” on the label.  But I’ve never actually done any research into why people object to GMOs.  So I finally did some research.  Granted, not enough research to write a paper, or even enough to necessarily have a well-informed opinion on the topic, but it is probably enough for a blog post.

The benefits of GMOs are fairly obvious.  Through genetic engineering, it is possible to increase crop yield, increase crop stress/drought tolerance, increase product longevity, reduce pest management issues, and grow crops on land previously thought to be non-arable.  Taken together, these factors increase overall production and decrease waste.  Good.

There are, however, at least three legitimate potential issues with GMOs as well.  First, insecticide genes inserted into crops could potentially transfer into human gut microflora through the process of horizontal gene transfer, potentially leading to the production of insecticidal proteins in the human gut.  I highly doubt that an insecticide-producing gene would grant any benefit to bacteria in the gut, and so the expression levels of such proteins will probably be low.  It does, however, seem to me that there should be some oversight to ensure that only human-safe insecticides are produced by GMO crops.

Second, patents on GMOs severely restrict farmers.  My understanding is that it is currently illegal for farmers to plant seeds from patented GMOs grown on their own land, and that these farmers must instead repurchase seeds each growing season for annual plants.  Although this is a potentially important political issue, it isn’t really a GMO issue.  It is instead an issue with our current intellectual property laws.  The exact same issue would occur for non-GMO crops if it were possible to patent them as well.

Third, and this is the most legitimate claim in my opinion, depending on the genes that are inserted, GMOs might cause allergic reactions that are unforeseeable by the consumer.  A consumer allergic to peanuts can avoid purchasing products that contain “peanuts” in the list of ingredients.  That same consumer, however, might have a hard time avoiding GMO crops that contain the allergenic peanut gene, if the use of peanut genes is not disclosed in some transparent way.  How to disclose a full list of genes and gene products seems like a non-trivial challenge with current technology, and so I can concede that for people with food allergies, avoiding foods that contain GMOs might be reasonable.  But since I don’t have any known food allergies, I think I’ll try to stay foods-labelled-GMO-free free.

Does cold cause colds?

It’s been brutally cold this winter–so does that mean we’re more susceptible to the common cold? Many folks (myself included) seem to feel that a severe chill often precedes a cold, because our immune systems must be weakened. I wanted to know if there’s any evidence, so I looked up some recent Cochrane reviews on the common cold and learned the following:

– Vitamin C is useless, UNLESS one happens to be exercising in the Arctic Circle, or is a marathon runner, skier, etc. Then Vitamin C might be a very good idea.

– Zinc helps, if taken within the first 24 hours of symptoms, but the benefits are offset by the terrible taste and nausea that follow sucking on a zinc lozenge.

– There’s no evidence that echinacea helps, but the appropriate trials haven’t been performed either.

– Sanity check: antibiotics do not help people get over colds, and they do nasty things to one’s digestive system.

So bitter cold (or exercise, or possibly exercise in the cold) may increase susceptibility to the common cold. But despite our poor understanding of the chain of causation, there are evidence-based ways of managing colds. Thanks to the magic of Cochrane meta-analyses, we also have fair warning about evidence-based methods for triggering indigestion.

Bold (and unfounded) statements

I just read these first two opening lines of a 2014 paper by Tokponnon et al.

“The widespread use of insecticide-treated nets (LLINs) leads to the development of vector resistance to insecticide. This resistance can reduce the effectiveness of LLIN-based interventions and perhaps reverse progress in reducing malaria morbidity.”

I first read this as: “Using insecticide treated nets reduces their effectiveness which could increase malaria”. A somewhat misleading start to a paper that concluded that there was actually decreased malaria prevalence in areas with high mosquito insecticide resistance.

I doubt that pyrethroid resistance is mainly caused by using insecticide-treated bed nets. Typically the main suspect for increasing resistance is large-scale spraying of pyrethroids, often for agricultural purposes, and this is particularly true in Benin, where this study took place.

Is it possible that the number of bed nets deployed could make enough of a dent in mosquito populations to be the main selective driver for resistance? It’s worth debating, but this paper didn’t show that resistance was a problem. In fact, whether because of local environmental differences or even behavioral differences (like biting at times of day when people aren’t under nets), they found that more susceptible mosquito populations were actually transmitting more malaria parasites, even with the same LLIN coverage and use. Should we worry about insecticide resistance? Maybe, but I think more studies like this one would be very useful for judging how much we should worry.

Androgynous names and the mind-altered present

Two things happened when I started grad school: one, I started drinking more coffee, and two, I started noticing how frequently my name gets misheard, mistaken or mispronounced. The two things are highly correlated: coffee shops are where my name is most often lost in translation. Spelled on my coffee cups have been names from Jill to Gel to Jeff to Josie, none of which I mind and all of which are close, or somewhat close to Jo.

This weekend for the third time, I got JF.

Most of the time people’s tendency to be bad with names gets blamed on memory, but when name mishaps occur only on first encounters (i.e. with unfamiliar people in coffee shops), it is hard to blame an unmemorable past. For a while, I blamed my diction. Maybe my “o” sounds like “f” or “l” or “s.” Maybe I am so in need of coffee during these times that my uncaffeinated grumblings sound like word slurs. Maybe coffee shops are too loud to hear the distinction between “o”s and the 25 other letters of the alphabet. Maybe I have an accent that causes my words to sound full of consonants. So I told myself it was my fault: I needed to speak up, lean in. I tried spelling. And now, instead of Joans and Jills, I get JF.

My new hypothesis is that it is not me so much as it is my name. If my name were Jill, it would probably never get confused with Gel. Do Rachels ever get Richards? Does Karly ever get Carl? Probably not. The problem here is living androgynously in a world of gender schemata. I am guessing that names that are androgynous but have a stronger association with one gender or the other are less expected when meeting a new person, and thus more frequently mistaken.

Though we don’t think of our minds as expecting a particular name when we first meet someone, we likely do. A study at Miami University explored why certain people seem to “look” like or “fit” their names while others don’t, and what facial features trigger certain name expectations. Gender is an obvious cue for most people to remember names, but less obvious features also play a role in setting up our expectations for what someone’s name should be. The study linked specific facial features to specific name associations: the name Bob is generally linked to a rounder and larger face; Tim is thinner. Perhaps it is when name and feature are incongruous that names on coffee cups go awry. A large-faced Tim or a skinny Bob — would they get as high a frequency of mistakenly named cups as a girl named Jo?

If expectations for names alter our ability to hear a person’s name, where else are our minds’ expectant neurons veering us away from being more perceptive of our surroundings? We hear often that memory is unreliable, but it seems like our minds can be just as unreliable in the present when we let our expectations dominate.

The natural response to the emergence of resistant parasites

*Not recommended for patients actually suffering from an infection with resistant parasites. Maker's Mark has been shown to dampen the response of rage and frustration in experimentalists, in combination with a healthy diet and exercise. Individuals may respond differently to Maker's Mark; side effects include talking far too fast about your failed experiment (even though you swore you wouldn't mention it), aggressiveness, or a sense of sheer calm.

This idea is all mime

At this time next month, I’m going to be in Roscoff, France for a conference that I’m very excited about.  This being my first time in France, I decided that I should go a few days early, and spend those days in Paris.  But I have to say, I’m not looking forward to the looks of contempt and disdain that I will undoubtedly receive for not being a native French speaker.  I should also say that I’m not a nonnative French speaker either.  I speak no French.  None.

But, being resourceful, I found a potential solution: dress as a mime.   I can fake my way through ordering meals using a combination of gesticulation and pointing — exactly what I would have done anyway.  I can ask for directions the same way — when I don’t understand the directions, I can cup my ear pretending I can’t hear, and then point in a random direction with a confused look on my face.  I don’t know the local laws in Paris, but I’m pretty sure the cops would be lenient on a mime — have you ever seen one in handcuffs?  And if I get tired, I can just sleep on the curb — the stupid, disgusting Americans might even leave money in my hat.

If you love something, set it free

Artistic recreation of uploading data to Dryad

I was talking with my friend Ben today when he asked me what I thought about #PLoSFail. For those of you not up on your twitter lingo, a hashtag is a way of tagging a tweet so that people can easily follow tweets based on keywords or phrases. In this case, tweets concerning the new data policy at PLoS which requires that published data be made freely available. There are, of course, exceptions in the policy for ethical and legal reasons. And PLoS doesn’t require the whole data set, just the “minimal dataset… used to reach the conclusions drawn in the manuscript with related metadata and methods, and any additional data required to replicate the reported study findings in their entirety.”

Andrew has already blogged about open data making him a bit tense, and apparently a lot of scientists on Twitter feel the same way. Personally, I don’t go out of my way to post my data and I’m hardly an open data proselytizer, but I have no problem with making my data available once a paper is published. Science is a peer-reviewed, iterative process, and I suspect open data benefits this kind of process. I think that, at worst, open data is a wash. That said, if you know of anybody making a career by parasitizing Dryad and/or GenBank without making any substantial intellectual contribution in their own right, I’d be very interested to hear about it.

Feelings on open data aside, the thing that puzzles me the most about #PLoSFail is how strongly people are reacting. Almost every journal I have ever submitted to has some kind of data archiving policy similar to the new one at PLoS. I can’t imagine how you could do science – biology at least – today without having come to terms with open access data.