Avoid Boring People

In ten days, I am in Atlanta at a poultry meeting. There might be 25,000 people there. Not at my talk. But there.

Much to my total pleasure, I have arranged a dinner with Bruce Levin, his wife Adriana, and Ashley and me. Bruce and I are sparring over a number of issues in drug resistance evolution. But I wanted a foursome dinner because he is never dull. He tells me in e-correspondence that he cares more about being interesting than about being right. Sean Nee, the most stimulating theorist of my generation, used to say that as well. So too JMS.

Avoid Boring People… the title of one of Jim Watson’s books. Think about it. It washes both ways. And Watson’s subtitle?: Lessons from a Life in Science. Lessons from life I reckon.

I don’t know

Ygritte knows what’s up.

The first time I really learned about qualifying exams, I was an undergrad in a comparative endocrinology class at Berkeley. My professor was Tyrone Hayes. Every exam was a booklet of open-ended questions, including one question whose answer was always simply “I don’t know.” If you want to drive a room of overachieving pre-meds crazy, this is a pretty good way to do it.

One class, shortly after an exam, Professor Hayes announced that one of our TAs had passed his qualifying exam with flying colors. More important than what the grad student knew, Professor Hayes said, was his willingness to admit what he didn’t know.

This was a lesson that was reinforced when I became a grad student and it was my turn to take a qualifying exam. It’s a hard lesson to learn. Especially because most classes aren’t set up like my comparative endocrinology class, so it’s a lesson that is rarely reinforced until relatively late in our academic lives.

It’s a lesson that I was reminded of when I heard this story on NPR today, about a man who has made it his goal to be rejected each day. By doing so, he put himself through a form of exposure therapy for rejection. I’m sure that scientists across the country heard this story while drinking their morning cup of coffee, and thought about all their past and impending rejections. I know I did.

But it also struck me that fear of rejection comes from the same place as the aversion to saying “I don’t know”. And like my professor tried to teach his endocrinology class, we are indeed better scientists – and probably better off as people – when we make our peace with these uncomfortable feelings.

Eradicating the screwworm with 2,700 pounds of dry milk

If there were an insect-mass-rearing version of Godwin’s law, the screwworm* would be Adolf Hitler. The more anyone reads about insect farming, the more likely they are to end up reading about the U.S.’s successful 1966 eradication of the screwworm via sterile releases. Using the sterile insect technique (SIT), screwworms were eliminated not only from the continental U.S. but, by 1984, from all of North America. How did we do it? By mass rearing 150 million sterile screwworms a week in a screwworm factory in Mission, Texas. Before the program ended, over two billion sterile flies were being released each year.

That is insane.

Lucky me got some free literature on screwworm eradication from a friendly Penn State librarian.


What is striking, however, is that screwworms are like malaria-vector mosquitoes in that both require animal tissue or blood to complete their life cycles. For screwworms, a successful replacement was developed over fifty years ago. For the mosquito, we still rely on donated human blood or live animals.

There are a lot of reasons why we might want to mass rear the mosquito like we mass reared the screwworm. For those of us who saw Mike Turelli’s talk at the CIDD this past fall: it takes a lot of mosquitoes to spread an artificially reared mosquito with certain traits (or, in his case, infection status) into a natural population. Simple things cause problems, like roads. We need a lot more mosquitoes than we originally thought. We need millions.

How do we rear millions of mosquitoes?

How did we rear millions of screwworms? Edward Fred Knipling (“Knip”), a USDA entomologist, developed an “artificial” diet made from slaughterhouse waste (200,000 pounds of beef and pork lungs, 11,000 pounds of dried blood, 8,500 pounds of horse meat) and milk (2,700 pounds of non-fat dried milk). The 150 million screwworms produced each week as a result fit into a plant that measured 75,000 square feet. In field trials, a population could be eradicated in as little as six months. Once releases began in the United States in 1957, the screwworm was eradicated from the Southeast by 1959, the Southwest by 1966, and all of Mexico by 1984.

If it was possible for the screwworm, it seems possible for the malaria vector. Maybe we can even do one better: save some milk, save some beef, and come up with a truly “artificial” way to rear mosquitoes without mammalian hosts?

*The screwworm is a parasitic fly (Cochliomyia hominivorax) that feeds on the live tissue of livestock or human hosts. Its cousin, Cochliomyia macellaria, is a friendlier version that feeds only on already-dead tissue and is employed in maggot therapy for medicinal purposes and in forensic entomology to estimate time since death.

As a p.s. to this post, my current obsession with insect mass-rearing has turned into a library exhibit I've set up in the display cases on the 4th floor of Paterno. I'll give you a free tour if anyone is interested.


Dispatches from the field: Part IV


The Gerald Kuku chicken stand. Every time I pass it (twice a day) I wonder who Gerald is and why he never has any chickens.

In two days, I’ll be boarding a Swiss Air flight out of Dar es Salaam. All told, this visit to Ifakara has been relatively uneventful (which isn’t necessarily a bad thing). It’s the period of short rains right now, which is just enough to dampen down the dust on some days but for the most part it’s hot and sunny and dry as a bone. I haven’t been stuck in the mud even once this time. This isn’t to say that life hasn’t been interesting. Some of the things I’ve seen include:

Giraffes, elephants, a breathtaking diversity of insects. Mantises as big as my foot and one small enough to fit on the tip of my pinky. Tiny, bright blue birds that put North American bluebirds to shame. A huge owl taking flight off a fence post.

A rooster trying to mate with a duck, and a man recording it on his cell phone. Demonstrating that people the world over will whip out their cell phones whenever animals start doing something freaky.

A man with a broken basket of tomatoes on his bike, hundreds of tomatoes scattering across the width of the one paved road here. Four people riding one motorcycle. One of them was holding up the flag of the political party that has held the presidency in Tanzania since they gained independence in 1964.

Which reminds me, it’s election season here in Tanzania right now. The opposing party is much more popular, even at the local level. The other day I was out walking with one of the technicians and we stopped by the house of his “grandfather” (not actually his grandfather, just an older family friend). The grandfather gave us fliers for one of the candidates running for a village leadership position and told us, confidently, that even Obama endorses their party.

Is science about money?

I hate talking about money. Mostly because talking about money is boring. People are either complaining that they have too little, that other people have too much or that it is distributed in all the wrong places.

Scientists seem to choose the distribution complaint. A few months ago, NPR posted a story, “When Scientists Give Up,” on how tighter competition for grants and a crunched funding environment are leading some scientists to give up on science. The two former PIs interviewed both dropped out of academia, one to start a liquor distillery and the other to run a grocery store in California.

The more I read about science and funding the more I start thinking about the prospect of, say, grocery stores. But I also start thinking about the prospect of a world without money.

Money has been around for a long time and is a simple concept that saves us a lot of hassle writing out IOUs when we can’t think of anything good to barter. Money also gives us a lot of headaches. Minor headaches like remembering to pay rent, and major headaches like income disparity leading to social unrest.

Would it ever be possible to just not have it? No money, no cash, no credit cards, no iBeacon payment systems, no Bitcoin, no bartering.

The alternative to a monetary system is something called a gift economy. In a future where we have Schweebs, self-driving cars, automation of most menial tasks, super nanobots for gene therapy, and houses built by 3D printers, why don’t we ever talk about our monetary system changing? Or consider that money might be a “back-in-the-2000s” figment of the past, something we will only teach to monkeys?

Without money maybe scientists wouldn’t worry so much about having it or not having it. We could all go to work in our human powered monorail system of rapid transit, give the gifts of our ideas and hope we get lots of NIH and NSF gift supplies.

How can I do better?

I’ve recently attended two talks given by colleagues of mine. After each talk, I was asked to comment on the presentation and provide feedback.


Initially I felt a little funny about doing this. On one hand, I am a huge fan of these individuals and I want to be maximally supportive of their scientific endeavors. On the other hand, although I certainly have very strong opinions about what constitutes a good presentation … I’ve never actually vetted these opinions with the community.


I did share my thoughts with these individuals. In retrospect, the conversations were really useful for me, and I hope they were also helpful for my colleagues.


It’s occurred to me that this is something we could do better as a group. We should cultivate a habit of having critical, open, honest conversations about how we are communicating our research. I’m not exactly sure what the best way to do this is. Lab meeting is not the place for this. As Dave put it, “lab meeting is for trying our hardest to poke holes in our work and then figuring out how to make it better”. It’s not the place to practice presenting a finished product.


Despite my love affair with cold hard facts, I’ve been forced time and time again to acknowledge the (cold hard) fact that science is conducted by humans. And humans … in accord with the splendor of nature … are a complicated mix of logical thought and fervent emotion. Although I’m still figuring out how to do this … I’ve realized that it’s not enough to present the facts. The facts are absolutely essential. Without them, I would have nothing to say. But if I am truly interested in communicating my research, I also have to demonstrate why I am passionate about what I do.

Dispatches from the field: Part III

Tanzanian trucks

Waiting in traffic caused by a washed out dirt road during the rainy season in Ifakara, Tanzania.

My most used, most valued skill at the moment is patience. Not in the long-suffering sense of the word, but in the sense that I have to be at peace with waiting. It’s hard to spend two days traveling around the world just to sit and wait, but it is a skill and I practice it daily.

My office is right next to the hospital and I envy the medical doctors I see passing by every day who, I imagine, spend the entire day busily saving lives and making a difference with their more useful skills: diagnosing, treating, operating, healing.

I’d really like to make a difference. It’s why I got into this business. If it were purely for the love of science, I’d work on butterflies or, better yet, Drosophila. I believe that the work here can make a difference, but it requires technicians, builders, and Tanzanian scientists who understand the system and speak Swahili, and I am none of those things.

Still, I can see that my presence alone spurs activity, even if there is very little that I can actually do myself. I have to believe that just by being here, something will be accomplished. Science by gentle irritation, like a grain of sand in an oyster. A difference that I will only see in retrospect.

So I sit and I watch the medical doctors go by. I practice my patience. I try to teach myself Swahili (Habari za kazi? Ngumu. “How’s work?” “Hard.”). And I try to make peace with being the annoying white foreigner.


Scaling a Collaboration

‘Do you have any advice?’

‘Stay calm’.

‘Is that it?’

‘Yep.’

This is an honest account of an exchange between Andrew and me that occurred a couple of weeks ago. I wanted to know how to make the most of a short visit to Aaron King, a theoretical ecologist with whom I’ve been collaborating. For six months, we’d been trying to translate the within-host dynamics of malaria infections into equations. We’d done pretty well over Skype and got a model up and running, but the extent to which the eye of a MacBook can capture the shape of functions drawn in the air is limited. It was time to really make some headway, particularly on our model of the immune system. So off I flew to Detroit.

Evening 1 – We investigate the data. All very, very exciting. There are plots with swirls on them! By combining two data sets, we come up with The Pattern in the data that our new model has to be able to explain. We have a beer and sleep on it.

Day 2 – We walk to work together. Fueled by slumber and coffee, we come up with a couple of verbal models that might explain the data. Almost running into his office, we draw the models out on the board. Two competing models. NO, one model. It looks like this! Two equations, two curves. We look at more data: is the data from that other experiment consistent with the predictions of this model? Yes! It might work!

‘I LOVE SCIENCE’, I said.

An hour or so later, just as we were about to head out for a victorious lunch: ‘Why is that first curve shaped like that?’

Huh.

‘That doesn’t work at all’.

I laughed to myself. Stay calm, he’d said.

****

So it is with collaborative projects, I’ve found. Together you gambol up the intellectual hill; then, while one of you is tying their shoes, the other gets a glimpse of the surroundings and realizes you’ve got to trot back down a bit. Staying calm and slowing down from the start would certainly be less exhausting. That said, at least at the beginning, perhaps you need the fun of running with ideas together to motivate you through the inevitable less-exciting moments? Moreover, when somebody points out that the place you’ve got to doesn’t look right, it’s a good indication that they’re thinking and are willing to speak up. A climb down (and, importantly, a re-ascent) can’t be bad for trust and delineating the problem, either. I’m hoping that, just like an eventful Sunday walk in the hills that turns out well in the end, all this yo-yoing will be good fodder for a laugh and a well-deserved beer afterward.

Can we ever predict the weather?

I went hiking this weekend and got stuck in the snow. Ten days out, the forecast had predicted sunny weather with temperatures in the upper 40s to low 50s; by five days out, that had changed to cloud cover with a chance of rain; by the night before, the forecast couldn’t make up its mind: maybe sun, maybe rain, maybe snow?

Beautiful autumn day for a winter hike? It’s Nov. 2.


It snowed and I am confident it is impossible to predict the weather.

I don’t like saying things are impossible. Most things that we can’t predict now, I’m optimistic we could predict in the future, but my possibilist tendencies stop with the weather. I don’t think we will ever be able to predict it: we can’t now and I don’t think we can in the future.

When I say we can’t now, I mean we are very, very poor predictors. Even with fancy tools like Doppler radar and satellites, we haven’t improved forecasts beyond six days in the past 20 years. According to a blog post by Josh Rosenberg, who analyzed the accuracy of forecasts made by AccuWeather, we can predict the long-term weather more accurately using historical averages than by using modern technological tools and complicated prediction formulas. The average person with climatic trends data is just as good as a weatherman at telling you whether it will rain next week. That is incredible to me and has eliminated my confidence that we will ever improve long-term weather predictions. For short-term predictions, technology has helped us, but AccuWeather’s new 21-day forecast is like reading tea leaves and coffee grounds. The significant difference in accuracy between AccuWeather and historical averages disappears within 4-5 days of the date of interest. In that case, it seems to me that using yesterday’s weather to weight historical averages could be just as good as the satellite-and-Doppler approach.
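The comparison at stake here is easy to make concrete. A minimal sketch, scoring two naive forecasts against a week of observed daily highs; every number below is invented purely for illustration:

```python
# Toy comparison of two naive forecasts; all numbers are made up.
observed    = [52, 50, 47, 49, 45, 44, 46]  # hypothetical daily highs (F)
climatology = [48] * 7                      # historical average for these dates
persistence = [52] * 7                      # "the future looks like today"

def mae(forecast, actual):
    """Mean absolute error: the average size of the forecast miss."""
    return sum(abs(f - a) for f, a in zip(forecast, actual)) / len(actual)

print(round(mae(climatology, observed), 2))  # 2.43
print(round(mae(persistence, observed), 2))  # 4.43
```

In this contrived week, the boring historical average beats clinging to today’s weather, which is roughly the pattern the AccuWeather analysis found at ranges beyond a few days.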

I figure I might as well put my own predictive power to the test, so here is the weather for State College, according to me. It is cool and sunny today in State College. My weather prediction for tomorrow is cool and sunny. I’ll go ahead and say it will be cool and sunny for the next 21 days, with colder temperatures later in the month and scattered snow flurries as we approach Thanksgiving, chance of showers sometime in between. In 21 days I think I’ll see if I fared better or worse than my iPhone weather app.


The Beige Revolution

The world’s population is projected to reach 9 billion by the year 2050. Using current food production methods, we cannot sustainably feed a population of this size. Various solutions to this problem have been proposed: change over to a vegetarian/vegan diet, genetically modify existing food sources to increase yield or expand arable farmland, use insects as a food source, grow meat in a laboratory, panic, etc. I’d like to suggest an alternative solution. Farm bacteria.

“Bacteria” is a catchall term that can apply to a lot of different things (even more so than usual, as I include Archaea in the term), so let me first clarify that I’m not proposing we all start eating random prokaryotes we happen to come across. I’m suggesting that we start farming bacteria in much the same way that we originally started farming plants and animals: find a few species that seem alright to eat and start selecting them for desirable characteristics. I imagine that safety during consumption, nutritional content, and taste would be the most immediately relevant traits to select on, but feel free to disagree. After a year of selection (i.e., on the order of 10,000 bacterial generations), it is hard to imagine that we couldn’t make some real progress on any trait of interest.
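The “10,000 generations” figure is easy to sanity-check. Assuming a hypothetical average doubling time of 50 minutes under production conditions (an illustrative guess, not a measured value), a quick back-of-the-envelope calculation gives:

```python
# Back-of-the-envelope: generations per year at an assumed doubling time.
# 50 minutes is an illustrative guess; well-fed E. coli can double in
# ~20 minutes, while starvation slows growth drastically.
doubling_time_min = 50
minutes_per_year = 365 * 24 * 60          # 525,600
generations_per_year = minutes_per_year // doubling_time_min
print(generations_per_year)  # 10512
```

Which lands right at the order of magnitude claimed above.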

There are numerous advantages to growing bacteria over growing plants. To name a few: bacteria can be grown literally anywhere. Bacteria have a natural ability to acquire and lose genetic elements in the form of plasmids; if a vanilla flavor is desired, for example, add a plasmid that encodes vanillin. Nutritional requirements for bacteria to replicate are minimal relative to those of agricultural plants. Doubling times are extremely short relative to the growth rates of plants and animals, providing advantages in production. And perhaps most importantly, many bacteria can grow in the complete absence of other organisms, whereas plants and animals rely heavily on bacterial and fungal microbiomes. This ability to grow in a true monoculture simplifies the system, greatly enhancing the potential for research and development on limited budgets.

I’ve told this idea to many people over the last 10 or so years. The most common responses I get are “ew, gross” and “wouldn’t the texture be weird?” My immediate response is, “People eat tofu; people will eat anything.” My more thought-out response, however, is that the texture and color of bacteria are highly variable in nature (not all bacteria are beige), so it is hard to imagine that a palatable combination of taste, texture, and color couldn’t be generated with a little effort.

But the Beige Revolution won’t happen unless someone actually makes the effort to implement it, and I’m not going to be that person.  So if you think it’s a good idea, I wish you good luck, and I’ll put in my pre-order for an E. coli steak now.

Spherification

Spherification in Charlie and the Chocolate Factory (illustration by Quentin Blake)

I recently bought some sodium citrate from Amazon. In addition to giving club soda its signature tang, sodium citrate solves one of the major concerns in my kitchen. Namely, how do you get cheese sauce that is both flavorful and creamy?

The problem with cheese is that when it gets hot, like when you’re making a cheese sauce, the hydrophobic dairy fats tend to separate from the water-soluble components and you end up with greasy mac and cheese. One solution is a béchamel sauce, flour browned in butter and cooked with milk, which acts as an emulsifier to keep the dairy fats mixed into your cheese sauce. But béchamel is not a great emulsifier, and the more you add, the more you dilute the flavor of the cheese. This is why most homemade mac and cheese recipes call for cheeses with strong flavors, to balance out the béchamel sauce.

Sodium citrate, on the other hand, is a great emulsifier. It’s so good that you don’t even need the béchamel sauce; you can just mix your cheese directly into a hot liquid. You can use milk, beer, or even water as the base of your sauce.

Another use of sodium citrate, the one that’s listed on the jar that I bought, is spherification. In addition to being a really sweet word, spherification is a molecular gastronomy technique that produces little caviar-like bubbles of liquid. Sodium citrate isn’t the main ingredient for this process, but it can be used to change the pH of acidic liquids to make them more amenable to spherification. Which means that you can spherify just about any liquid you can think of. Check out this blog for photos of spherified wine, whisky, and sriracha sauce.

If I’ve sold you on sodium citrate, you can order your own here. Even if you aren’t as enthralled as I am by the idea of spherification, you can still use it to make this mac and cheese recipe.

How little do I know….

Last Friday I got a copy of an email from David Paccioli, who is writing an article about Andrew and his research. Since Andrew was on I-am-offline-any-questions-please-email-my-assistant status, I looked into the email. David was hoping to use the story as a front-cover article, and he sent a picture of an insect (which he assumed was a mosquito) that he wanted to use on the cover of the magazine.

I looked at the photograph and wondered, “Hmm, is this really a mosquito, and if so, what kind?” In high school in Venezuela we studied Anopheles because at that time there was still malaria in the country. I vividly remember the image, and this “thing” didn’t look much like it. However, I wondered if, due to evolution (it’s been forty years!), maybe nowadays those critters look different. What do I know?

I printed out the picture (in black and white, I have to mention, for the benefit of the people I asked) and decided to find out. The first person to cross my path was Matt Ferrari. I stopped him and asked, “Matt, what kind of mosquito is this?” Matt, almost indignantly, replied: “I work with VIRUSES!!!” I just laughed and said, “Oh well, excuse ME!”

Off I went to find someone else.  Monica was at her desk.  I showed her the picture and asked, “What kind of mosquito is this?”  She looked at the pic and said, “I have no idea,” and when I turned around and saw Dave she said, “And don’t ask him, he wouldn’t know either.”  Then she reconsidered, grinned and corrected, “Go ask him!”

I went to Dave and asked, “Dave, what is this?” He said, “A mosquito?” I replied, “But what kind?” Dave admitted, “I have no idea. All I can tell is that it is NOT a chicken!” At that point we were all laughing. But I added: “Listen, guys, I am DEEPLY DISAPPOINTED in all of you. You are supposed to work on malaria and don’t even know what the transmitting mosquitoes look like!!!” (OK, Dave works on chickens, but anyway….)

So I thought:  Elsa = Artemisinin = malaria = mosquito.  Maybe she knows?  Answer:  No.  Katey happened to be in the office, and she looked up some pics of malaria transmitting mosquitoes… they didn’t look much like the picture.  Suggestion:  contact the people at the insectary.

That’s what I did.  Lillian replied to my email, “I am 99% sure this is a crane fly…”  Wow! I immediately forwarded that email to Dave who must have been quite happy to get the news.  Shortly thereafter Lillian emailed back confirming her identification and offered to have the photographer over to take some pics of the “real thing.”

I am a non-scientist and as such, I believed that people who research malaria would certainly know what the transmitter looks like.  I never cease to wonder at all that I don’t know and all that I think I know and isn’t correct!  I love working here because I learn so much.  My days here are filled with daily discoveries that, while amazing to me, may be common knowledge to others. It makes my work quite enjoyable.

Investment strategies from George Costanza

With the stock market at historic highs, now seems like a good time to bring up a claim that has always bothered me. Wall Street analysts often say that the biggest mistake individual investors make is buying high and selling low, because they base decisions on emotion. The idea is as follows: people see others making gains in the stock market and become motivated to invest for fear of missing out on easy money. They buy stocks, but because many others have beaten them to it, they end up paying high prices. If and when the value of the stock drops, they panic and sell their holdings, but again others have beaten them to it and they end up selling at low prices. The end result: they buy high, sell low, and lose money.

But if this pattern were true more often than expected by chance, employing the George Costanza philosophy (if every instinct you have is wrong, then the opposite would have to be right) would allow smart investors to consistently outperform the market average. Just do the opposite of what your emotions tell you. Nevertheless, active traders consistently underperform passive investment strategies, so the only two possible conclusions are 1) individual investors and sophisticated computer algorithms alike are too stupid to employ a “do the opposite” strategy, or 2) the “buy high and sell low” anecdote is a bogeyman story that people happen to believe. I would guess the latter.

Disclaimer: The above blog post is in no way intended as investment advice.

Time

Dave and I are working on something together. Friday we decide the ball is in my court. He says he won’t be working this weekend. Wow. For once I have the edge. I do my bit and send it to him at 5:30 p.m. Sunday, feeling so ahead of the game. Two hours later, he has done his next bit. This generates the following dialogue:

AR: I thought you weren’t working on the weekend?
DK: I wasn’t, but I finished my weekend early.
AR: How can you finish a weekend early?
DK: I’m very efficient sometimes.

What do we do all day?

Earlier this summer the Bureau of Labor Statistics published its annual report, known as the American Time Use Survey. The document details what we Americans do with our 24 hours, from the average amount of time we spend sleeping to the time we spend working, lunch breaking, exercising, emailing, being American. So what do we do all day?

We sleep a lot. Sleeping and resting make up a plurality of our day at 8.74 hours, followed by leisure activities, which take up 5.26, and working, which we seem to do for only about three hours a day. My reaction to three hours a day: where are these people working?

There are other interesting findings when we compare time use across age groups, occupations, and genders. Women work fewer hours, spend less time eating, and spend more time sleeping. Men spend more time watching television and more time on lawn care. We all supposedly spend more time watching TV than we do socializing and communicating (2.77 hours compared to 0.72). Mathematicians take some of the longest and most predictable lunch breaks. Few people work outside the typical 9 a.m. to 5 p.m.; most exceptions are employed in the “protective services” sector, i.e., firemen and police officers. Adults older than 75 spend almost double the time of 35-44-year-olds on leisure activities.

Scientists don’t have their own category in the report. Although I think I don’t need the government to give me a report in order to figure out what it is that we do all day, it would be entertaining to see. I was thinking I could start collecting data on my own time use, like Andrew mentioned in response to Eleanor’s post, but then I realized it would be terrifying to know the fraction of life I’m going to spend counting mosquito eggs. I’d prefer to remain blissfully ignorant on that front.


iBeacon and the Internet of Things: how will they change public health?

iBeacon is the newest word in my limited technology vocab, but also one trending strongly in the realm of marketing and e-commerce.

Beacons, including Apple’s iBeacon, are wireless location transmitters, soon to be used with smartphones and iOS devices, that work over Bluetooth Low Energy (BLE). The premise of beacons is to collect and transfer information on relative location rather than location data from lower-resolution sources like GPS. The technology also allows devices to communicate with other devices without user-mediated direction. Your iPhone goes with you to a bar; the cash register has a beacon that communicates with your phone, which communicates with your stored bank account app, and three drinks later you leave the bar with the phone having taken care of paying and tipping. No need to call the bartender or take out your wallet. At Rathskeller this would be amazing.
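To make the “relative location” point concrete: a BLE beacon broadcasts a calibrated transmit power, and the receiving phone converts the signal strength it actually measures (RSSI) into a rough distance. A minimal sketch using the standard log-distance path-loss model; the calibration value and path-loss exponent below are illustrative assumptions, not values from any particular device:

```python
# Rough distance from a beacon's received signal strength (RSSI, in dBm),
# via the log-distance path-loss model.
#   tx_power: calibrated RSSI measured at 1 m from the beacon (assumed -59 dBm)
#   n: path-loss exponent (2 in free space; higher indoors)
def estimate_distance_m(rssi, tx_power=-59, n=2.0):
    return 10 ** ((tx_power - rssi) / (10 * n))

print(estimate_distance_m(-59))  # 1.0  (at the calibration point)
print(estimate_distance_m(-79))  # 10.0 (20 dB weaker: ten times farther)
```

In practice RSSI is noisy, which is why real beacon systems tend to report coarse proximity zones (immediate, near, far) rather than exact distances.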

Heathrow Airport is already exploring practical applications of beacons, currently testing their use in a partnership with Virgin Atlantic. Passengers walk into Heathrow, their phones alert the check-in counter of their arrival, and by the time they look down, the boarding pass is already loaded on the phone. Will we look back on 2014 and ask: why did we ever wait in lines?

Beacons make sense. They lay the foundation for a coming future of “The Internet of Things” and should eliminate many current frustrations of life in 2014 society (i.e., waiting in lines, carrying credit cards, piles of receipt paper, losing devices, device theft: the old-fashioned problems of, say, 2035).

There is a lot of current talk about how beacons will change the future. Some is negative, predicting intolerable amounts of targeted ads directed at consumers every time they walk by a beacon for a product they would otherwise never notice. I understand that complaint and would prefer not to have my phone alert me when, say, ketchup is $2.99, and then give me some digital coupon when I walk through the condiment aisle, especially if I have no interest in ketchup while bee-lining for dairy.

Some of the talk is positive, predicting increased accessibility and streamlining of digital information, such as having additional text, product info, or history for the painting you are staring at in the art museum or the food product you are about to eat.

What amazes me is that, of all the current talk surrounding iBeacon applications, I have yet to find any discussion of applications in public health. Technology that allows devices to communicate with the outside world and with one another could be extremely valuable for epidemic containment or for targeting prophylaxis and treatment. Imagine a symptomatic patient who comes into a clinic and is either given a beacon to wear or uses the beacon in a phone or mobile device. When a diagnosis is made, the patient’s beacon registers that information and can then pass it to future beacons it interacts with. Airport beacons would be alerted to check the patient before allowing them on a flight; doctors would be alerted if new patients had interacted with previous patients whose beacons registered an infection; and behaviors indicative of infection risk or sickness (e.g. buying flu medicine at a pharmacy, or frequent trips to a public restroom) could be registered by beacons, which could then use the device to provide information on treatment and prevention.
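As a rough sketch of the bookkeeping this would require (all names and structures here are hypothetical; real beacons only broadcast identifiers, so in practice the record-keeping would live on a server, not on the beacon itself):

```python
from collections import defaultdict

# Hypothetical contact registry: each beacon ID maps to the set of
# other beacon IDs it has interacted with.
contacts = defaultdict(set)

def register_interaction(a, b):
    """Record that beacons a and b came within range of each other."""
    contacts[a].add(b)
    contacts[b].add(a)

def register_diagnosis(patient_id):
    """On diagnosis, return the past contacts that should be alerted."""
    return sorted(contacts[patient_id])

# A patient's beacon crosses paths with two others before diagnosis.
register_interaction("patient-1", "traveler-7")
register_interaction("patient-1", "clinic-desk")
alerts = register_diagnosis("patient-1")
```

Nothing here captures the hard parts (privacy, consent, false positives), but it illustrates the basic idea: interactions accumulate silently, and a diagnosis retroactively turns them into alerts.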

After a recent flight, I passed an LED board displaying health alerts at the U.S. Customs desk, asking passengers to voluntarily declare whether they thought they had measles, dengue, or a few other diseases. I found it interesting that in 2014 we still rely on human judgment and decision-making, even in cases where it may be flawed. Can we always trust people to come forward if they are sick and allow themselves to be detained? Would it be safer if technology pre-identified and notified travelers with x symptoms or x diagnosis? I think an iBeacon could help.


Excelling at life

I recently decided that it was time for my biannual closet clean-out. Instead of going about it my usual way, i.e. going through each item in my closet and evaluating it based on a poorly remembered, sentimental opinion of how important that item is, I decided to go about it in the most efficient way I could think of. That is, I’ve created a spreadsheet in which I record each item of clothing I wear each day. After a month, I will have a comprehensive list of the (warmer-weather) clothing that I wear regularly, and I will chuck/donate the rest.
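The same tally could be done without a spreadsheet at all. A minimal sketch, assuming a plain log of (date, item) pairs; the item names and the keep-threshold of two wears are, of course, made up:

```python
from collections import Counter

# Hypothetical daily wear log: one entry per item of clothing worn that day.
wear_log = [
    ("2014-06-01", "blue t-shirt"),
    ("2014-06-01", "jeans"),
    ("2014-06-02", "jeans"),
    ("2014-06-02", "green dress"),
    ("2014-06-03", "jeans"),
]

# Count how many days each item was worn over the sampling period.
wear_counts = Counter(item for _, item in wear_log)

# Keep anything worn at least twice; chuck/donate the rest.
keep = {item for item, n in wear_counts.items() if n >= 2}
donate = set(wear_counts) - keep
```

Note that items never worn at all won’t appear in the log, so the true donate pile is everything in the closet minus `keep`.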

As I was creating my spreadsheet for this cleaning project, it occurred to me that I could collect a lot of data about my life. But I haven’t quite decided what the most interesting things to keep track of would be. There’s the normal stuff that people tend to keep track of in spreadsheets, like spending or diet. Or I could keep track of the things that I hyperbolically declare The Worst, as in “State College drivers are The Worst” (this would be a very large spreadsheet). I could also keep track of other people’s activities as I observe them. For example, the number of times that Matt comes into the postdoc office per day (though there are certainly some unobservable explanatory variables for this event) or the number of arguments that break out at CIDD seminars (this would probably require a relatively long sampling period).

When I ran this idea past my significant other, he replied “you could use a spreadsheet to track how much time you spend using spreadsheets to track your life.” The possibilities are endless, as are the data.

Deep thoughts with Dave Kennedy

-The person who invented the saying “When the #%@$ hits the fan” either had a vivid imagination or a very dirty living room.

-Six percent of people who have ever been born are still alive, meaning that only about 100 billion people have ever died. After accounting for reincarnations, heathens, and murderers, there is no way that heaven is overcrowded.

-People who deal in absolutes are always wrong.

-The guy who copyrighted the Happy Birthday song should hurry up and patent the placebo effect before someone else does.

-Luck is just skill leaving the body.

-Last night I dreamt that my shoe lace broke. This morning I was relieved that I can’t dream the future.

-What would a mirror see if it reflected inward?

Human uniqueness? (includes a brief spoiler for the movie “Lucy”)

I recently watched “Lucy”, which can only be labeled as perhaps the most scientifically incorrect movie of all time. It takes as its premise the debunked myth that humans use only 10% of their brain’s “capacity” and then proceeds to show us the absolutely mind-boggling progression of a woman’s abilities as she ultimately uses 100% of her brain to travel through time in an office chair to deliver the secrets of the universe to Morgan Freeman, all of which conveniently fit on a USB drive.

I left the theater with a fellow grad student, our brains throbbing from disbelief at how events had proceeded. It was an engrossing, utterly disturbing experience that left us shell-shocked. I stand behind my statement that “it wasn’t completely bad per se, but it was completely wrong.” Interestingly enough, however, it sparked a conversation based on the following question: how different are humans from other organisms on Earth?

As common and seemingly simple as this question is, I find myself visiting it frequently. I often think that, as biologists, we have a perspective distinct from that of traditional philosophers or social scientists. We publish accounts of animal behavior that resemble our own perceived human idiosyncrasies, in order to counter arguments that these traits are part of what makes us unique organisms. We seek alternative explanations for patterns in human conflict or the establishment of empires, paying attention to environmental factors that may have played a role, in addition to social explanations.

Rather than finding these examples damaging to our image of what makes us unique as human beings, I instead find them very comforting. What I find uncomfortable is that people seem to consider Homo sapiens as operating outside the natural laws that govern the existence of every other organism. I’m not naïve enough to overlook that we are highly divergent in many respects and have relieved ourselves of multiple selection pressures (at least for the time being), and of course there are other animals that self-medicate, use tools, and make war, even if we operate at the extreme in all of these cases. But the tendency for people to be anthropocentric, and to forget the crucial fact that at some level we are still subject to the same natural laws as all of life, might be one of the most depressing, annoying things about being human, ultimately contributing to the ambivalence with which we treat the planet and its inhabitants.

I think I only believe this 30% of the time, but soul-sucking Lucy brought out the thoughts.

Translating ‘competitive release’ with help from a tree

I just arrived back east and decided to detour through the American Museum of Natural History on my way through New York. I’m chatting with my sister, the artsy, business-y member of my family, who’s spending her summer living on the Upper West Side and taking courses at an art institute here. We speak different languages when it comes to our work, and each of us has trouble understanding how the other gets paid to do what she does. We are walking down the hall of North American Forests, staring at the tree rings on the Mark Twain tree, a 1,400-year-old giant sequoia whose cross section took four days to cut and which weighs about as much as several tow trucks.

Across from the Mark Twain tree, I get excited to see a smaller but still impressive tree core, which helps us in our struggle to cross the science-art boundary. I point at the tree and look at my sister: “This is what I study!”

The tree rings of the loblolly pine show the effect of competition on the tree’s growth. When resources were limited and the loblolly was competing for water, sunlight, and nutrients with its fellow plants, the tree’s growth was inhibited. Once its competitors were gone, the tree’s access to resources became abundant, releasing it from the suppressive pressure of competition. Wide, fat rings mark the moment of competitive release.

My sister looks at me with an “Aha” expression: “So this is what you do.” Then she says, “That’s beautiful.” And coming from someone in the art world, I feel like I’ve just received a very great compliment.

I’m amazed by the number of things that inspire me to think about work, even in places I wouldn’t expect. Reminders of broad-scale ecological patterns crop up everywhere I look, and it’s exciting to see reminders of our science’s remarkable applications when I have the time to cross its borders.