“Rabid” by Bill Wasik and Monica Murphy

Rabies is a disease without a public relations firm. In developed countries, human disease is incredibly rare–we typically see one or two deaths from rabies each year. In contrast, lightning is responsible for about 60 deaths each year. However, worldwide, rabies is another matter. Today is World Rabies Day, a reminder that 55,000 people still succumb to this virus every year–most of them in impoverished regions of Africa and Asia. While cases in the U.S. are typically due to wildlife exposure (rabid bats, or even beavers or rabid kittens), infected dogs remain the main vector of infection in most rabies-endemic countries.

In a new book, “Rabid”, Bill Wasik and Monica Murphy have penned an ambitious history of rabies. It’s subtitled, “A cultural history of the world’s most diabolical virus,” and this emphasis makes Rabid unique. Indeed, while the recognition of the rabies virus is just a bit over a hundred years old, Wasik and Murphy trace the infection back to antiquity. The first half of the book is, as promised, a cultural history–4,000 years of literary references to rabies, hydrophobia, “rage” disease, and dog- and bat-borne contagion in sources as far-flung as various mythologies (Greco-Roman, Christian, and Egyptian, to name a few); medical literature from Aristotle to Pasteur; and even vampire myths from medieval times up to Sesame Street’s Count. Wasik and Murphy explore the animal metaphors used for millennia and examine them through the lens of rabies infection, and they colorfully explain the various (mis)understandings of the virus and rabies epidemiology in ancient texts. Though Rabid is certainly a pop-science book, many portions of it wouldn’t be out of place in literature, history, and even religion classes, which adds to the book’s eclectic flavor.

The latter half of the tale, then, focuses more narrowly on the science of rabies, covering Pasteur’s work toward a vaccine; the (rather late) discovery of bats as the ultimate reservoir of the virus; the challenge of mounting vaccination campaigns in resource-poor areas; and the lingering fear of rabies to this day, which is sometimes justified and sometimes not. They also cover the controversy over the Milwaukee protocol as a treatment for symptomatic rabies, and the problem of rabies control.

Finally, Wasik and Murphy note that even today, almost 130 years after the development of the rabies vaccine, control of rabies among the biggest human source of disease–infected dogs–is almost as poor in some places as it was in pre-vaccine England. The methods used to control it are, in some cases, also equally barbaric. The introduction of rabies into Bali in 2008 led to a mass cull of dogs, many of them shot in the street. Eventually, a science-based vaccination strategy was adopted and seems to be helping, but not before well over 100,000 dogs were culled and several hundred people had been killed by the virus. Rabies may be an ancient disease, but it is a scourge that still threatens us wherever governments lack the will and the funding to beat back “the world’s most diabolical virus.”

Social media evangelism

It’s that time of the year. Spots for ScienceOnline are a hotter commodity than Justin Bieber concert tickets amongst the pre-teen crowd; The Open Laboratory 2011 has just come out in print; and academics are discussing the utility of social media in full force. This topic has long been an interest of mine; with Shelley Batts and Nick Anthis, I even wrote a peer-reviewed paper on the topic way back in 2008. And it’s fresh on my mind, as last week I braved the world of the University of Iowa’s Internal Medicine Grand Rounds to discuss “Social Media and Medicine,” evangelizing for social media in an auditorium full of (mostly skeptical) physicians. When I came back to my computer, I saw a Twitter conversation about the utility of social media in academia.

I agree with many of the points Patrick makes here. He describes how a student rolled her eyes when she saw him browsing Twitter while he was proctoring an exam. He notes, though:

Certainly, Twitter and various other social media sites have a reputation for inanity…However, social media is a multifunctional tool that can be used in other, more productive, ways.

I’m an academic and an anthropologist, so I’ve tailored my social media use for those fields. Others may have different experiences. Certainly, I may use it for connecting with friends or family, sharing music or humor, or just venting. This isn’t to dismiss the personal – academics are people too! (so I’ve heard) – but there are more substantive benefits I wish I could have discussed with the eye-rolling student. A partial list includes: sharing news on research, professional networking, and engaging with a wider audience through blogging.

I discussed all of these in my talk, tailoring them to physicians and med students. However, there are other reasons why I think social media is a good idea for today’s academic. Here’s how it’s benefited me:

1) Networking via social media has been huge for my career. A fellow academic whom I met through social media–and probably wouldn’t have been in contact with otherwise, because his work is on the very edge of intersecting with mine–ended up writing me a letter of recommendation for my K01 award, as I wanted to get more training in a field where he’s already a noted expert. I’ve had colleagues whom I met via social media read grant drafts and provide me feedback, strengthening my writing and presentation. I’ve religiously followed blogs like DrugMonkey for general advice on the grants game. Hell, even back in the stone ages when I was interviewing for my current job, the first place I went for advice once I had an interview was an online forum I was involved in, which was chock-full of academics.

I got the job, and the K award, on my first try. I think a big part of that success was due to assistance I received via social media.

2) Blogging, too, has paid dividends. My work has been well-cited for my field and has received quite a bit of media attention, and I do think that part of that is because I bring attention to what I’m studying via this blog and my Twitter account. I’ve also had invitations to speak about my work at a number of academic, government, and “regular citizen” venues, which again I feel partially stem from the publicity my research has received. I’ve also given talks on science denial, social media, and zombies–all of which follow directly from my blogging. I noted above my paper on academic blogging; another, on HIV denial and the Internet, was published in 2007 with fellow blogger Steven Novella. The latter article has been accessed over 50,000 times, due in part to the attention it received on social media.

3) Networking via social media has been great for my personal life. (It doesn’t have to all be about work, right?) I’ve met people online and in meatspace who are just fucking cool individuals, which is always a bonus. I even met my Significant Other via nerdy science-related social media, where we talked shop for years before ever meeting in person.

What about downsides?

1) Media attention. I mentioned this as a “pro” above, but it also can have its cons. I certainly still fear the Sagan effect. I do a good number of interviews–about one or two a month currently, but that can go up to one or two a day (or more) when there’s a big new paper out from my group or in my general area of expertise. I’ve tried to cultivate good relationships with journalists, both locally and nationally, and try to practice what I preach to colleagues about being available for interviews and not just blowing off the press. So far, I don’t know that it’s caused me harm in any way, but that concern remains.

2) Perception. Certainly many faculty still don’t “get” social media. The perception remains that it’s a time sink that makes one less productive, or that it’s something “kids these days” do rather than professional adults. While I don’t think my online activities played a role in my tenure evaluation (either positively or negatively), I’ve felt pressure over the years to exceed expectations for publishing leading up to tenure, simply so no one could say my case was borderline, or that “if only she hadn’t wasted so much time blogging, she could have published more manuscripts and made a stronger case for herself.” I consider my use of social media a hobby and a service activity, but it’s a public hobby that can be easily viewed by colleagues–unlike hours on the golf course or time spent at home scrapbooking.

3) Creeps/trolls. I’ve had many of the same issues as other female bloggers: threatening emails sent to me personally or to professional colleagues; commenters responding to my appearance rather than the content of my writing. I even had an HIV denialist show up at my office unannounced. Anytime you put yourself out there in an online venue, criticism is to be expected, but it seems to be nastier for women than for men. Blogger beware.

Even though the cons can be nasty, I still think the pros have far outweighed them over the past 10 years that I’ve been involved in science social media. And so, I will continue to evangelize and hope that other scientists and educators dip their toes into the social media waters–or at the very least, support their colleagues who test them out.