Before and After the Fall: A Meditation on Healthiness

It was a busy summer. And so, I inevitably got sick.

After a rainy ridge run in the Jura mountains confirmed that my friend Steve and I were more or less compatible overdistance-run partners, we ran across Liechtenstein. I often do these sorts of point-to-point runs in the mountains in summer and fall, but when I’m alone I usually keep a certain amount of caution in choosing routes. Having a buddy willing to go to the crazy places I suggested opened up new route possibilities, and with them, more wear and tear (and fitness!) for my body.

The summer was full of opportunity and I was giddy.

We ran a section of the Via Alpina, a 1500-mile trail which traverses the Alps from Monaco to Slovenia, crossing through six other countries on the way. While I’d love to trek the trail some day, these days section runs are the best I can do. We “only” went 25+ kilometers in Switzerland’s Kanton Glarus.

Another day we “ran”, or mostly walked, up an incredibly steep headwall on the far side of Walensee lake, climbing 1500 meters in just three kilometers. The run along a small shelf below the uppermost cliffs was gorgeous, as was the relatively gentle descent back to the other end of the lake. But after that one I told Steve he could pick the route because I had caused us a world of pain.

Later in the summer, I skiwalked up to a glacier with my friend Jonas. I don’t know if Jonas knew how steep it was going to be, but I certainly wasn’t mentally prepared. By the time we hit the ice my legs were rubber. We wisely decided against crossing the glacier.

View from the glacier down to Engelberg.

At some point, I started to feel fit like a warrior, even if each of these individual crazy adventures (and a few more long ones on my own) left me totally exhausted, achy and sore. Pain brings fitness, though. I was hardening up.

But then, when I thought I might be getting onto a bit of a roll, fieldwork started.

Another PhD student and I were asked to organize a big joint project for our whole lab this summer. It was/is a really great project – lots of interesting angles, and a cool opportunity to be involved in something so big. I really love fieldwork, too. But organizing everything was incredibly stressful. And I had to be at work extra early to organize everything before the rest of the team showed up, then often stay late to process samples and equipment when the day was over.

I did not run those weeks. On the good days, I forced myself to ride my bike either to work (a measly 9 km) or back, but usually not both. By the end of the day of fieldwork I was so tired that riding home seemed impossible. It was mental exhaustion above all else: after a day of remembering details and always trying to plan two steps ahead, plus perhaps driving for three or four hours, any additional feat of willpower was doomed to fail.

But… I got to see many new corners of Eastern Switzerland. And I love fieldwork! Did I mention that I love fieldwork? Why would you ever work in an office if you could work outside?

Here’s what a block of summer fieldwork for an aquatic ecologist looks like. It looks, despite what I just wrote above, like happiness.

(a series of photos from fieldwork)

I think that even those of us who love our jobs lie to ourselves a little bit and say that because the work is something we enjoy, it’s easier than it really is. This kind of research is what I dream of doing. I love the science and the questions we ask and (try to) answer; it’s outside; there’s always something new and it never gets dull. For better or for worse, there’s a new challenge every week, sometimes every day, and you end up using a crazy-diverse set of skills and developing new ones. I wouldn’t trade my job for anything (well….). But there’s no denying that it totally saps me to organize such ventures.

As soon as fieldwork was over – no, not really, just in the middle of the experiment when we let nature do its thing and tried not to stress out about what might be happening in our absence – I headed back to North America to visit friends and family, go to a few weddings of people near and dear to my heart, and give a talk at a big ecology conference.

The first night I was back on Eastern Standard Time, I slept for 14 1/2 hours.

It was so, so awesome to see so many people I love! But I was flying constantly across the country to get to one thing or another and I never really settled in. For the most part, it wasn’t a very relaxing vacation. I also had to put out some fires with the experiment from afar, not being able to see anything in person, which is always nerve-wracking.

With my cousins Jess and Emily at a family wedding in Houston.

Some of the most relaxing moments of my "vacation" were walks in the Lyme Town Forest with my mom and our dog during my six days at home.

A rainy-ish hike up Mount Moosilauke with Susan and Jenny was a great way to cap off my time in New Hampshire.

And so yes: I cannot even schedule my “vacation” to be recovery time. There’s so much exciting stuff to do! I’m like a squirrel chasing every fun thing that catches my attention.

As soon as I got back to Switzerland it was back to fieldwork as we took the experiment apart. Again with the organizing and the long days.

Because I had been in the south for a lot of my trip back to the States and I am legendarily bad at exercising in hot weather, I had lost a lot of my fitness – I just hadn’t been running a lot, much less biking or rollerskiing.

Nevertheless, on the yearly goals list I had made myself in the spring I had written “do a mountain running race.” This had been an idea of mine ever since moving to Switzerland: I couldn’t really live in the Alps without doing the mountain running thing, could I?

The previous summer I had chickened out. I was doing some other fieldwork, a bit more ski-specific training, and just generally didn’t feel great about my uphill running chops compared to people who grew up in the Alps. I was sure I would get demolished, which was one thing, but more scared that I just simply wouldn’t have fun.

But in September I thought, I’ve got to pull the trigger on this or else this goal will stay as an unchecked box on my list. If I waited much longer there would be snow in the mountains. So I impulsively signed up for a mountain half-marathon in Arosa and convinced Steve to join me. Then I’d at least have someone to commiserate with, I thought, and I was certainly right about that.

Pre-race in Arosa. This dude guided our way to one of the best hotel breakfasts I've ever had.

Earlier in the summer, I had done a not insignificant number of long trail runs – longer than a half marathon – with a lot of elevation. But that day, I just didn’t have it. I’m not sure if it was simply a bad day (those certainly happen) or whether the difference between self-pacing and trying to guess a sustainable race pace just wrecked me early, but it was a brutal slog. The course climbed 900 meters in about eight kilometers at the start, then dropped off a precipitous face where you felt more like you were free-falling than running. Then it was up a second peak and a long downhill run back to the finish.

Falling off the mountain.

By the time I was running the last few kilometers, I had totally bonked. I was a mess at the finish line: rubber legs, salt-crusted face, salt streaks covering my arms and legs, dehydrated, totally depleted but with no appetite. It took me hours to get back to anything resembling being alive. A crazy thunderstorm rolled into the mountains and we sat drinking a beer and watching the lightning, me just being thrilled that I didn’t have to so much as stand up.

My first mountain running race experience was tough, but I’d probably do it again, with a clearer understanding of how brutal the race was going to be. The event was great and there’s a nice camaraderie to this community. I felt at home. So next year’s goal list: “do a second mountain-running race”…. maybe I’ll be faster?

Work got crazy again as we decided to use a student project to do a pilot test of a new experimental setup. One day I ran home from the office and titled my Strava workout “Trying to avoid a nervous breakdown about the new experiment.” I was literally running away from my problems.

Planning new experiments is so exhilarating, but it’s also frustrating and stressful and involves revision after revision of plans and ideas.

Around that time, Steve and I ran from Zurich to Zug, a nearby city. It was a 34 k run with a surprising amount of elevation gain: more than 3,000 feet, not bad for living in Switzerland’s lowlands.

I didn’t realize it, but it was going to be my last good run for a while. At the end of the next week, I noted that my supposed-to-be-easy evening run “felt like garbage”. I was just tired, I figured.

The next day I got sick. Really sick: going through an entire box of kleenex a day, unable to do much of anything, debilitatingly exhausted. I first blamed allergies but then, after a day and a half of this misery, took some flu medicine. I immediately felt better. Not good, but better. Ah-ha! If medication made me feel better, that meant I was actually sick. Right.

I didn’t get better very quickly, and had to take some time off of work. I didn’t exercise for two weeks. When I did, I felt okay, so I got excited and a few days later did a 16 k point-to-point run with my boyfriend (who was visiting, and who I thus felt I needed to provide with some workout opportunities). Predictable result: setback, more kleenex boxes. After a few days of re-recovery, we tried a 26 k run/hike up to a mountain hut. It was beautiful.

Because of the snow and ice, we didn’t push the pace that much. My main challenge was staying warm, but my immune system seemed to handle it okay. The 20 k point-to-point going south from Zurich two days later, though, was one push too far. It required another kleenex box, more medicine, a few more days off from running.

You’d think that at almost 30 years old, I would have learned how to take care of myself better than this. But when to go back to training can be a tricky question – and I’m not exactly training for anything. When I go out for a long run in the mountains, it’s because that’s what will make me happy on that particular day. The balance of long- versus short-term planning is quite a bit shifted from my “athlete” days, meaning that I can risk a little more to get out there sooner – but the potential consequence, lying in bed being miserable, isn’t so nice either.

At a certain point, it came down to this for me. After weeks of yo-yoing back and forth between really sick, sort of sick, and sort of healthy, I couldn’t tell what “really healthy” actually was supposed to feel like. Better than yesterday, better than yesterday, better than yesterday; that seems like a good trajectory. When is better good enough?

And how do you value the trade-offs in life when athletic pursuits are essential for your happiness, but “performance” isn’t your job? Doing two jobs (now three, but that’s another story) and trying to get in good blocks of exercise is certainly pushing the limits of what I can do, mentally and physically. Yet my jobs are stimulating and fulfilling; I want to do well at them. I couldn’t quit them. I also couldn’t quit running (or, in the winter, skiing). If I did that, I’d be less stressed and I might not get sick, but I would be unhappy and the lack of exercise would leave me unhealthy for an entirely different set of reasons. I think I’d be less efficient at my jobs. To non-athletes that sounds counterintuitive, but I suspect that any recreational sportsperson knows exactly what I mean.

I’m lucky that I have some role models in this department. Most of my peers don’t pursue sports – I suspect that if I was in a similar graduate program in the U.S. the number might be higher, but without organized college sports teams many in Europe drop out of organized sports when they start their bachelors, and by grad school are focused primarily on academics – but a few do. When we see each other it’s like a relief: yes, I’m not crazy, this is a real thing that people do! And I’m not the only one who feels like doing work and sport together makes me better at each.

But it undeniably comes with costs. And so, occasionally, you run yourself into the ground and you get sick. Then the longer you sit around waiting for your mythical health to arrive, the more you stew in your own unhappiness. But pushing the envelope too soon can also mean even more sitting around in the end, just drawing things out.

I seem to be healthy again, finally, and I’m going to push it – but not too far. Ski season is coming and somehow, I need to break this cycle.

spring resolutions.

Next winter I want more of this… and only like half as much racing.

Some people make resolutions at New Years. But I’m never very successful at keeping them.

This year I had a revelation: for me the calendar doesn’t start on January 1, but when the ski season ends and a new year begins. We’ve all kept track of it this way in our training logs for years and years, but I had never explicitly thought of it seeping into the rest of my life.

After all, semester schedules still go on. Grant cycles don’t depend on the seasons.

But emotionally, the end of the season is the time for me to take stock of what happened in the last year, set goals, and decide what I want to do better – how to manage my time through the whole year, culminating in winter.

When I got back from World Championships, I started making resolutions. The first one: next year I’m not going to race as much. I’m going to enjoy skiing for skiing’s sake a bit more, and take some weekends where I just get out on the trails with no bib on.

This winter racing really did take up a lot of mental space, even though I tried to keep things low key. In the end, if you’re racing every single weekend it weighs on you no matter how relaxed any individual race experience is.

For the last two weekends I have had a blast enjoying the spring skiing in Switzerland, and I want to make certain that I do more of this mid-winter, too.

I finally checked out Melchsee Frutt, a ski area that my friend Jonas gushed about all winter. It’s outside of Lucerne and you take a bus up to the bottom of a big gondola. Then, with your skinny cross-country skis, you take the gondola up with all sorts of alpine skiers, snowboarders, and families with toboggans (alpine sledding is a big and awesome thing here – don’t believe me? Read about when I sledded Grindelwald….)

The top of the gondola is at 1900 meters. The snow is sparkling. The sun is strong. There’s a 15 k loop, which isn’t really all that much, but it’s plenty to keep you entertained. It’s one of the first times cross-country skiing in Switzerland that I have really felt: dang, I’m in the Alps.

Melchsee Frutt.

There’s a dogsled outfit called “Swisskimos”, which I find both genius and a little offensive at the same time.

I went there the day before Easter, and then on Easter Monday I skied 40 k in Lenzerheide. The first hour was incredible, but then I started skiing from town towards the biathlon/Tour de Ski stadium and realized that in this direction the cover was terrible and the snow was melting.

Capfeder in Lenzerheide: winter in one direction…

… with glimpses of summer on the way.

It was still an amazing day, though, and I had a blast using the fitness I’d accumulated over a season of racing without the pressure of, you know, racing.

This weekend I went one last time to Melchsee Frutt, with Jonas.

Smile, Jonas! You’re skiing in April!

One of us is more tired than the other one.

We agreed: this was the last ski of the year. Time to summer wax the skis. Thanks skis. Til next year.

The motley crew. Let nobody accuse me of being anti-diversity.

My second resolution: cook more diverse and interesting food, and don’t fall into ruts. When I was skiing in Craftsbury I tried to bake a lot of different fancy desserts because I had too much time on my hands. Then when I moved to Eugene, Oregon, to take a job as a research assistant, I lived for the first time in a pretty diverse city. I took advantage of the Asian and Mexican grocery stores and went on a lot of culinary adventures. Since then, I mainly cook new things when I’m at home and making dinner for my parents. It turns out okay.

But at my own apartment, I’ve been a little bit lame. Sure, I make some good food, but often I resort to things that are quick and I usually stick to the safe, central European aisles of the neighborhood grocery.

So a few days after getting back from Oslo I finally, after a year and a half, went to an Asian grocery store in Zurich. I had been saying ever since arriving that I would do it. I found good dark soy sauce, toasted sesame oil, mirin (Japanese cooking wine), rice vinegar, black bean chili garlic sauce (which is the best thing ever), lots of fun noodles, oyster sauce, better tofu, and fresh cilantro.

I already need to go back to restock a few things I’ve run out of, and to pick up more, like peanuts and light soy sauce. I’ve had a blast cooking. A few favorites: this delicious stir fry recipe (using tofu instead of chicken), hot and sour soup, and some stir fried cabbage and rice noodles.

Spicy cucumber salad (this, but using cucumber you have chopped up, salted, and drained the water off of for 15 minutes) is a new side that I’ve been making to go with basically anything.

I also bought a cookbook that I have seen get glowing reviews in multiple places. It’s called Made In India, and it’s awesome. The author, Meera Sodha, lives in Britain and sprinkles the book with stories about her family. The recipes are great, but they’re also designed to be made with ingredients you can find at a normal grocery store.

My housemates have been thrilled that I’ve cooked a bunch of new curries: chickpea curry (Chana masala), potato curry (Aloo tamatar), roasted cauliflower. Buy this book. Your housemates, family, husband, wife, coworkers, whatever – they will love you a little bit more.

My third resolution: do more squats. I actually haven’t been to the gym in over two years. Gym memberships are expensive here. There’s a university facility that I could go to for free, but I hate the atmosphere. In Switzerland sports are already such a man’s world, and not surprisingly, on a college campus the gym feels like a meat market. At 28 years old I no longer feel like participating in this “see and be seen” and “leer at the girls working out in spandex” situation.

Skip the gross gym, but use your shoes.

When I was living in Eugene I went to a great high-intensity interval training (HIIT) program called Tabata. The details are a little different from other HIIT programs, but in the end most of these things are similar. If you are on a team or part of a training program, you don’t need any gimmicky workouts: just do what your coach says. For those of us training alone, sometimes we need extra motivation and organization.

Anyway, Jon’s Tabata program is based on just a few exercises: body-weight squats, squat-thrusts, and jumps. The workouts take just 30 minutes and they make you exhausted. You will sweat like a pig. Walking around the rest of the day your legs will shake.

But you will get strong. My legs became more powerful and I was leaner when I was doing those workouts. I have Jon’s “recipe book” for workouts and this year I’m determined to do some regularly. I need some power and fast-twitch muscles in my life.

Lastly: Productivity. It might seem like I’m pretty productive – I guess I am. I manage to do an okay job at being a PhD student and regularly produce content for FasterSkier.

But…. I could do better. In my PhD, things have gotten a bit overwhelming in the last six months. I’m stuck with a lot of data which I don’t really know how to analyze, but I should be analyzing it and writing it into manuscripts, at the same time that I continue to conceive of and execute new experiments. It’s a lot!

Just focus and get it done, for Pete’s sake.

When I get completely overwhelmed, my productivity crashes. It feels like I’ll never accomplish everything even if I try, so where do you even start?

This post from WhatShouldWeCallGradSchool really explains it best.

Spurred on by a comment from my housemate, I started looking into software that would keep me from logging onto facebook or checking my FasterSkier email during the work day.

Have you heard of the Pomodoro method? It sounds really dumb: you break your day into 25 minute chunks, and allow yourself five minutes in between to make a cup of tea, go for a quick walk around the building, check your email, whatever.

I’ve started using an app called Pomello which merges this technique with your to-do list. You pick a task and the timer starts counting your 25 minutes. When the 25 minutes are up, bingo! Reward yourself by replying to an email from a friend. Then you can restart the same task or pick another one. You can compare day-to-day how many of these 25-minute chunks you get done and how your productivity is doing.

This sounds so completely trivial and pointless, but it has actually really helped me. I think it taps into some part of my innate insane competitiveness, so I do actually focus during the 25-minute blocks. It’s short enough that you can keep yourself focused, sans distraction, for the whole time and chide yourself when you think about checking facebook. But it’s long enough that you can get some meaningful part of a task done, too.
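For the curious, the bookkeeping behind the technique is almost trivially simple – here’s a sketch in Python of the idea (the 25/5-minute split comes from the description above; the `pomodoro` function and `dry_run` flag are names I made up for illustration, not anything Pomello actually exposes):

```python
import time

# Pomodoro sketch: 25-minute focus blocks, 5-minute breaks,
# and a running tally of completed blocks per task.
WORK_MIN = 25
BREAK_MIN = 5

def pomodoro(task, blocks=1, dry_run=True):
    """Work through `blocks` focus blocks on `task`; return how many finished.

    With dry_run=True the timers are skipped, so the tallying can be tested.
    """
    finished = 0
    for _ in range(blocks):
        if not dry_run:
            time.sleep(WORK_MIN * 60)   # heads-down focus, no facebook
            time.sleep(BREAK_MIN * 60)  # make tea, reply to a friend
        finished += 1
    return finished

# Tally a day the way the app does: chunks completed per task.
day_log = {
    "analyze field data": pomodoro("analyze field data", blocks=3),
    "draft manuscript": pomodoro("draft manuscript", blocks=2),
}
```

The point isn’t the timer itself – it’s the per-task count at the end of the day that you can compare against yesterday’s.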

If you’re a better worker than I am, this will seem irrelevant. But if, like me, you are getting bogged down and discouraged and reading interesting articles on the internet instead of doing your work… maybe give it a try?

Hopefully I can keep this newfound improvement to my focus through the whole year.

I think that part of the reason that “new years resolutions” never stuck for me is that if I make them on January 1, my day-to-day life is much the same before and after this magical cutoff date. I’m still on holiday (for a short time), I’m still balancing skiing and work, it’s still winter.

(The other part of why they don’t stick is that resolutions rarely work, period.)

This year, I’m thinking about them at a time when my life really is different: with the changing of the seasons, I have a real chance to start afresh. Things feel new, and it seems like if I really wanted to change my life, I could succeed.

part 2: suppressed results.

In part one of my writeup on survey results, I talked a lot about the file drawer effect and why we end up not publishing some potentially useful results because we don’t have time. In a high-pressure environment where publication in the best journals is important to advance our careers, we often focus that limited time on the manuscripts with the highest potential impact. In some unfortunate cases, that means that professors do not prioritize giving their students the support necessary to publish results from projects, theses, or dissertations.

There’s no doubt that this can hurt younger scientists’ careers. Helping a student aim high and write higher-quality papers is great…. but it can go too far, too.

“There was no specific pressure NOT to publish, but rather my supervisor could not provide useful and supportive feedback and he was never satisfied with any draft I submitted to him for review,” wrote one respondent. “After numerous iterations of my projects over many years, I became discouraged and decided it wasn’t worth the effort to try and publish my results. Others in my lab have had the same or similar experience.”

Today I will talk about something more insidious: being discouraged from publishing for other reasons – politics, data that didn’t support your research group’s hypothesis, or external partners who did not understand the results or the underlying science.

(If you want to know more about the dataset I am working with, its small size, and its various biases, I discussed it in part one: click over here.)

As an ecologist, I didn’t think that this happened a lot in our field, at least not compared to fields with more frequent commercial connections and money at stake – perhaps if you are an environmental consultant, or doing impact reports for governments or companies. But in a purely academic community, I assumed it was a fairly rare occurrence for results to be kept out of publication.

One thing quickly became obvious: it does happen, sometimes. There are lots of reasons, some of which are highly case-specific – e.g., the government of the researcher’s native country didn’t allow them to import their samples in the end, after all – but there are some common patterns, too.

With such a small sample size – 40 of the 184 respondents reported this happening – and also the fact that I made it clear online that I really wanted to hear from people who had been discouraged from publishing, it’s impossible to say how prevalent such events actually are. The proportion of responses does not reflect the proportion of total scientists who have had this experience.

I can certainly say that comparatively, many fewer unpublished papers are due to these events than due to the self-created file drawer effect. Two thirds of survey respondents said they had at least one unpublished dataset, if not a handful or more, even though many were just in the first five years of their research careers.

The file drawer effect means that there are tens of thousands of unpublished datasets out there, maybe 100,000. Many probably have no significant results, since some of the most cited reasons for not publishing were inconclusive data, needing to collect more data, and doubting that the results would be accepted by a high impact journal.

Other pressures happen in a smaller number of cases, but primarily for the opposite reason: results did show something interesting, but maybe not what someone – a supervisor or a government employee – wanted to see.

And while I cannot draw any conclusions about prevalence, I can (hopefully) draw some conclusions about why this happens and who it happens to.

A brief table of contents:

First, student-specific challenges.

Second, government and, to a lesser extent, industry challenges.

Third, “internal” and interpersonal political challenges.

Students Bear The Brunt of It

“As a grad student, the concept of this is crazy to me,” wrote one respondent. “In many ways and instances, publications are the currency by which scientists are measured against one another. Thus, not publishing work seems counterintuitive to me. I’d like to hear the reasons behind why it happens.”

Well, dear student, there are many. And being discouraged from publishing in fact seems to happen mostly to students. Here’s who the 40 survey respondents who reported being pressured not to publish their work were:

Mostly students. One explanation is that as we go along in our careers, we get a better concept of what is good and valuable science, and make some of the decisions to jettison a project ourselves rather than being told to by a supervisor. We also become more and more crunched for time, meaning that we make more of these types of prioritizations before it gets to the point of having someone else weigh in.

But that’s not the only explanation. Let’s look first at cases where a direct supervisor was involved. With 32 responses of this type, it was about twice as common in my dataset as pressure from someone external. In these supervisor-related cases, it was most frequently a tenured or tenure-track professor discouraging a graduate student from publishing a chapter of their thesis or dissertation.
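The arithmetic behind these counts is simple; here is a quick sketch using only the numbers quoted in this post (184 respondents, 40 reporting pressure, 32 of those pressured by a direct supervisor):

```python
# Shares of respondents at each step, from the counts quoted in the text.
respondents = 184
pressured = 40        # reported being discouraged from publishing
by_supervisor = 32    # of those, the pressure came from a direct supervisor

pressured_share = pressured / respondents      # about 0.22 of all respondents
supervisor_share = by_supervisor / pressured   # 0.80 of the pressured group
```

Remember, though, that because of how the survey was advertised, these shares describe this self-selected sample, not scientists in general.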

And as discussed in part one, part of the issue was that these driven supervisors were strapped for time and transferred their own expectations about significance of results and journal quality onto their students, even if students would have been happy settling for a lower-impact publication.

Sometimes this is very appropriate, sometimes less so. Where this line is drawn probably depends on your goals in science.

“A paper was published, but it excluded the results that I found the most interesting because they were not in line with the story that my advisor wished to push,” one respondent wrote of bachelors thesis research. “Instead, results from the same project that I thought were not well thought-out were published in a way that made them seem flashy, which seemed to be the main goal for my tenure-track advisor.”

Another respondent had a similar story with a different ending, about work done as part of a masters thesis.

“The situation was not resolved; I just ended up not publishing,” he/she wrote. “I wanted to publish, as I considered the results to be high-quality science and the information very useful to disseminate, but I could not agree to change the research focus entirely to suit my supervisor’s personal interests.”

You can see both sides of the coin in some cases. What is the goal? To advance scientific theory and knowledge, or to share system-specific data that might help someone in the future? Ideally, a manuscript does both, but sometimes that’s not possible, and just the second is still a good aim. In some cases the supervisor is probably guiding the student towards using their data to address some question larger than the one they had initially considered. But, as the bachelors thesis respondent noted, it’s not always appropriate to do so – some people think that overreaching and drawing conclusions from data not really designed to support them is a big problem in some fields.

“Some datasets and analyses I have collected and analysed don’t tell a clear story that would be readily publishable given the current state of how research articles are assessed for impact thus I tend to move on to things that tell a better story,” wrote another respondent. “This feels disingenuous at times though perhaps it is how science moves forward more quickly.”

A surprising amount of the time, supervisors discouraged students from publishing because the results turned out to not support their hypothesis. This was actually the most common single reason that a supervisor told a student not to publish. I may be naive, but it’s hard for me to think of a situation in which this is not just straight-up bad.

I was quite explicit in asking whether the results did not support “our” hypothesis, or whether they did not support a supervisor’s, department’s, or company’s hypothesis. Sometimes the two overlapped, but most of the time when this happened the respondent selected the second option: the researchers themselves might not have been surprised by the results, but the supervisor, lab group, or company did not like them.

(About 60% of the 32 responses came from ecology and evolution, but many also came from other fields.)

This really surprised me. In our training as scientists it is drilled into us that we might learn as much from a null result or a reversal of our hypothesis as we would if our hypothesis was supported – maybe even more, because it tells us that we have to carefully look at our assumptions and logic, and can lead us down new and more innovative paths.

In the U.S. at least, a substantial proportion of the population just has no respect for science. Whether it’s climate change deniers or anti-vaxxers, as a science community we tell them: go ahead, prove us wrong! Science is very open to accepting data that disproves something we had previously thought was true. We try to tell the public that we are not closed-minded, that we are following evidence, and that if the evidence showed us something else, we’d still accept it.

On some small scale, that might not be true, and it’s very troubling. Without knowing more about the research in question here, it’s impossible to say much more. But it’s not a very inspiring trend. And again: this pressure was coming from direct supervisors who were mostly in academia and shouldn’t have had a financial or political conflict of interest or anything like that.

And it also has potentially big implications for the sum of our community’s knowledge. Luckily there are so many researchers out there that probably someone else will ask the same question and publish it eventually, but this sort of attitude can delay learning important and valuable things.

“Unfortunately it’s hard to tell what could become interesting later, or what could be interesting to another researcher, so it’s too bad that these results never see the light of day,” wrote one early-career biologist. “What’s more concerning to me is the tendency of some researchers in my field to ignore or leave out results that they can’t explain, or worse, that contradict their pet hypothesis.”

When pressure came from an external source – someone not supervising the study respondent – this reason for discouraging publication was even more prevalent: almost half of those respondents cited the data not supporting someone’s hypothesis.

[Chart: reasons for external pressure not to publish]

Relatedly, the person doing the pressuring was often afraid that the results would make them, their group, or the government look bad. In other words, these are classic cases of repressing research – the worst-case scenario that we think of!

Governments are Not Always Great (for Science)

Sometimes, this external pressure came from within academia, but it was also often from governments.

“Yes, the results were published, yes it created an public uproar, yes all authors were chastised by the agency and external company, and yes all subsequent follow-up research papers on the topic were expressly forbidden,” wrote one federal government employee. “There are considerable research accomplished by state and federal government agencies. Much of those data results never see the light of day because the results may be divergent from what the chain of command’s perspective or directive may be, I.e. support the head official’s alternative energy, logging harvest, endangered species delisting, stream restoration, etc. policy.”

It’s clear that one place where state, local, and federal government officials can be particularly destructive is Canada. Apart from the cuts to research funding which have been hitting many countries, it’s been discussed by people far more knowledgeable than I that the government literally muzzles its scientists by not allowing them to talk to the media, among other policies: see here, here, and here.

Here’s what one anonymous survey respondent had to say: “The Canadian government has been muzzling scientists for years…I was just the latest in their ‘Thou Shalt Not Publish’ scheme. If the research you’re doing will make them look bad in any way, you’re not allowed to publish the results without fear of massive repercussions: job loss, degree removal, job losses of your superiors if they can’t fire you, being blacklisted in the scientific community, being blacklisted for grants, etc.”

Multiple survey respondents cited the Canadian government. So, about those elections coming up….

Consultants and researchers in the corporate/industrial sectors are often muzzled as well, but many of them are aware of this from the time they are hired.

“It is simply understood that if the research results from work we do for clients are inconvenient, they will attempt to redact the reports as trade secrets,” wrote one consultant. “They own the data so they are often able to do this. But not always.”

But even if companies are upfront about data ownership policies, it can still feel tough. One person told me that it was discouraging not to be able to get a patent and credit for his/her work, because a company owned all the intellectual property rights and would keep the discovery proprietary and secret until it was no longer profitable to do so.

In a variety of fields, there’s also some crossover between the industrial and academic sectors of research. Companies often provide funding to students or research groups working in an essential location or on a related topic. The companies shouldn’t be able to use their influence to suppress results, but in some cases they do seem to.

This is actually what happened in the case that inspired me to create my survey: the International Association of Athletics Federations squashed survey results showing that a huge proportion of championships competitors were doping. They were not involved in the research itself, but had provided access to the athletes, and thus felt it was their prerogative to police the results.

One survey respondent said that he had been let go from his position after publishing research about the effects of pesticides, and had heard a researcher with industry ties imply that the same thing would happen to someone else publishing similar research.

Several people in environmental and earth sciences fields mentioned this happening to people that they knew or had talked to, but it’s hard to pin down other than in news stories.

We Can Be Our Own Worst Enemies

Finally, other politics are more about internal power dynamics, be it within a department or within a research field.

“A person, invited late to the project, was asked to provide simple review in return for coauthorship,” wrote one respondent. “They hijacked the project and it is still unpublished four years on.”

It’s pretty tragic to see a good experiment, or maybe a whole grant that some agency spent hundreds of thousands of dollars on and researchers spent years of their lives on, get derailed by interpersonal problems and arguments about data ownership or authorship.

In many fields the community of specific experts is fairly small, so you are likely to have to work with people again, or have them review your papers, and so on. The problems are hard to resolve once they begin.

It was also clear that sometimes people nixed manuscripts because they didn’t understand the science or the value of the work. Sometimes this meant a bureaucrat at a funding agency, but sadly, sometimes it also came from within the scientific community itself.

“Because my scientific community is so small, in some cases only one review has been given by a local expert, and of course the editors don’t have time to fact-check, but my paper will not be accepted because these few experts are, as I perceive it, not wanting recent data contrary to results from their systems to be published, and assume that someone with an M.Sc. cannot be a diligent scientist, in many cases providing lots of evidence in reviews that they have not read the manuscript with care… possible skipping entire sections,” wrote one student.

There’s even outright theft sometimes.

“The results were made partially public at a conference,” wrote one researcher. “Another researcher who has hard feelings towards my former supervisor, and viceversa, started to use the date as if it was a ‘public domain information’ and later my supervisor considered that the publication is not worth going out. The problem has not been resolved yet.”

A Reminder

This has been, in some ways, a tour of the scientific research community at its worst. We all know someone who has had some terrible experience with their research.

But many of us have had relatively happy tenures in science and research. At least in my field, ecology, I can say that the vast majority of people are good people and fun to work with. It’s part of what I love about my job. If the only people around me were those who stole results, bullied me into not publishing, constantly asked me to change the focus of my research, or demeaned what I did because I was a graduate student, I would quit.

But here I am, and I’m happy! Such people do not make up the majority in our fields. But it’s worth remembering that even one major interaction like this can seriously discourage people from continuing to do research. There are lots of other jobs out there, and if the research environment is malevolent it’s easy to feel that the grass is greener on the other side.

So: with the knowledge that there is some scummy behavior going on, can we try to be nicer and kinder to one another? After all, our goals are to advance scientific knowledge and to create more capable, creative, and conscientious scientists.

Thanks to all who participated in the survey. I hope it has been interesting and helpful to read about.

the contagion of perfectionism & the scientific publication bias.

A few weeks ago I sent out a survey to many of my scientist friends. I wanted to know: why does some research stay unpublished? Those outside academia or research might think that science always proceeds in a linear fashion. A person does a study, they publish it, now it is out there for other scientists to reference. Once research is performed, it is a known quantity. But that’s not necessarily true.

For any number of reasons, a fair chunk of research never makes it to the publication stage. Sometimes it’s because it’s bad research: it is biased or the methods are bad, so during the peer review process the paper is rejected. This might not be through any real fault of the scientists. The problems might have only become apparent after the research was completed. This is pretty inevitable, and can lead a research group to design a second study that is much better and really gets at their question.

But does all the good research even get published? No, definitely not. There’s research out there that remains unpublished even though it probably could have been.

Some possible reasons for this are that the researchers ran out of time to write up the results, or the results just didn’t seem very interesting, or their hypothesis was rejected. For these reasons people might choose to focus on another project they had going at the same time. But that leaves a gap in the record of published science: results with bigger effect sizes are published proportionally more often than null results. Results with no effect might be left in a drawer to be published later, or never.

This phenomenon is called the “file drawer effect” and is a major contributor to publication bias, which is problematic for many reasons that I’ll discuss later. Here’s a nice paper on the file drawer effect.

With my survey, I wanted to get at why people don’t publish. First I asked about how much research they leave in the file drawer, so to speak, out of their own choice. Then I asked how often other people pressured them to avoid publishing, and why. I’ll get to that second question in part two of this post.

First, as a caveat before getting to what people told me. The responses certainly don’t represent the whole science community, and I can’t draw any conclusions about frequency of the types of things I’m asking about. I had 182 responses, which is not a lot, and the majority were from ecologists and evolutionary biologists relatively early in their careers. The survey was spread by word of mouth so this is just a function of who I know.

Here’s some data on who responded to my call:

[Chart: respondents’ fields]

[Chart: age of researchers]

(I’ll also add that most respondents were in academia, but there were also some who worked for government research institutes or companies. I’ll get more into that in part two, but for now I’m going to write primarily from the academia perspective. Just be aware that there are some non-academia responses in here as well.)

Now. One of the first questions I asked was, “How many papers or reports worth of results of your own work remain unpublished, by your own choice?”

It’s obvious now that I could have worded this a little bit better. Some people included all of their unpublished work in this answer, while others said that just because something hadn’t been published yet didn’t mean that it would never be published, and left some work out of the count. They may be right about that: some work does eventually get published years later, when researchers finally have a chunk of free time and nothing “more important” to do.

(“Pressure was not direct – just lack of support to move the paper forward,” one survey responder wrote of his/her supervisor. “Ultimately he approached me to finally publish the work – after more than 20 years!”)

But that data that you swear you will write up one day can also remain in the file drawer indefinitely.

In any case, here’s what I found:

[Chart: unpublished datasets, by researcher’s own choice, vs. years doing research]

The other thing that is unclear in the results is how realistic it is to publish all of these datasets – are they each an individual paper, for instance? Some people take a dataset and divide it into as many pieces as possible so that they can get the most publications out of it, when in reality publishing all the data together would have made a more interesting, meaningful, and high-impact single manuscript. So has someone doing research for six years really accumulated ten unpublished datasets? Perhaps they have. Meanwhile, I am impressed by the few people who had been doing research for 25 or even 40 years and had seemingly published every worthwhile dataset they had ever collected. These people must be writing machines. (And I say that as someone who writes quite a lot!)

Adding a regression line is probably inappropriate here, but let’s just say that in the first ten years of their career (depending on how they are counting, this is a bachelors thesis, a masters, a PhD, and maybe a postdoc or two), many people accumulate four or five studies that they could have published, but they didn’t. After that they might accumulate one every five or ten years. It makes sense that more of the unpublished papers come early in the career because people aren’t yet adept or fast at writing papers. They also don’t have as much experience doing research, so data from projects like bachelors theses often go unpublished because of flaws in study design or data collection. These mistakes are what eventually lead us to learn to do better science, but they can keep a piece of research out of a top journal.

As of 2013, there were about 40,000 postdocs in the United States. Add that to the rest of the world and there’s potentially a lot of unpublished research out there – clearly over 100,000 datasets worth! (Is that good or bad? Both, and I’ll get to that later.)

The answers are partially biased, I am sure, by the differences in productivity and funding between different researchers. This might depend on what kind of appointment the researcher has – is their job guaranteed? – and how much funding they have. A bigger lab can generate a lot more results. But someone still needs to write them; labs might go through phases where the writing falls primarily on the PI (principal investigator, a.k.a. lab head) or other phases where there are highly productive postdocs or precocious PhD students who also get a lot of papers out the door.

And one of the biggest constraints is, of course, time. With pressure to publish your best research in order to get that postdoc position, to be competitive for a tenure-track job, and to eventually get tenure, if researchers have to choose between publishing a high-impact paper and low-impact one they will certainly focus their energies on the high-impact results. The results that were confusing or didn’t seem to show much effect of whatever was being investigated might stay in that file drawer.

One thing that was clear is that this problem of time as a limiting resource is contagious. Later, I asked people if they had ever been discouraged from publishing something which they had wanted to publish. Of the 32 cases where a supervisor discouraged publishing, two answers as to why emerged as particularly common.

[Chart: context of the perfectionism responses]

Let’s look at the “more data was needed” issue first. What I offered as a potential response in the multiple choice question was, “My supervisor thought that we needed to collect more data before publishing, even though I thought we could have published as-is.”

In some cases, the supervisor might be right. Maybe more data really was needed, maybe the experiment needed to be replicated to ensure the results were really true, maybe the team needed to do a follow-up experiment or correct some design flaws. After all, the supervisor should have more experience and be able to assess whether the research is really good science which will stand up to peer review.

“Simply, what I thought were publishable results were probably not worth the paper it would be printed in,” one responder wrote of research (s)he had done during a bachelors thesis, but which the supervisor had not supported publishing. “The results did serve as the basis for several other successful grant applications.”

But at the same time: as Meghan Duffy recently noted on Dynamic Ecology, perfect is the enemy of good. That can go for writing an email to your lab, and also for doing experiments. In the discussions of her blog post, someone noted that “perfect is the enemy of DONE” and Jeremy Fox wrote that often graduate students can get into the rut of wanting to just add one more experiment to their thesis or dissertation, so that it is complete, but at some point you just have to stop.

“I have not directly been pressed not to publish, but I have 2 paper drafts which have not been published yet,” wrote another respondent. “I wrote them as a PhD student and now I think they will be published, but I have the feeling that for some period, one of my supervisors did not want to publish them because it was just correct but not perfect enough.”

If more data should be collected, but probably never will be, does that mean that the whole study should sit in the file drawer? If it was done correctly, should it still be published so that other people can see the results, and maybe they can do the follow-up work? Different researchers might have different answers to this question depending on how possessive they are of data or an idea, or what level of publication they expect from themselves. But if a student, for example, is the primary person who did the research, their opinions should be taken into account too.

Why is this publication gap a problem?

That gets into the second idea: that as-is, the research won’t be accepted into a top journal.

Ideally, this shouldn’t matter, if the research itself is sound. There are plenty of journals, some of them highly ranked, which accept articles based more on whether the science is good and the methods correct, rather than whether the results are groundbreaking.

(Unfortunately, these journals often have big publication fees, whereas highly-ranked journals demand a large time investment but publication is free. PLOS ONE, one of the best-known journals that assesses study quality rather than the outcome, is raising its publication fee from $1350 to $1495, which must be paid by the author. For some labs this doesn’t matter, but for other, less-flush research groups the cost of open-access publishing can definitely deter publication.)

It is important to get well-done studies with null results out there in the world. Scientific knowledge is gathered in a stepwise fashion. Other scientists should know about null results, arguably just as much or more than they should know about significant results. We can’t move knowledge forward without also knowing when things don’t work or don’t have an effect.

Here are two quick examples. First, at least in ecology and evolution, we often rely on meta-analyses to tell us something about whether ideas and theories are correct, or, for example, how natural systems respond to climate change or pollution. The idea is to gather all of the studies that have been done on a particular topic, try to standardize the way the responses were measured, and then do some statistics to see whether, overall, there is a significant trend in the responses in one direction or another. (Or, to get a little bit more sophisticated, to see why different systems might respond in different ways to similar experiments.) This both provides a somewhat definitive answer to a question, and makes it so that we can track down all of the work on a topic in one place rather than each scientist having to scour the literature and try to find every study which might be relevant.

If only 20% of the scientists studying a question find a significant effect, but these are the only results which get published, then literature searches and meta-analyses will show that there is, indeed, a significant effect – even if actually, across all the studies which have been done (including the unpublished ones), it’s a wash. Scientific knowledge is hindered and confounded when this happens.
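To see how this plays out, here’s a minimal simulation – a sketch, not from my survey data – of a world where the true effect is zero, but only studies finding a significant effect in the hypothesized direction ever make it out of the file drawer:

```python
import random
import statistics

random.seed(42)

def run_study(true_effect=0.0, n=30):
    """Simulate one study: a sample mean and a crude one-sided z-test."""
    sample = [random.gauss(true_effect, 1.0) for _ in range(n)]
    mean = statistics.mean(sample)
    se = statistics.stdev(sample) / n ** 0.5
    # "Significant" here means significant in the hypothesized (positive)
    # direction, roughly p < 0.05 one-sided -- an assumption of this sketch.
    significant = mean / se > 1.96
    return mean, significant

studies = [run_study() for _ in range(1000)]

all_effects = [m for m, _ in studies]
published = [m for m, sig in studies if sig]  # only significant results reach print

print(f"Mean effect across all studies:       {statistics.mean(all_effects):+.3f}")
print(f"Mean effect across published studies: {statistics.mean(published):+.3f}")
```

The pooled effect across all 1000 simulated studies sits near zero, as it should, while a meta-analysis restricted to the “published” subset sees a clearly positive effect that doesn’t actually exist.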

A second example. When you are designing a study, you search the literature to find out what has been done before. You want to know if someone else has already asked the same question, and if so, what results they found. You also might want to know what methods other people use, so that you can either use the same ones, or improve them. If research is never published, then you might bumble along and make the same mistakes which someone already has made. The same flawed study might be performed several times with each person realizing only later that they should have used a different design, but never bothering to disseminate that information. (And sure, you can ask around to find unpublished results, but if there’s no record of someone ever studying a topic, you’re unlikely to know to ask them!)

Almost everyone in the scientific community acknowledges that the publication bias towards positive or significant results is problematic. But that doesn’t really solve the problem. It’s just a fact that null results are often much harder to publish, and much harder to get into a good journal. And considering the pressure that researchers are under to always shoot for the highest journals, so that they can secure funding and jobs and advance their careers, they are likely to continue neglecting the null results.

“I think a lot of pressure comes from the community rather than individuals to avoid publishing negative results,” one early-career ecologist wrote in a comment. “I think negative results are useful to publish but there needs to be more incentives to do so!”

This pressure can be so great that, I was told in a recent discussion, having publications in low-impact journals can actually detract from your CV, even if you have high-impact publications as well. Two candidates with the same number of Ecology Letters or American Naturalist or even Nature papers (those are good) might be evaluated differently if one of them has a lot of papers in minor regional or topic-specific journals mixed in. Thus, some researchers opt for “quality not quantity” and publish only infrequently, and only their best results. Others continue to publish datasets that they feel are valuable even if they know a search or tenure committee might not see that value, but consider leaving some things off their CV.

One thing I’d like to mention here is that with the “contagion”, students are sometimes affected by their supervisors’ standards of journal quality. While a tenure-track supervisor may only consider a publication worthwhile if it’s in a top journal, a masters student may benefit greatly from having any publication (well, not any, but you see my point) on their CV when applying for PhD positions. I also know from my own experience that there is incredible value, as a student, in going through the publication process as the corresponding author: learning to write cover letters, respond to reviewer comments, prepare publication-quality figures, etc. Doing so at an early stage with a less-important manuscript might be highly beneficial when, a few years later, you have something with a lot of impact that you need to shepherd through the publication process.

There are many good supervisors who balance these two competing needs: to get top publications for themselves, but to also do what is needed to help their students and employees who might be at very different career stages. In many cases, of course, supervisors are indeed the ones pushing a reluctant graduate to publish their results!

Unfortunately, this is not always the case. Again, because of the low number and biased identity of survey participants I can’t say anything about how frequently supervisors hinder their students in publishing. But I think almost everyone has some friend who has experienced this, even if they haven’t themselves.

“I have been in the situation where a supervisor assumed that I would not publish and showed no interest in helping me publish,” wrote one responder. “As a student, being left hanging out to dry like that is rough – might as well have told me not to publish.”

“Depending on the lab I’ve been in, the supervisory filter is strong in that only things deemed interesting and important by them get the go ahead to go towards publication,” wrote another. “Thus, the independence to determine what to publish of the multiple threads of my research is lacking in certain labs and management structures.”

That obviously feeds into the publication bias. So how do we get past it, in the name of science? There aren’t a lot of answers.

Why is the publication gap maybe not so bad?

At the same time, it’s clear that if all this research (100,000 or more papers!) were submitted for publication there would be some additional problems. Scientific output is roughly doubling every nine years. There are more and more papers being published; there are more postdocs (though fewer tenure-track professor positions) in today’s scientific world, and I’m pretty sure the number of graduate students increased after the “Great Recession”, about the time when I was finishing my bachelors degree and all of a sudden many of my classmates’ seemingly guaranteed jobs disappeared.
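A quick back-of-the-envelope check on what that doubling time means: output doubling every nine years implies a compound growth rate of about 8% per year, which over a typical career adds up fast.

```python
# If output doubles every 9 years, the implied annual growth rate is 2^(1/9) - 1.
annual_growth = 2 ** (1 / 9) - 1
print(f"Implied annual growth: {annual_growth:.1%}")  # roughly 8% per year

# Compounded over a 30-year career, that multiplies the volume of papers by:
print(f"Growth over 30 years: {(1 + annual_growth) ** 30:.1f}x")  # about 10x
```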

This puts a lot of stress on the peer review system. Scientists are not paid to review research for journals, and reviewing may or may not be included as a performance metric in their evaluations (if it is, it’s certainly not as important as publishing or teaching). With more and more papers being submitted more and more reviews are needed. That cuts time out of, you guessed it, doing their own research. It’s a problem lots of people talk about.


Others lament that with so many papers out there, it’s getting harder and harder to find the one you need. Science is swamped with papers.

Even without publishing in a journal, there are other ways to find data. For instance, masters theses and PhD dissertations are often published online by their institutions, even if the individual chapters never make it into a peer-reviewed journal (perhaps because the student leaves science and has no motivation to go through the grueling publication process). But this type of literature can be harder to find, and is not indexed in Web of Knowledge, for example. So if it’s the data or methods you need, you might not find it.

Reconciliation?

I’m not particularly convinced by the argument that there’s too much science out there. Research is still filtered by journal quality. Personally, I read the tables of contents for the best and most relevant journals in my field. I also have Google Scholar alerts set for a few topics relevant to my research, so that when someone publishes something in a place that would be harder to find, I know about it. This has been useful. I’m glad they published it, even if it’s in an obscure place.

With that in mind, I wonder if there is a way to publish datasets with a methods description and some metadata but without having to write a full paper.

There are, of course, many online data repositories. But I don’t believe people use them for this purpose as much as they could. It is now becoming common for journals to require that data be archived when a paper is published, so much of the data in these repositories is simply data that actually already has been published. In other cases people only bother with publishing a dataset as-is if it is large or has taken a lot of time to curate, and might be of particular interest and use to the community. Smaller datasets of pilot projects or null results are not often given the same treatment.

And while published datasets are searchable within the individual repositories, they don’t show up in the most common literature search tools, because they aren’t literature: they are just data.

Is there a way that we could integrate the two? If you have five papers’ worth of data that you don’t think you’ll ever publish, why can’t we have a data repository system which includes a robust methods and metadata section, but skips the other parts of a traditional manuscript? If this were searchable like other kinds of literature, it could contribute to more accurate meta-analyses and a faster advancement of science, because people would be able to see what had been done before, whether it “worked” and was published with high impact or not. The peer review process could also be minimal and, as with code or already existing data archives, these data papers could have DOIs and be citable.

But I’m not sure if this is realistic (and honestly, I haven’t thought through the specifics!). Science seems slow to change in a lot of ways. Methods change fast. Open access and online-only publishing have swept through to success. But creative ideas like post-publication review, preprints, and other innovations have been slower to catch on. These types of ideas tend to generate a group of fierce supporters, but to have a difficult time really permeating the scientific “mainstream”.

The scientific community is big – how can we change the culture to prevent our large and growing file drawers full of unpublished results from biasing the literature?

Stay tuned for part two of this series, about other reasons that people are pressured not to publish results – for instance, internal or external politics, competing hypotheses, stolen data. Part two will be published later this week. If you want to take the survey before it goes up, click here.

recent sciencing.


After a conference in Uppsala, Sweden, I had the chance to catch up with a lot of friends from my masters program, many of whom recognized how totally awesome Sweden is and decided to stay in Uppsala to do their PhD’s! left-right: Willian from Brazil, me, Lore from Mexico, Sergio from Colombia. It’s fun to be part of such an international bunch and I was so, so happy to see them all again.

I always tell people, when I’m interviewing for a job, that ski racing prepared me very well for the constant criticism and failure you experience in science. Every time you try to publish something, you receive harsh critiques during the peer review process, even if the paper is eventually accepted. The only way to improve is to continually solicit these beatdowns, then lick your wounds and try harder.

Skier or worker (or both), we’re all just busy little worker bees in the end…. trying industriously to make ourselves better at what we do. This bumblebee was having a fantastic time in the Uppsala botanical garden.

When that happens, I think back to training with the Craftsbury Green Racing Project, and surviving things I didn’t think I could survive. For instance: the one time Pepa asked us to do max-level 2-minute intervals on the SkiErg before eating anything in the morning, then gave us a quick breakfast, and then we rollerskied for like, four hours. Maybe some people went for five? When we got our instructions, I thought it was impossible that I’d finish. But I did.

So when I science hard and everything feels (temporarily) like a failure, I remind myself: you can get to the finish line. You are tough. Workouts like that one taught me that while I might never be the best (at ski racing, that’s for sure), I can do a lot of suffering and finish tasks that might seem impossible. Positive self-talk helps!

Workshops and conferences are generally a lot more fun than that workout was. If immersing yourself in scientific research – whether that’s by long and grueling trips to the field, toiling for hours over the lab bench, or frying your eyeballs coding on a computer – is like training, then emerging out of that world into a conference is like the first ski race of the year.

All of a sudden you get to check out what other people have been up to, and test your ideas and your data against theirs. Of course there’s no single winner at a workshop or conference, but that nervousness and excitement about seeing the community again, and revealing your activities to the experts in the hopes that they will be impressed, really does give a bit of the same feeling as West Yellowstone, the first Eastern Cup, or the first college carnival of the year. Alongside the job at hand, there’s also the fun of socializing: hearing about the new job someone has taken, seeing how much their kid has grown in a year, and rehashing stories from the past.

A bonus of the course in Fribourg was that we ate raclette for dinner in the botanical garden.

As a PhD student, I still feel like I have to prove myself every time I meet a big name. And at the workshop, in Fribourg, Switzerland, there were times when I felt like, gosh, I’m never going to make it in this world.

One of the lecturers does research about some of the same questions that I study, and is even using some of the same study organisms. He has done a lot of cool research in the past, and I’ve read many of his papers and cite them. It was a great opportunity to see him give a two-hour lecture on things that apply to me so directly.

But at the end of his presentation, he talked about a major new grant he had received to replicate his research in four other countries, to use experimental ponds all throughout Europe, and do a few other things. He was answering my questions, but on a vast, globally-replicated scale!

(Briefly, we study freshwater science, so organisms like insect larvae, crustaceans, and fish that live in streams. Also ponds and lakes and rivers, but mostly streams.)

And here I am in Switzerland. He will have dozens of streams all over the world; I am working so hard to check on ten streams near the German border. How can I possibly compete?

“Maybe I should just go get a job at McDonalds,” I joked to another workshop participant.

And yet, the workshop was really helpful and at other times made me feel like all of my ideas were coming together into something that could be really meaningful. I began to see how my data from those ten streams could link together with experiments I am doing in the lab, and go into one big framework to assess ecosystem functioning and explore various future land use schemes.

There were also some comical moments. Students were given the opportunity to bring a poster to the workshop, but in the end nobody else did besides me. That’s like getting to a race and realizing you’re the only entrant. Would you still do it?

Probably you would – we’re all masochists, looking for a good workout, and chances are you would have driven a long way to get to the race – but it would definitely feel silly.

Instead of having 20 people clustered around one poster, the organizers asked me to put the powerpoint of my poster up on the projector and give a ten-minute presentation in front of the whole group in the lecture hall. I hadn’t prepared anything, but I tried to give a concise and not-too-rambling summary of what I’m up to. The responses were positive, even from the guy with millions of dollars in research funding and study sites all over the world.

So I left the workshop feeling that I hadn’t “won”, but that I now had a much better idea of what else I needed to do when I got home and what I needed to focus on in “training”. I had a vision for how to succeed.

presentation

This talk won me $50!

Fast forward a few days and I was in Sweden for the International Tundra Experiment (ITEX) 2015 meeting. Climate change researchers from around the world gather every year or two to report results and work on synthesis papers where they pool a lot of data from alpine, low arctic, and high arctic research to make solid conclusions about the effects of climate change on these plant communities.

The conference was a bit of an emotional rollercoaster. I was giving a talk, and was quite nervous: after all, it’s a quite specialized conference, so you know that everyone in the audience knows as much or more than you do about the tundra. If you do something wrong, or don’t know what you’re talking about, you can’t really hide.

At a lot of bigger conferences, people might be more expert than you in general theories and ideas, but probably only a few know your study system inside out. But at a conference like ITEX, every single one of them does. It ups the stakes a little bit. The nerves!

On top of that, the first speaker in my session talked for twice his allotted time period! The moderator was trying to cut him off, but he just kept going. By the time I gave my talk people had been sitting in the room for an hour and a half and were supposed to be on a coffee break already. There’s no way they will stay focused, I thought. I’m screwed.

But the talk, titled “Changes in process, not pattern, after a decade of warming in Adventdalen tundra vegetation” and based on research I did for my masters degree on the polar island of Svalbard, went really well. I actually won the second prize for student talks.

(The winner talked about release of biogenic volatile organic compounds – the chemicals that make fruit smell good, attract pollinators to flowers, or deter herbivores – and for successfully explaining biochemistry to plant ecologists, she definitely deserved the prize! I’m going for the win next time though.)

The next day, the paper I had written based on that same data came back from a prominent ecology journal…. with a rejection. I mean, I hadn’t thought the paper was the greatest thing in the world, but I had been happy with the draft I had submitted and proud that I had made the whole thing myself. I thought I had good ideas. The message was that… I didn’t.

But here’s the thing about science: even while eviscerating my narrative, the reviewers gave some incredibly helpful suggestions and made it clear that they thought that the work was valuable and should be published. There was the stick, but also the carrot. It’s like a coach saying, “I see potential!”

And that gets back to the positive self-talk. In ski racing and in life, isn’t so much of it ultimately about convincing ourselves that we have potential?

first big-girl paper!

In case you missed my facebook/twitter/researchgate/everything blitz, I finally published my first first-authored paper! It is in Oecologia, a good general ecology journal. I’m really happy and proud of myself, and a number of people have told me that this is the paper you’ll be happiest and most satisfied to publish, ever. I’m certainly enjoying the new addition to my CV.

Here’s a link to the paper, and here’s an abstract:

“Alpine plant communities are predicted to face range shifts and possibly extinctions with climate change. Fine-scale environmental variation such as nutrient availability or snowmelt timing may contribute to the ability of plant species to persist locally; however, variation in nutrient availability in alpine landscapes is largely unmeasured. On three mountains around Davos, Switzerland, we deployed Plant Root Simulator probes around 58 Salix herbacea plants along an elevational and microhabitat gradient to measure nutrient availability during the first 5 weeks of the summer growing season, and used in situ temperature loggers and observational data to determine date of spring snowmelt. We also visited the plants weekly to assess performance, as measured by stem number, fruiting, and herbivory damage. We found a wide snowmelt gradient which determined growing season length, as well as variations of an order of magnitude or more in the accumulation of 12 nutrients between different microhabitats. Higher nutrient availability had negative effects on most shrub performance metrics, for instance decreasing stem number and the proportion of stems producing fruits. High nutrient availability was associated with increased herbivory damage in early-melting microhabitats, but among late-emerging plants this pattern was reversed. We demonstrate that nutrient availability is highly variable in alpine settings, and that it strongly influences performance in an alpine dwarf shrub, sometimes modifying the response of shrubs to snowmelt timing. As the climate warms and human-induced nitrogen deposition continues in the Alps, these factors may contribute to patterns of local plant persistence.”

DSCN0427

new paper.

A new paper I co-authored with my masters supervisor Juha Alatalo is out in Scientific Reports (he’s the first author, but my day is coming soon! stay tuned in the next few months!). It’s called “Vascular plant abundance and diversity in an alpine heath under observed and simulated global change.” Because SR is open access, you can read it! Click here for the PDF.

It’s based on an old dataset from Latnjajaure, Sweden, which I analyzed as part of a 15-credit “research training” course in my masters. I only later had the chance to spend a few weeks at the Latnja field station, and it was absolutely one of the most beautiful places I’ll ever have the chance to do fieldwork. Getting the email that the paper was published made me think back on my summer experience there! Here are a few photos to get you in the mood.

IMGP6421

IMGP6385

IMGP6460

IMGP6485