Looking for meaningful (mostly auditory) things in the human brain, and then going to the pub. #mri #meg #neuroscience #pub
19 stories · 1 follower

You don’t have my permission


Don’t ask for permission, come with intent.

I don’t read many business books, but last year I read one that had a profound effect on me: “Turn The Ship Around” by L. David Marquet. I guess it’s not really a business book, which is probably why I liked it.

Here’s how it’s described on Amazon:

“Leadership should mean giving control rather than taking control and creating leaders rather than forging followers.” David Marquet, an experienced Navy officer, was used to giving orders. As newly appointed captain of the USS Santa Fe, a nuclear-powered submarine, he was responsible for more than a hundred sailors, deep in the sea. In this high-stress environment, where there is no margin for error, it was crucial his men did their job and did it well. But the ship was dogged by poor morale, poor performance, and the worst retention in the fleet.
Marquet acted like any other captain until, one day, he unknowingly gave an impossible order, and his crew tried to follow it anyway. When he asked why the order wasn’t challenged, the answer was “Because you told me to.” Marquet realized he was leading in a culture of followers, and they were all in danger unless they fundamentally changed the way they did things. That’s when Marquet took matters into his own hands and pushed for leadership at every level.
Turn the Ship Around! is the true story of how the Santa Fe skyrocketed from worst to first in the fleet by challenging the U.S. Navy’s traditional leader-follower approach. Struggling against his own instincts to take control, he instead achieved the vastly more powerful model of giving control. Before long, each member of Marquet’s crew became a leader and assumed responsibility for everything he did, from clerical tasks to crucial combat decisions. The crew became fully engaged, contributing their full intellectual capacity every day, and the Santa Fe started winning awards and promoting a highly disproportionate number of officers to submarine command.

The fundamental premise is what he realized: when people come to you for orders, or ask your permission to do something, they don’t bring any of their own responsibility to the request. They’re asking you if they can xyz. That puts it on you. They don’t have to fully consider their ask because they still need you to OK it. You’re their doorstop, just in case. So it’s not about them and what they want to do, it’s about what you are OK with them doing. And even if you OK it, it only happened because you said it could happen. That creates too many dependencies, and — like Marquet — I believe people and teams within an organization should be able to move independently of one another. Fewer dependencies, not more.

So instead of asking permission or seeking orders, he told his sailors to come to him with intent. Instead of “Captain, may I turn the ship starboard 30 degrees?” (which asks the Captain to OK the command), he wanted people to come to him saying “Captain, I’m going to turn the ship starboard 30 degrees.” In just a few words, everything’s different.

“May I?” pushes all the power and responsibility to the person granting the permission. “I’m going to” squarely puts the responsibility on the person who’s going to carry out the action. When the person doing the work is the person who has to live with the consequences, they tend to think more completely about what they’re about to do. They see it from more angles, they consider it differently, they’re more thoughtful about it because it’s ultimately on them. When you don’t have permission, it’s on you to make the call.

Now, Captain Marquet still often wanted to hear the intent — especially when the outcome affected the whole ship, and especially early on as he was implementing the new system — but ultimately this is about involving one brain less (Marquet’s), and a hundred brains more (the rest of the people on the ship).

As the CEO of Basecamp, I’ve taken this to heart. I’m still working on it — it’s a big shift at times and I often have to push back against some long-standing habits — but I don’t want people asking me for permission. In almost every case, if someone asks me for permission, something’s wrong.

People shouldn’t ask me if they can do this or that. I want people to tell me what they intend to do. If they want to hear my thoughts about their intention, let’s talk! Let’s riff! Let’s work through it. But don’t ask me if you can do this or that — tell me what you’re going to do so I can cheer you on, help you out, ask a question, or suggest another approach that may be worth considering. But if it won’t happen unless I say so, I won’t say so.

None of this means I don’t provide feedback, or direction, or guidance, or vision, or purpose. None of this means I can’t disagree — strongly at times. And none of this means if I see you’re about to jump off a cliff I won’t stop you.

But it does mean that generally, most of the time — and hopefully more and more of the time — people will get better and better at thinking things through completely, building the confidence to stand behind their convictions, and taking full responsibility for the calls they make.

I’m not talking about a free-for-all. I’m talking about a think-for-all.

Of course there are always exceptions. Captain Marquet reserved one order for him and him alone — the order to kill. If the sub had to fire a weapon, if someone could die, it was his call. I’m still trying to figure out what my reserved orders are, but hopefully there are fewer and fewer over time.

If you haven’t read Turn The Ship Around, please do. It’s a wonderful, important book with great lessons and honest writing. I think you’ll really enjoy it.


You don’t have my permission was originally published in Signal v. Noise on Medium.


Today on the grammar rodeo: that vs. which


New Yorker copy editor Mary Norris explains when the magazine uses "which" and when it uses "that", a distinction I confess I had little knowledge of until just now.[1] A cheeky example of the difference by E.B. White:

The New Yorker is a magazine, which likes "that."
The New Yorker is the magazine that likes "which."

(via df)

[1] This is why, when anyone asks me what I do for a living, the answer is never "writer". Writing for me is a brute-force operation; I'll use whatever is necessary to make it sound like I'm talking with you in person. (Wait, is a semicolon appropriate there? Should I have used "as though" instead of "like"? Who gives a shit!) I use too many commas (but often not between multiple adjectives in front of nouns), too many "I"s, too many "that"s (OMG, the thats), too many weirdo pacing mechanisms like ellipses, dashes, & parentheses, mix tenses, overuse the passive voice, and place unquoted periods and commas outside quotation marks like the Brits, although I was doing it before I learned they did because it just seemed to make sense. So, anyway, hi, I'm not a writer...who writes a lot.

Tags: E.B. White   language   Mary Norris   video
2 public comments

bjtitus (Denver, CO):
I'm not sure I fully comprehend the which/that difference here. In both examples it seems like the clause could stand alone with extra nouns referring to the previous sentences.

satadru (New York, NY):
The best example of a restrictive vs non-restrictive clause I could find: https://img.buzzfeed.com/buzzfeed-static/static/2016-05/26/14/asset/buzzfeed-prod-fastlane01/sub-buzz-14227-1464288454-11.jpg?no-auto

Hot off the presses! Check out the cover of my new book: Women...



Hot off the presses! Check out the cover of my new book: Women In Science. Thanks to Ten Speed Press for sending me a copy! Very proud and excited. It hits stores July 26, but you can pre-order it here:
readwomeninscience.com


Slow Science


I love this painting by Carl Larsson. Here is a domestic scene of a mother and two children shelling fresh peas into an earthenware pot, pods heaping up on the floor. They are immersed in their work in companionable silence. They can anticipate a tasty seasonal meal. This is not opening a bag of frozen peas and boiling them for a few minutes. This is slow food, savoured for its own sake. The slow food movement, according to Wikipedia, started in the 1980s as a protest against the spread of fast food, and it grew rapidly with the aim of promoting local foods and traditional gastronomy. In August 2012 in Aarhus I first hit on the idea of slow science. Just as with food production, slowness can be a virtue: it can be a way to improve quality and resist quantity.
I had to give a short speech to celebrate the end of the Interacting Minds Project and the launch of the Interacting Minds Centre. Looking back on the preceding five years, I wondered whether the Project would have been predicted to succeed or to fail. I found that there had been far more reason to predict failure than success. One reason was that the procedures for getting started were very slow, so slow that they made us alternately laugh and throw up our hands in disbelief.

But what if the success of the Project was not despite the slowness, but because of it? Chris and I had been plunged from the fast-moving, competitive UCL environment in London into a completely different intellectual environment, one where curiosity-driven research was encouraged and competition did not count for much. After coming to Denmark for extended stays, at least one month every year since 2007, we have been converted. We are almost in awe of slowness now. We celebrate Slow Science.

Can Slow Science be an alternative to the prevalent culture of publish or perish? Modern life has put time pressure on scientists, particularly in the way they communicate: e-mail, phones, and cheap travel have made communication almost instant. I still sometimes marvel at the ease of typing and editing papers with search, delete, replace, copy, and paste. Even more astonishing is the speed of searching libraries and performing data analysis. What effect have these changes in work habits had on our thinking habits?

Slow Food and Slow Science are not about slowing down for its own sake, but about increasing quality. Slow science means getting into the nitty-gritty, just as podding fresh peas with your fingers is part of producing a high-quality meal. Science is a slow, steady, methodical process, and scientists should not be expected to provide quick fixes for society’s problems.

I tweeted these questions and soon got a response from Jonas Obleser, who sent me the manifesto of slowscience.org from 2010. He had already put into words what I had been vaguely thinking about.

Science needs time to think. Science needs time to read, and time to fail. Science does not always know what it might be at right now. Science develops unsteadily, with jerky moves and unpredictable leaps forward—at the same time, however, it creeps about on a very slow time scale, for which there must be room and to which justice must be done.

Slow science was pretty much the only science conceivable for hundreds of years; today, we argue, it deserves revival and needs protection. … We do need time to think. We do need time to digest. We do need time to misunderstand each other, especially when fostering lost dialogue between humanities and natural sciences. We cannot continuously tell you what our science means; what it will be good for; because we simply don’t know yet. 

These ideas have resurfaced again and again. In 2011, science journalist John Horgan posted “The Slow Science Movement Must Be Crushed” on his blog, with the punch line that if Slow Science caught on, and scientists started publishing only high-quality data that had been double- and triple-checked, then he would have nothing to write about anymore.

Does science sometimes move too fast for its own good? Or anyone’s good? Do scientists, in their eagerness for fame, fortune, promotions and tenure, rush results into print? Tout them too aggressively? Do they make mistakes? Exaggerate? Cut corners? Even commit outright fraud? Do journals publish articles that should have been buried? Do journalists like me too often trumpet flimsy findings? Yes, yes, yes. Absolutely.

I liked this, but not much more was discussed on blogs until the more recent so-called replication crisis. I wonder whether it has converted some more scientists to Slow Science. Earlier this year, the Dynamic Ecology blog had a post, “In praise of slow science”, which attracted many comments:

It’s a rush rush world out there. We expect to be able to talk (or text) anybody anytime anywhere. When we order something from half a continent away we expect it on our doorstep in a day or two. We’re even walking faster than we used to.

Science is no exception. The number of papers being published is still growing exponentially at a rate of over 5% per year (i.e. doubling every 10 years or so). Statistics on growth in number of scientists are harder to come by … but it appears …the individual rate of publication (papers/year) is going up.
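
A quick back-of-envelope check of that doubling claim (my own arithmetic, not from the quoted post): constant growth at rate r doubles output every ln(2) / ln(1 + r) years, so a 10-year doubling actually implies closer to 7% annual growth.

    # My own back-of-envelope check, not from the quoted post: constant growth
    # at rate r doubles output every ln(2) / ln(1 + r) years.
    import math

    for r in (0.05, 0.07):
        years = math.log(2) / math.log(1 + r)
        print(f"{r:.0%} annual growth: doubling every {years:.1f} years")
    # 5% gives ~14.2 years; a 10-year doubling needs roughly 7% growth,
    # so the "over" in "over 5% per year" is doing real work.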

There has been much unease about salami slicing to create as many papers as possible, and about publishing ephemeral results in journals with scanty peer review. Clearly, if we want to improve quality, there are some hard questions to be answered:

How do we judge quality science? Everyone believes their science is of high quality. It’s like judging works of art. But deep down we know that some pieces of our research are just better than others. What are the hallmarks? More pain? A more critical mass of data? Perhaps you yourself are the best judge of which are your best papers. In some competitive schemes you are required to submit or defend only your best three/four/five papers. This is a good way of making comparisons fairer between candidates who may have had different career paths and shown different productivity. More is not always better.

How do we improve quality in science? That’s an almost impossible question, especially if we can’t measure quality, and if quality may become apparent only years later. Even if there were an answer, it would have to be different for different people and different subjects. Science is an ongoing process of reinvention. Some have suggested that it is necessary to create the right social environment to incubate new ideas and approaches: the right mix of people talking to each other. When new tender plants are to be grown, a whole greenhouse has to be there for them to thrive in. Patience is required when there is an unrelenting focus on methodological excellence.

Who would benefit? Three types of scientists. First, scientists who are tending the new shoots and have to exercise patience. These are people with new ideas in new subjects. Such ideas often fall between disciplines but might eventually crystallise into a discipline of their own; in the meantime, getting grants and getting papers published in traditional journals is difficult and takes longer. Second, scientists who have to take time out for good reasons, often temporarily. If they are put under pressure, they are tempted to write up preliminary studies and to salami-slice bigger ones. Third, fast science is a barrier for scientists who have succeeded against the odds while living with neurodevelopmental disorders, such as dyslexia, autism spectrum disorder, or ADHD. It is well known that they need extra time for the reviewing and writing-up part of research. That extra time can reveal a hidden brilliance and originality that might otherwise be lost.

We should also consider when Slow Science is not beneficial. If there is a race on and priority is all, then speed is essential. Sometimes you cannot wait for the usual safety procedures of double-checking and replication. This may be the case if you have to find a cure for a new, highly contagious illness. In that case, be prepared for many sleepless nights. Sometimes a short, effortful spurt can produce results and the pain is worth it. But it is not possible to maintain such a pace. Extra effort means extra hours, and hence exhaustion and eventually poorer productivity.

An excuse for being lazy? Idling, procrastinating, and plain old worrying can sometimes bring forth bright flashes of brilliance. Just going over data again and again can produce the satisfying artisanal feelings one might expect to find in a potter or furniture maker. Of course, the thoughts inspired by quiet downtime will be lost if they are not put into effect. Since slow science is all about quality, it can never be achieved by idling, taking shortcuts, or over-promising. Slow science is not a way of avoiding competition, and not a refuge for the ultra-critical who can’t leave well enough alone. Papers don’t need to be perfect to be published.

What about the competitive nature of science? Competition cannot be avoided in a time of restricted funding and more people chasing fewer jobs. In competition there is a high premium on coming first. I was impressed by a clever experiment by Phillips, Hertwig, Kareev and Avrahami (2014): Rivals in the dark: How competition influences search in decisions under uncertainty [Cognition, 133(1), 104-119]. These authors used a visual search task in which it mattered to spot a target as quickly as possible and indicate the decision with a button press. The twist was that players were in a competitive situation and did not know when their competitors would make their decision. If they searched carefully, they might lose out because another player got there first; if they searched only very cursorily, they might be lucky. It turned out that for optimal performance it was adaptive to search only minimally. To me this is a metaphor for the current problem of fast science.
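
To see why, here is a toy simulation of such a race (my own illustration with made-up accuracy and timing assumptions, not the authors’ actual task): the deeper you search, the later you respond but the more accurate you are, and the first responder wins only if correct.

    # Toy winner-take-all search race (illustrative assumptions, not the
    # Phillips et al. paradigm): deeper search = slower but more accurate.
    import random

    def accuracy(depth: float) -> float:
        """Probability of a correct answer after searching a fraction `depth`."""
        return 0.5 + 0.45 * depth

    def win_rate(my_depth: float, trials: int = 100_000) -> float:
        wins = 0
        for _ in range(trials):
            rival_depth = random.random()   # rival's stopping time is unknown
            if my_depth < rival_depth:      # I answer first
                wins += random.random() < accuracy(my_depth)
            else:                           # rival answers first; I win if rival errs
                wins += random.random() >= accuracy(rival_depth)
        return wins / trials

    for depth in (0.1, 0.5, 0.9):
        print(f"search depth {depth}: win rate {win_rate(depth):.2f}")
    # Shallow search wins most often: being first with a decent guess beats
    # being right but late, which is the toy version of adaptive minimal search.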

A solution to publish or perish? There may be a way out: game theory comes to our aid. The publish-or-perish culture is like the prisoner’s dilemma. You need to be slow to have more complete results, and you need to be fast to make a priority claim, all at the same time. Erren, Shaw & Morfeld (2015) draw out this scenario between two scientists who can either ‘defect’ (publish early and flimsy data) or ‘cooperate’ (publish late and more complete data), and they suggest a possible escape. Rational prisoners would defect, and this seems to be confirmed by the command publish or perish. The authors suggest that it should be possible to allow researchers to establish priority using the equivalent of the sealed envelope, a practice used by the Paris Académie des Sciences in the 18th century. Meanwhile, prestigious institutions would need to foster rules that favour the publication of high-quality rather than merely novel work. If both these conditions were met, the rules of the game would change. Perhaps there is a way to improve quality through slow science.
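
As a minimal sketch of that dilemma (the payoff numbers are my own illustrative assumptions, not values from Erren, Shaw & Morfeld), defecting is the best reply whatever the rival does:

    # Publish-or-perish as a prisoner's dilemma. Payoffs are illustrative
    # assumptions: 'defect' = publish early and flimsy, 'cooperate' = publish
    # late and complete.
    PAYOFFS = {  # (my move, rival's move) -> my payoff
        ("cooperate", "cooperate"): 3,  # two solid papers, shared credit
        ("cooperate", "defect"):    0,  # I get scooped by a flimsy paper
        ("defect",    "cooperate"): 5,  # I scoop a careful rival
        ("defect",    "defect"):    1,  # a race of flimsy papers
    }

    def best_reply(rival_move: str) -> str:
        return max(("cooperate", "defect"),
                   key=lambda my_move: PAYOFFS[(my_move, rival_move)])

    for rival in ("cooperate", "defect"):
        print(f"if the rival {rival}s, my best reply is to {best_reply(rival)}")
    # Defect dominates either way, hence the dilemma. Sealed priority claims and
    # quality-based rewards work by changing these payoffs, not by exhortation.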

 


Dorothy Bishop on her PrePrint Experiences at PeerJ

PrePrints continue to increase in popularity among academics, with a number of recent blog posts highlighting their utility (from ourselves, Stephen Curry, Liz Martin-Silverstone, Tim Gowers and Mike Taylor). Given this level of interest,...

Reflections about our first Open-Science-Committee’s meeting

Today, we had the first meeting of our department’s Open Science Committee. I am happy that the committee has 20 members, representing every research unit of the department, and all groups from PhD students to full professors.
In the meeting, I first gave a quick overview of the replication crisis in psychology. I had the impression that there was broad consensus that we indeed have a problem, and that we should think about possible consequences. Then we started an open discussion where we collected questions, reservations, and ideas.
Here are some unordered topics of our discussions. Not all of them could be resolved at that meeting (which was not the goal), but eventually these stubs could result in a FAQ:
  • It is important to acknowledge the diversity of our (sub)fields. Even if we agree on the overarching values of open science, the specific implementations might differ. The current discussion is often focused on experimental laboratory (or online) research. What about existing large-scale data sets? What about sensitive video data from infant studies? What about I&O research in companies, where agreements with the works council forbid open data? Feasible protocols and solutions have to be developed in these fields.
  • Does the new focus on power lead to boring and low-risk “MTurk research”? This is certainly not the goal, but we should be aware that this could happen as an unintended side effect. For example, all ManyLab projects focused on easy-to-implement computer studies. Given the focus of the projects this is understandable, but we should not forget “the other” research.
  • We had a longer discussion which could be framed as “strategic choices vs. intrinsic motivation (and social mandate) to increase knowledge”. From a moral point of view, the choice is clear. But we are all also individuals who have to feed our families (or, at least, ourselves), and the strategic perspective has an existential relevance to us.
    Related to this question is also the next point:
  • What about the “middle” generation, who will soon be looking for a job in academia? Can we really recommend going the open way? Without the possibility to p-hack, and with the goal of running high-powered studies (which typically have a larger n), individual productivity (aka: # of published papers) will decline. (Of course, “productivity” in terms of “increase in valid knowledge” will rise).
    This would be my current answer: I expect that the gain in reputation outweighs the potential loss in # of published papers. Furthermore, we now have several techniques that allow us to assess the likelihood of p-hacking and the evidential value of a set of studies. If we present a paper with 4 studies and ps = .03, .04, .04, and .05, chances are high that we earn not a lot of respect but rather sceptical frowns (a toy simulation after this list shows why such a pattern looks suspicious). Hence, with increasing knowledge of healthy p-curves and other indicators, the old strategy of packing together too-good-to-be-true studies might soon backfire.
    Finally, I’d advocate for an agentic position: it’s not some omnipotent external force that imposes an evil incentive structure on us. We are the incentive structure! At least at our department, we can make sure that we do not incentivize massive p-hacking, but instead reward scientists who produce transparent, replicable, and honest research.
  • The new focus on replicability and transparency criteria does not imply that other quality indicators (such as good theoretical foundations) are less important.
  • Some change can be achieved by positive, voluntary incentives. For example, the Open Science Badges led to 40% of papers in the journal Psychological Science having open data. In other situations, we might need mandatory rules. Concerning this voluntary/mandatory dimension: when is which approach appropriate and more constructive?
  • An experience: registered reports can take a long time in the review process – a problem for publication-based dissertations?
  • We have to teach the field about the methods on which we base the conclusion that we have a replicability problem. In some discussions (not in our OSC ;-)) you can hear something like: “Some young greenhorns invent some fancy statistical index and tell us that everything we have done is crap. First they should show that their method is valid!” It is our responsibility to explain the methods to researchers who do not follow the current replicability discussion so closely or who are not so statistically savvy.
  • Idea: Should we give an annual Open-Science-Award at our department?
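
Here is the toy simulation promised above (my own sketch; the effect size, sample size, and the .025 split are illustrative assumptions, not a prescribed p-curve analysis). Under a real effect, significant p-values bunch up near zero, so four studies all landing between .03 and .05 would be an odd draw.

    # Toy p-curve demo (illustrative assumptions, not a full p-curve analysis):
    # simulate many two-group studies and see where the significant ps fall.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    def significant_ps(effect: float, n: int = 30, studies: int = 20_000):
        """p-values below .05 from two-sample t-tests with a given true effect."""
        a = rng.normal(0.0, 1.0, size=(studies, n))
        b = rng.normal(effect, 1.0, size=(studies, n))
        p = stats.ttest_ind(a, b, axis=1).pvalue
        return p[p < 0.05]

    for label, effect in (("true effect d = 0.5", 0.5), ("null effect", 0.0)):
        p = significant_ps(effect)
        print(f"{label}: {100 * np.mean(p < 0.025):.0f}% of significant ps are below .025")
    # Under a true effect, most significant ps sit well below .025 (a right-skewed,
    # healthy p-curve); under the null they are uniform. A set of studies whose ps
    # all hug the .05 boundary therefore earns sceptical frowns.
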
We have no ready-made answers for many of these questions. Most of them have to be tackled at multiple levels. Any comments and other perspectives on these open questions are appreciated!

Talks and Workshops

One goal of the committee is to train our researchers in new tools and topics. I am happy to announce that we will host at least 4 talks/workshops in our department in the remainder of 2015:
  • Sep 30, 2015: Jonathon Love (Amsterdam): JASP – A Fresh Way to Do Statistics (14-16, room 3322)
  • Nov 5, 2015: Daniel Lakens (TU Eindhoven): Practical Recommendations to Increase the Informational Value of Studies
  • End of November 2015 (TBA): Etienne LeBel: Introducing Curate Science (curatescience.org)
  • Dec 2015 (TBA): Felix Schönbrodt: How to detect p-hacking and publication bias: A practical introduction to p-curve, R-index, etc.
The plan for the next meeting is to discuss our voluntary commitment to research transparency.
