
Who Will Safeguard the Data?

The most basic estimates and predictions all depend on accurate data as a starting point. People routinely use information generated by research to make decisions of all kinds. Heck, some even use it to advise other people. It’s assumed that researchers are experts in their fields and so should know what they’re talking about. But how do we know for sure that they’re using data, methodology, validation, and interpretive tools properly?

As each specialty grows, interpreting its data increasingly leaves those outside the area of expertise struggling to keep up. No matter which niche you or I are in, there’s little choice but to rely more and more on explanations from specialists, or from people who report in their stead. Building in the protections needed to make sure the information is solid, from the raw data stage through to the final interpretation, is a process whose time has come.

And here’s an example of why it might be a good idea to hurry things up:

“Decreases in Global Beer Supply Due to Extreme Drought and Heat”

published in Nature Plants on October 15, 2018

According to the abstract:

“Although not the most concerning impact of future climate change, climate-related weather extremes may threaten the availability and economic accessibility of beer.”

Let’s get some details not mentioned in the publishing journal’s introduction out of the way first. In case you’re not familiar with how beer is made, barley is often included in recipes not just for flavor but to supply amylase. This enzyme speeds the breakdown of starch into sugars that alcohol-producing yeasts can make the best use of.

I couldn’t count how many beers and similar beverages there are in the world that don’t include this grain. There are plenty of other sources of amylase, and master brewers have been looking into what to do about barley shortages for at least a decade now. They seem to have things well in hand.

Since most grains, and a fair number of other plants, thrive in the same climatic conditions as barley, a major failure of this crop would mean failures across the rest as well. A beer shortage would be the least of anyone’s worries.

And even though this particular paper might not trigger havoc in publicly traded beer, agricultural, or related industry stocks, it’s not unheard of for faulty use and interpretation of data to have an influence on what’s happening around Wall St.

Beer vs. Data

Accurately predicting severe weather impacts is key to devising effective ways to minimize the expected consequences. The “Beer” study doesn’t offer much toward the genuine need to work out matters of food security and public safety. There’s also the space it takes up in supposedly professional publications, crowding out quality information that could actually help.

Using data responsibly is undeniably essential to credible and safe research outcomes. It isn’t that the need for improvement hasn’t been recognized. A few attempts to put higher standards in place are currently in the works.

Preregistration: Sorting the Barley from the Chaff

Research in every specialty is under a considerable amount of scrutiny these days. Errors are mounting, and the need to put minimum standards in place is the same for all:

“A methodological revolution is underway in psychology, with preregistration at the forefront. Methodologists have made the case for the value of preregistration — the specification of a research design, hypotheses, and analysis plan prior to observing the outcomes of a study. And indeed, it is hardly radical to hold that predictions should be specified before looking at the data. Preregistration improves research in two ways. First, preregistration provides a clear distinction between confirmatory research that uses data to test hypotheses and exploratory research that uses data to generate hypotheses. Mistaking exploratory results for confirmatory tests leads to misplaced confidence in the replicability of reported results.”

Psychologicalscience.org

The beer study falls into the exploratory category, but the abstract doesn’t make this clear. Preregistration would eliminate this kind of confusion and more. Those who fund research could insist on it, and if journals made preregistration a condition of publishing, gains in quality would spread quickly. To see why the exploratory/confirmatory distinction matters so much, consider the sketch below.
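Here’s a minimal simulation of the problem (my own illustrative sketch, not anything drawn from the beer study). Two groups are sampled from the same distribution, so every apparent effect is pure noise. An exploratory analysis that combs through twenty outcomes and reports whichever one clears the significance bar will “find” something in most studies; a confirmatory test of the single outcome specified in advance stays at the nominal 5% false-positive rate.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_studies, n_outcomes, n_subjects, alpha = 1000, 20, 30, 0.05

exploratory_hits = 0   # studies where *some* outcome looked significant post hoc
confirmatory_hits = 0  # studies where the one pre-specified outcome did

for _ in range(n_studies):
    # Both groups come from the same normal distribution: no real effect exists.
    a = rng.normal(size=(n_subjects, n_outcomes))
    b = rng.normal(size=(n_subjects, n_outcomes))
    pvals = stats.ttest_ind(a, b).pvalue  # one p-value per outcome (column)

    if (pvals < alpha).any():  # exploratory: report whatever "works"
        exploratory_hits += 1
    if pvals[0] < alpha:       # confirmatory: outcome fixed before seeing data
        confirmatory_hits += 1

print(f"exploratory 'discovery' rate:     {exploratory_hits / n_studies:.2f}")   # ~0.64
print(f"confirmatory false-positive rate: {confirmatory_hits / n_studies:.2f}")  # ~0.05
```

Preregistration doesn’t forbid the exploratory pass; it just keeps its output from being dressed up as a confirmed prediction.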

The Future of Data

There are more than a few complaints that preregistration would slow research down, but speed isn’t all there is to it. There’s a limit to how fast science and technology can move without risking major errors. Errors destroy the value of the work done, and if they aren’t caught, they become the basis for further data interpretation, and decision-making, that can’t help but be inaccurate no matter how much care is taken. Going more slowly now is very likely to make faster progress possible in the long run.

Preregistration is just a preliminary measure. The call for consistent replication of studies gets louder by the day. But even though it’s the logical “next step” to take, its implementation is also meeting resistance. For your interest, here’s a link to a recent example from Retraction Watch.

The Center for Open Science is leading the way with preregistration. It requires that researchers put their cards on the table by publishing their plans and intentions before they start gathering data. For those looking for more assurance of quality, there’s now at least a starting point in the hunt.
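For concreteness, here’s a rough sketch of the kind of commitments a preregistration locks in before any data exist. The field names are entirely hypothetical, my own shorthand rather than the Center for Open Science’s actual template:

```python
# Illustrative only: hypothetical field names, not the actual OSF template.
preregistration = {
    "hypothesis": "Group A scores higher than Group B on the primary outcome",
    "primary_outcome": "a single measure, named before data collection",
    "sample_size": 60,  # fixed in advance, with a justification on file
    "stopping_rule": "stop at n = 60; no extensions after peeking at results",
    "analysis_plan": "two-sample t-test, two-tailed, alpha = 0.05",
    "exclusion_criteria": "incomplete responses dropped, defined up front",
}
```

Once a plan like this is on the public record, readers can check the published paper against it and see exactly where confirmation ends and exploration begins.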

And we can cross our fingers that some day soon funders and journals will do their bit to safeguard data by at least making preregistration, if not replication, a condition of publication.
