Practice English Speaking&Listening with: Dr. Ira Leifer - 'The Arctic (and Antarctic) Data Problem'


Peter and I have discussed more than once the bias inherent in current research

preferring modeling to expeditionary or in-situ science; that is, you know,

rather than send somebody there to do the measurements, you want to model what must be happening.

That's certainly a problem.

Models are by definition approximations of reality.

They're a best-effort approximation.

There also are always known errors and uncertainties in them, but the problem one runs into,

to use ... oh ... Rumsfeld's language,

there are the known unknowns and then the unknown unknowns

and a model cannot incorporate something we don't know about.

Reality will incorporate it because it's real. It's a real unknown unknown that happens and,

you know, I guess that's John Lennon, "Life is what happens when you're making plans."

So, reality is what happens when a model is making a prediction and things evolve the way they do.

Again, reality doesn't care about our model, but there's no way to incorporate

processes we don't know about, even though they exist. And this is what science is all about:

discovering new processes that we don't know about.

So, you can't model them.

The only way you can effectively push the boundaries of our knowledge is by

measuring and then taking your measurements to the modelers and saying,

"Incorporate this, please".

Yeah, to which they have to agree, and so the data

that you're presenting them has to somehow,

if not agree with their worldview, at least fit into their worldview. Um,

It's always a tough question. What you're really referring to is data can be wrong. Models can be wrong.

People who collect data can ignore models and people who run models can ignore data.

The question is, at what point is the amount of information enough

to convince a community to effect a change in

what they're doing with their way of looking at things

or the paradigm and

sometimes, you know, we do hope that things can change in a timely manner.

But there's a fascinating story of continental drift

that basically showed that

50, 60 years of data showing that the continents were drifting were ignored

until all of the scientists of the old school died of old age.

And only then did the paradigm shift and the models of continental drift suddenly were accepted.

So, humanity has, unfortunately, this characteristic of

refusing to acknowledge new paradigms and,

I mean, not that I'm looking forward to my mortality, but I think it's one of the reasons why evolution includes death.

Well, there's ... there's one thing that I talked with Peter Wadhams about, which is I have a six-minute clip of

Natalia Shakhova who's

voicing a complaint in a private interview with someone. I don't think it was ever published,

about her attempt to get a paper published in Nature Geoscience.

while they publish other papers that say, oh, there's nothing going on in the Arctic, based on no data,

you know, so anyhow, but the bias in the community, the scientific community, is the topic.

So there are a couple of issues there, but one of them is that

there is a bias towards models in that they are, I guess,

rated a bit more strongly or highly in terms of publication and reference.

So, it turns out that it is much easier to run a model than it is to go and collect data in someplace like the Arctic.

Less expensive.

That is one of the main reasons, but also the weather doesn't have a veto on whether your model produces output.

With data, the weather can just completely veto you, and you get nothing that whole year, even though you were ready and you got the money,

especially, again, in the Arctic, which has some of the most extreme weather on the planet ...

Antarctic being the other, but anyway.

So, the other thing is that models are extremely complicated and

so a lot of the potential areas

in which the model may not be accurately representing reality

can be very hard to,

even for a reviewer, to actually identify

and so this is how in various cases in the past,

models have been very well published and yet gotten it wrong.

And that's because where the

assumption that was not quite accurate lies

is not very evident or easy to find through the peer-review process.

You know, data, on the other hand,

is much more transparent

as to where the problem might be if the data is flawed.

It's usually much easier to review data, but now you run into the problem

that if someone has a model that shows something and the data disagrees,

which one is going to be believed and

so this is where this bias comes in and can happen.

And when people have turf that they're defending, all the papers they've done in the past within the old paradigm, they are

sometimes reluctant to give way to the data which says they're wrong.

Yeah, and this is just the structure of the way science works these days...

you know, the flip side and this is one of the areas of,

you know, concern with models of the Arctic is

if we use as our experience

the weather,

the US has a phenomenally dense network of weather measurements.

And yet everyone knows the weather forecasts still have problems.

But there's an enormous amount of input data

from weather stations and NOAA and so on. Now, let's jump up to the Arctic.

There's almost no data. The only data that there is, such as it is, comes from satellites, and,

I will just say right out, there's always a validation or

interpretation question with satellite data if there isn't

in-situ data to compare it with. And so in the area of our most recent study, the Barents-Kara Sea,

it's millions of square kilometers of the Arctic,

which has vast deposits of methane in the form of hydrates, natural gas, and oil,

reservoirs and so on.

There are two weather stations.

So you have ... so

you know, where is the data to validate

any model, transport, winds, and so on?

It doesn't exist. There's almost no infrastructure in the entire area.

There are no people living there. The polar bears do not measure wind for us.

And so you end up with, to phrase it politically correctly,

far greater uncertainty in the interpretation of a model of the Arctic when it's based on parts of the Arctic

where there's virtually no data with which to ensure the model reflects reality.
