Snotbubbles said:
Significance and statistical significance are not the same thing. I'm not aware of what the distinction is, maybe someone who paid attention in statistics class could opine.
When researchers say that they found a "significant" link between X and Y, they always mean "statistically significant." As you've noted, that term has a very specific, technical meaning. Basically it means that their data show a relationship between X and Y that's very unlikely to be due to random noise in the data themselves.
Consider flipping a fair coin -- it comes up heads with probability 0.5 and tails with probability 0.5, but we all know that if you flip that coin ten times, you probably aren't going to get exactly five heads and five tails (in fact, you'll land on exactly 5/5 only about a quarter of the time). If you flip that coin ten times and get seven heads, your data are telling you "the probability of this coin coming up heads is 0.7," but that doesn't really mean anything, because that's just how luck works. Our data are too noisy to draw any meaningful conclusion from that sort of result. In technical terms, the result is well within the window of outcomes you'd expect from pure chance, so it's not statistically significant.
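If you want to put an actual number on that intuition, here's a minimal sketch in Python (standard library only; the function name is my own) of the exact two-sided binomial test for the seven-heads-in-ten-flips case:

```python
from math import comb

def two_sided_binom_p(heads, flips):
    # P-value of an exact two-sided binomial test against a fair coin:
    # the probability of seeing a head count at least as far from
    # flips/2 as the one observed, assuming the coin really is fair.
    dev = abs(heads - flips / 2)
    extreme = sum(comb(flips, k) for k in range(flips + 1)
                  if abs(k - flips / 2) >= dev)
    return extreme / 2 ** flips

print(two_sided_binom_p(7, 10))  # 0.34375
```

A p-value around 0.34 means a genuinely fair coin comes out this lopsided (or worse) about a third of the time, so 7/10 is nowhere near significant under the usual 0.05 cutoff.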
If I flip that same coin a billion times and it comes up heads 51% of the time and tails 49% of the time, the sheer number of tosses makes me far more confident that something's wrong with this coin. 51/49 is a lot closer to 50/50 than 7/3, but the 51/49 result almost certainly isn't due to chance. (The formula behind this is the standard error of a proportion, which shrinks like 1/sqrt(n) as the number of tosses n grows, but the exact math doesn't matter for purposes of explanation.)
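To see how lopsided the billion-toss case really is, here's a rough sketch (my own helper, using the standard normal approximation for a proportion) of the z-statistic that formula gives:

```python
from math import sqrt, erfc

def z_test(heads, flips, p0=0.5):
    # Normal-approximation z statistic and two-sided p-value for
    # testing whether the coin's true heads probability is p0.
    se = sqrt(p0 * (1 - p0) / flips)   # standard error: shrinks like 1/sqrt(n)
    z = (heads / flips - p0) / se
    return z, erfc(abs(z) / sqrt(2))   # two-sided tail probability

print(z_test(7, 10))                       # z ≈ 1.26, p ≈ 0.21: not significant
print(z_test(510_000_000, 1_000_000_000))  # z ≈ 632, p underflows to 0.0
```

Seven heads in ten flips sits barely one standard error from fair; 51% of a billion flips sits hundreds of standard errors away, which is why the tiny-looking bias is overwhelmingly significant.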
So a coin that comes up heads 51% of the time in a billion tosses is "significantly" biased in the sense that I'm confident that it's not 50/50. But it's not "significantly" biased in the normal English sense of being "largely" biased -- it's just slightly off.
I hope that helps. TL;DR: a result can be statistically significant but still really small, and a result can be really large but still statistically insignificant.