Variance of a dataset?

Hi there!
Sorry if this question has been asked before, but I couldn't find it, probably because I'm lacking terminology.

I have a dataset where one parameter is measured over time for two different groups. Right now I'm trying to find a way to see which group varies more. Standard deviation is obviously a good parameter for specific points in time, but I'm trying to find a way to compare the two groups over a certain time period.
Does anybody know a good way to compare them, and is there a specific parameter for something like that?

Thanks!

This can be tricky, since it depends on what you mean by "varies more". If the two data sets are stationary (no predictable/explainable changes over time), then it would make sense to ask which has the higher coefficient of variation. (I would use CV rather than standard deviation if the two data sets differ in magnitude.) On the other hand, if either or both series exhibit trend or seasonality, then it gets more complicated, and the answer may depend on whether you are interested in overall variation or randomness. For instance, a series with steep trend but little noise would change a lot from beginning to end of the time horizon but not have much randomness.

You may have already done this, but my first suggestion would be to plot the data and see what a visual inspection shows.
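If it helps, here is a minimal sketch in Python/NumPy of what a CV comparison could look like (the array names and data below are made up for illustration; if your series has a noticeable trend, you would want to detrend or work with residuals before doing this):

```python
import numpy as np

def coefficient_of_variation(x):
    """CV = sample standard deviation divided by the mean.
    It is unitless, so it stays comparable when the two groups
    differ in magnitude."""
    x = np.asarray(x, dtype=float)
    return np.std(x, ddof=1) / np.mean(x)

# Made-up example measurements for the two groups
group_a = np.array([9.8, 10.1, 10.4, 9.7, 10.0])
group_b = np.array([48.0, 52.5, 47.0, 53.5, 49.0])

print(f"CV group A: {coefficient_of_variation(group_a):.3f}")
print(f"CV group B: {coefficient_of_variation(group_b):.3f}")
```

The group with the larger CV is the one that varies more relative to its own typical level.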

Thank you, that already helps with the direction!
All parameters have an optimal value but deviate from it. For most parameters there is a slight negative trend, but nothing tremendous; like you said, it's basically about the magnitude, I would say. Using the CV definitely makes sense! Thank you.

Yes, I have; I just wanted something more solid and, if possible, to validate what I see with a value.

Have you considered using measures other than standard deviation for variability comparison, such as variance or interquartile range?
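For example, the interquartile range is easy to compute and is robust to outliers (a sketch along the same lines as above, with the same made-up data; like the standard deviation, it would need to be rescaled, e.g. divided by the median, if the groups differ in magnitude):

```python
import numpy as np

def iqr(x):
    """Interquartile range: the spread of the middle 50% of the data."""
    q75, q25 = np.percentile(x, [75, 25])
    return q75 - q25

# Same made-up example data as before
group_a = np.array([9.8, 10.1, 10.4, 9.7, 10.0])
group_b = np.array([48.0, 52.5, 47.0, 53.5, 49.0])

print(f"IQR group A: {iqr(group_a):.3f}")
print(f"IQR group B: {iqr(group_b):.3f}")
```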