Can this error be fixed with lifetime parameters, or am I forced to define intermediate variables like let y = &[&x, &x, &x]? I would like all references to live as long as x, but I can't figure out how to specify that with lifetime parameters.

This can’t be fixed by adding lifetimes. It has to be changed in the compiler so that these temporaries are not dropped. (To be honest, I’ve often felt that the compiler drops temporaries too early in certain cases, and the precise circumstances in which it does so are difficult to understand.)

Lifetimes don’t do anything, and can’t change program behavior/generated code. They only describe what the program is doing anyway.

And in this case the program is dropping a temporary value early.
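A minimal sketch of that failure mode, using an illustrative `DataSet` like the one discussed in this thread (the field and constructor names are assumptions, not the actual library API):

```rust
// Illustrative borrowing struct: a DataSet that holds references only.
struct DataSet<'a> {
    name: &'a str,
    vars: &'a [&'a f64],
}

impl<'a> DataSet<'a> {
    fn new(name: &'a str, vars: &'a [&'a f64]) -> Self {
        DataSet { name, vars }
    }
}

fn main() {
    let x = 1.0;

    // Fails to compile: the temporary array behind `&[&x, &x, &x]` is
    // dropped at the end of this statement, but `dataset` would still
    // borrow from it:
    // let dataset = DataSet::new("dataset", &[&x, &x, &x]);

    // The workaround: name the temporary so it outlives `dataset`.
    let vars = [&x, &x, &x];
    let dataset = DataSet::new("dataset", &vars);
    println!("{}: {} vars", dataset.name, dataset.vars.len());
}
```

Note that temporary lifetime extension only kicks in for borrows appearing directly in a `let` initializer, not for borrows passed as function arguments, which is why the constructor call needs the extra binding.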

It wouldn’t be a problem if you used owned values in the structs, since then your code could keep the values for as long as it needs, instead of being attached to temporary borrows from the calling environment.

Yes I understand that I can avoid this error either using owned values or temporaries, but I hoped there was a better solution. I am implementing a plotting library (which I hope to publish on crates.io one day), and I think those two solutions have the following shortcomings:

Owned values imply allocations. Since I am using these structs just to organize data before passing it to the lib, and the data passed in could in general be large, I would like to avoid copying all values to the heap.

Using temporaries means that all users of the plotting lib would have to write five lines of code for each plot instead of three… however, if you confirm that there is no better solution to this error (including implementing the structs and/or constructors differently), then I will follow this path.

I agree with @ExpHP: I also feel that the compiler often drops temporaries too early (the code above is just one case I hit recently, which I reported because I hoped there was a better solution). Maybe references created as function arguments could be assigned to the parent scope? (After all, they could be considered as created and passed there.) Or there could be a way to instruct the compiler to extend the life of a reference to the parent scope, something like a 'super lifetime parameter, or a super() method, which would allow one, for instance, to write:

let dataset = DataSet::new("dataset", &'super [&x, &x, &x]);

or:

let dataset = DataSet::new("dataset", &[&x, &x, &x].super());

If there is any agreement on this, should I open a discussion on Rust Internals?

The lifetime of temporaries (and the ensuing “too early drop”) is a known issue - e.g. see this comment and the linked document there.

AIUI, the core issue seems to be around the place where the temp would end up being dropped, and that being somewhat invisible/non-explicit in code; this can have ramifications for unsafe code, for example. Given the fix is to insert an explicit let binding, I suspect this hasn’t been considered too big of a problem - it’s a bit of an ergonomic hit in some cases, but there’s an argument to be made that being explicit with such things is desirable.

Interesting reading! So now that we have NLL I’m really looking forward to “Better Temporary Lifetimes”, as this is one of the issues I hit most often, and I think it is an actual obstacle for new programmers approaching Rust.

Not quite. Vec<&Foo> usually still requires the caller to have another Vec or array of Foo somewhere to borrow from.

Also on modern architectures references are relatively expensive, because they may reduce cache locality, indirect access is costly when CPU can’t predict/speculate it, and they don’t get autovectorized.

Don’t use a Vec of references to such tiny objects unless you have to use polymorphism. You can even make DataVar a Copy type, because this type itself is nothing more than a couple of references. Use the faster, more efficient Vec<DataVar> instead.
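A sketch of that suggestion, with illustrative field names (a `DataVar` that is just two references, so copying it is cheap):

```rust
// A DataVar that is only a name reference plus a slice reference;
// deriving Copy makes duplicating it a two-pointer copy, with no
// heap allocation and no duplication of the underlying data.
#[derive(Clone, Copy)]
struct DataVar<'a> {
    name: &'a str,
    vals: &'a [f64],
}

fn main() {
    let x = [1.0, 2.0, 3.0];
    let xvar = DataVar { name: "x", vals: &x };

    // Because DataVar is Copy, reusing it in several collections just
    // copies the references; the data in `x` is never cloned.
    let set_a: Vec<DataVar> = vec![xvar, xvar];
    let set_b: Vec<DataVar> = vec![xvar];
    println!("{} {} {}", xvar.name, set_a.len(), set_b.len());
}
```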

If you expect users to use one dataset multiple times, then Vec<&DataSet> might be OK (since cloning the Vec inside it would duplicate its heap data). But if users would typically use each dataset once, then Vec<DataSet> is fine too, and it saves a layer of indirection.

This is the data model that I envision for the plotting library I’m implementing:

Users first define several DataVars, each one referencing a series of data.

Then they define one or more DataSets, which are collections of DataVars, with the requirement that they have same length.

Finally they define a DataPlot, which contains at least one DataSet, or more DataSets for instance when not all DataVars have the same length.

To create several plots, a user has to define a DataPlot for each plot, and in doing so I expect they would typically reuse several DataVars (for instance the time series), and maybe also some DataSets.

I used references to avoid asking the user to clone DataVars when reusing them. Now that you know the use case, do you still think it is acceptable/advisable to use Vec<DataVar> and Vec<DataSet>? In that case, do you think I should make both DataVar and DataSet Copy types, or only DataVar, or neither one and ask the user to clone them when reusing?

In this case you can’t use the T generic type, because it means one type, but you want each element of the Vec to have its own type (like a hypothetical vec![T, U, V, W…], but there aren’t enough letters in the alphabet to name all possible Vec elements ;))

If the set of types that can be allowed in datavars is known and limited (e.g. either string or number, and nothing else), then you can use datavars: Vec<EnumOfAllowedTypes> + enum EnumOfAllowedTypes { String(DataVar<String>), Number(DataVar<f64>) }. See serde_json::Value for example.
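A minimal sketch of the enum approach, assuming a closed set of two element types (the type and variant names here are illustrative, following the EnumOfAllowedTypes idea above):

```rust
// A generic DataVar plus an enum that closes over the allowed
// element types, so mixed columns can live in one Vec.
struct DataVar<'a, T> {
    name: &'a str,
    vals: &'a [T],
}

enum AnyVar<'a> {
    Number(DataVar<'a, f64>),
    Text(DataVar<'a, String>),
}

impl<'a> AnyVar<'a> {
    // Operations common to all variants go through one match.
    fn len(&self) -> usize {
        match self {
            AnyVar::Number(v) => v.vals.len(),
            AnyVar::Text(v) => v.vals.len(),
        }
    }
}

fn main() {
    let t = [0.0, 1.0, 2.0];
    let labels = ["a".to_string(), "b".to_string(), "c".to_string()];
    let vars: Vec<AnyVar> = vec![
        AnyVar::Number(DataVar { name: "t", vals: &t }),
        AnyVar::Text(DataVar { name: "label", vals: &labels }),
    ];
    // All columns can be handled uniformly through the enum.
    assert!(vars.iter().all(|v| v.len() == 3));
}
```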

If the set of types is open (technically, infinitely large), you’ll need to use polymorphism. datavars: Vec<Box<dyn DataVar>>. Note that you’ll have to make one universal DataVar trait that works for all types at the same time.
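A sketch of the open-set alternative: a trait that erases the element type, so differently-typed vars can share one Vec of boxed trait objects (the trait name and its methods are assumptions for illustration, not a prescribed design):

```rust
// A type-erasing trait: each DataVar<T> only needs to know how to
// report its length and render an element, regardless of T.
trait AnyVar {
    fn len(&self) -> usize;
    fn fmt_at(&self, i: usize) -> String;
}

struct DataVar<T> {
    name: String,
    vals: Vec<T>,
}

impl<T: std::fmt::Display> AnyVar for DataVar<T> {
    fn len(&self) -> usize {
        self.vals.len()
    }
    fn fmt_at(&self, i: usize) -> String {
        self.vals[i].to_string()
    }
}

fn main() {
    // Any displayable element type can go in, without a closed enum.
    let vars: Vec<Box<dyn AnyVar>> = vec![
        Box::new(DataVar { name: "x".into(), vals: vec![1.0, 2.5] }),
        Box::new(DataVar { name: "tag".into(), vals: vec!["a", "b"] }),
    ];
    for v in &vars {
        println!("{} ({} elems)", v.fmt_at(0), v.len());
    }
}
```

The trade-off versus the enum: the set of types stays open, but each var costs a heap allocation and dynamic dispatch, and every operation you need must be expressible in the universal trait.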

I considered using serde_json::Value, but isn’t serde_json::to_value(datavar.val).unwrap() copying all slice values into the new Value variable? Moreover, I thought that calling the .as_array().unwrap() method on all Values at every row iteration was not efficient. Do you think it is a good solution nonetheless?

OK, I think I’m working my way through it… every time I have to do something with the dataset’s data, I need to pattern match through all variants and process the extracted data so as to always produce the same output type (e.g. String). I hope it will work…
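That "match every variant into one output type" pattern can be sketched like this, assuming a two-variant enum as before (names illustrative):

```rust
// A minimal DataVar and a closed enum over its allowed element types.
struct DataVar<'a, T> {
    vals: &'a [T],
}

enum AnyVar<'a> {
    Number(DataVar<'a, f64>),
    Text(DataVar<'a, &'a str>),
}

impl<'a> AnyVar<'a> {
    // Every variant is elaborated into the same output type (String),
    // so row iteration can treat all columns uniformly.
    fn cell(&self, row: usize) -> String {
        match self {
            AnyVar::Number(v) => v.vals[row].to_string(),
            AnyVar::Text(v) => v.vals[row].to_string(),
        }
    }
}

fn main() {
    let t = [1.5, 2.5];
    let s = ["a", "b"];
    let cols = [
        AnyVar::Number(DataVar { vals: &t }),
        AnyVar::Text(DataVar { vals: &s }),
    ];
    // Iterate rows, rendering each cell through the single match.
    for row in 0..2 {
        let line: Vec<String> = cols.iter().map(|c| c.cell(row)).collect();
        println!("{}", line.join("\t"));
    }
}
```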

I have however another question. With the code above I need to pass slices explicitly, for instance:

let xvar = DataVar::new("x", &x[..]);

How can I make it work when passing either slices or references to arrays? In other words, I would like both of the following to work: