I'm not so experienced with the particular details of how R handles lists in memory, but my limited understanding is that it tends to be copy-happy. What would be ideal for me is that the first option doesn't create another copy of the list in memory, but just sets aside a new place in memory for the appended value. Essentially, if I have a big list, I don't want R to make another copy of it when I just want to append something to it.

If the behaviour I want is not what is given here, is there any other way I can get the desired effect?
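For concreteness, the append under discussion is presumably the usual idiom of assigning one past the end of the list (a sketch with made-up values):

```r
x <- as.list(1:3)          # an existing list
x[[length(x) + 1]] <- 99   # append one value by assigning past the end
length(x)                  # 4
x[[4]]                     # 99
```

In recent versions of R this is cheap when x has no other references, since only the list's vector of element pointers needs to grow.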

(+1) The second case isn't appending, nor an example of something I was proposing; it's an example of something I don't want R to be doing behind the scenes.
–
guy Oct 7 '12 at 22:57

Ah, I misread your question; at first it read to me as though you were asking whether x <- list(10, 20) was equivalent (in terms of memory) to x <- list(10); x[[2]] <- 20. On rereading, I see that it was more nuanced than that.
–
mnel Oct 7 '12 at 23:18

Yes, but in that linked answer x was a data.frame. In this question x is a list, and the copying behaviour of a list can be different. Note that there is no [<-.list method, but there is a [<-.data.frame. Use .Internal(inspect(x)) to check.
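Alongside .Internal(inspect(x)), base R's tracemem() is a supported way to make the same check: it prints a message whenever the marked object is duplicated (it requires an R build with memory profiling enabled, which the standard CRAN binaries have). A sketch:

```r
x <- as.list(1:5)
tracemem(x)       # mark x; any duplication now prints a "tracemem[...]" line
x[[6]] <- 6       # append; if nothing is printed, the list's backbone
                  # was not copied
untracemem(x)     # stop tracing x
```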
–
Matt Dowle Oct 8 '12 at 16:00

To figure out whether modifying a list makes a deep copy or a shallow copy, I set up a small experiment: if modifying a list makes a deep copy, then it should be slower to modify a list that contains a large object than one that contains a small object.
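A minimal version of that experiment might look like the following (the sizes, repetition count, and helper name are my own choices, not from the original):

```r
# Time repeated modification of a list holding a large vs a small element.
big   <- list(x = numeric(1e7))  # list containing a large (~80 MB) vector
small <- list(x = numeric(10))   # list containing a tiny vector

time_modify <- function(lst, reps = 1e4) {
  system.time({
    for (i in seq_len(reps)) lst$y <- i  # modify the list's backbone only
  })["elapsed"]
}

time_modify(big)
time_modify(small)
# Comparable timings suggest only the list's backbone (a shallow copy) is
# touched; a large gap would suggest the contained object is copied too.
```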

I accepted flodel's answer, but Chase's tip was good too, so I confirmed that I get the desired behaviour using his suggestion of tracemem(). Here is the first example, where we just append to the list:
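The code itself did not survive in this copy; a hedged reconstruction of what the append example likely looked like:

```r
x <- list(10, 20)
tracemem(x)     # prints x's address; any duplication will be reported
x[[3]] <- 30    # append a third element; if no tracemem line appears,
                # the append did not copy the existing list
```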