Tag Info

If the only difference between the quads is their physical size in window coordinates ("one is bigger than the other on-screen"), and all other things (shader, textures, et cetera) are equal, then the only significant difference in the pipeline will be that the rasterizer must fill more fragments for the quad that is larger on-screen.
This means that if the ...

I haven't done any benchmarking to compare, but there's a relatively little-known feature introduced to core in OpenGL 4.3 that might be of interest to you. glVertexAttribPointer has somewhat quietly been superseded by a new suite of functions:
glVertexAttribFormat specifies your attribute index, component count, type, and relative offset (for interleaved vertex data)
...

I ran a test of integer operations on x86_64, each looped a million times, and reached the rough conclusions below:
add --- 116 microseconds
sub --- 116 microseconds
mul --- 1036 microseconds
div --- 13037 microseconds
The figures above already have the loop overhead subtracted out.

However, when using this technique with dynamically loaded, continuously changing scenes (e.g. very large tiled worlds)
You wouldn't use this for the level geometry.
Your chunks should each be mapped into their own static buffers on demand, just as you normally would.
View frustum and occlusion culling force you to continuously rewrite instancing and command ...

Yes!
This is exactly the sort of thing that can be done very well on the GPU, since the processing is independent for each pixel (which is what lets the GPU work on many pixels in parallel).
There are two ways you might address this:
Incorporate this processing directly in your shader, so you don't need to preprocess the texture at all; just process it on the fly as you apply it. To do this, you'd incorporate your ...

It looks like hardware instancing isn't supported by MonoGame yet, which is unfortunate. That leaves us with the answer for MonoGame: draw thousands of quads without hardware instancing. Still, I've included an explanation of how it would be done if support is added.
Hardware Instancing (pure XNA HiDef only for now)
It's been a while since I've done this, so ...

Yes, lightmaps are textures. Lightmaps are a record of the lighting value for a surface at a particular point, and the way we represent "surface data for a particular point" in modern 3D graphics is with textures.
Unity's lightmapping implementation uses Beast. The default lightmap atlas size limit is 1024x1024. Terrain, per the documentation, allows you to ...

The other answers here are good (maybe better), but I wanted to add my 2 cents.
One approach is for objects that have static verts to all share one VBO. Just pile the data from each object into one huge array and make it a VBO. Track offsets and sizes for each object within the VBO, and then draw each one individually like this:
...