Using textures instead of points in a map, OpenGL ES 2.0

At present I have built a map of road center-lines and property boundaries, which you can zoom and pan around in. There are also 6 points of interest (drawn as green points in the image below). This could possibly hold up to 6000 points.

What I am trying to accomplish is to change the points to circular textures in a square. I would like the positions to move with the zoom, but I want the scaling to behave differently: I do not want the square textures to scale at the same rate as the rest of the drawing.

In fact, what I was thinking was to set the image to something like 5pt at 54000m and above, scaling gradually up to 25pt as the view zooms in to 10000m; anything below that remains at 25pt. (I'm just using pt as an example unit.)
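The size ramp described above is just a clamped linear interpolation. A minimal sketch, assuming the 5pt/25pt sizes and 10000m/54000m distances from the example (the class and method names here are illustrative, not from the actual renderer):

```java
// Clamped linear interpolation of point size against camera distance.
// At or beyond 54000m the size stays at 5; at or below 10000m it stays
// at 25; in between it ramps linearly. All names are placeholders.
public class PointSizer {
    static final float MIN_SIZE = 5f;   // size at/beyond the far distance
    static final float MAX_SIZE = 25f;  // size at/below the near distance
    static final float NEAR = 10000f;   // metres
    static final float FAR = 54000f;    // metres

    public static float pointSizeFor(float distanceMetres) {
        // t runs from 0 at NEAR to 1 at FAR, clamped to [0, 1].
        float t = (distanceMetres - NEAR) / (FAR - NEAR);
        t = Math.max(0f, Math.min(1f, t));
        return MAX_SIZE + t * (MIN_SIZE - MAX_SIZE);
    }
}
```

The computed size would then be uploaded to the shader as a uniform each frame.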

I already have the center of each point loaded into a FloatBuffer; that is how I drew the initial points.
I add a square in the constructor, with position data and texture coordinate data; the color is already in the texture PNG.

I have started to create the vertex and fragment shaders in getPointVertexShader() and getPointFragmentShader(), but I'm not sure how to finish them off, so I could use a little help there.

The other pieces that are confusing me are setting the position and scale of the squares, and piecing it all together in the draw call.

Below is the renderer code so far. Any help will be greatly appreciated!


Can you use GL_POINTS? The vertex shader writes the point size to gl_PointSize, and the fragment shader gets invoked for every fragment within a square of that size centred on the projected vertex coordinate. The fragment shader input variable gl_PointCoord holds the coordinates within the point (in the range (0,0) to (1,1), so (0.5,0.5) is the centre).
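As a sketch of that approach (the attribute and uniform names are assumptions, not from the asker's code):

```glsl
// Vertex shader: project the point and set its on-screen size in pixels.
attribute vec4 a_Position;
uniform mat4 u_MVPMatrix;
uniform float u_PointSize;   // computed on the CPU from the zoom level

void main() {
    gl_Position = u_MVPMatrix * a_Position;
    gl_PointSize = u_PointSize;
}

// Fragment shader: sample the circle texture using gl_PointCoord,
// which sweeps (0,0)..(1,1) across the point's square.
precision mediump float;
uniform sampler2D u_Texture;

void main() {
    gl_FragColor = texture2D(u_Texture, gl_PointCoord);
}
```

With alpha blending enabled, the transparent corners of the circle texture hide the square outline.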

The only drawback is that the implementation is allowed to impose an upper limit on the size of points, and that limit can be as low as one pixel (i.e. gl_PointSize isn't guaranteed to be supported in a meaningful sense). However, if you're targeting a specific platform, then you can check whether that platform's point size limit is sufficient for your needs.

If you can't use points, then the way I would go about it would be to give all four vertices the coordinates of the point itself, then "fix" them in the shader based upon the texture coordinate, e.g.:
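A sketch of that vertex shader (names and the viewport uniform are assumptions): every corner of a point's quad carries the point's centre as its position, and the texture coordinate tells the shader which direction to push that corner out.

```glsl
// All four vertices of a quad share the same a_Position (the point's
// centre); a_TexCoord distinguishes the corners, mapping (0,0)..(1,1)
// to an offset of -0.5..+0.5 point-sizes around the centre.
attribute vec4 a_Position;
attribute vec2 a_TexCoord;
uniform mat4 u_MVPMatrix;
uniform float u_PointSize;    // desired size in pixels
uniform vec2 u_ViewportSize;  // viewport width/height in pixels

varying vec2 v_TexCoord;

void main() {
    vec4 pos = u_MVPMatrix * a_Position;
    // Offset in normalised device coordinates, scaled by pos.w so it
    // survives the perspective divide; the quad keeps a constant pixel
    // size regardless of zoom.
    vec2 offset = (a_TexCoord - 0.5) * 2.0 * u_PointSize / u_ViewportSize;
    gl_Position = pos + vec4(offset * pos.w, 0.0, 0.0);
    v_TexCoord = a_TexCoord;
}
```

The fragment shader then just samples the texture at v_TexCoord.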

At the moment, runtime is throwing an error:

Error compiling shader: pointsVertexShader Compile failed.
ERROR: 0:10: '*' : Wrong operand types. No operation '*' exists that takes a left-hand operand of type '2-component vector of float' and a right operand of type 'const int' (and there is no acceptable conversion)
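That compile error comes from GLSL ES having no implicit int-to-float conversion: multiplying a vec2 by an integer literal is rejected. The fix is to write the literal as a float, e.g. (the variable names are illustrative):

```glsl
// Rejected by the GLSL ES compiler: no 'vec2 * int' operator exists.
// vec2 offset = (a_TexCoord - 0.5) * 2;

// Accepted: use a float literal instead.
vec2 offset = (a_TexCoord - 0.5) * 2.0;
```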

Now I believe there is just one final thing. My textures are coming out as follows (You can see the thin black triangle being rendered):

Below is a refresher from above of what it should resemble.

I believe it has something to do with my position buffer and the fact that I'm trying to draw triangles: it seems to be linking up the individual points, and it just happens that I have six. From what I understand, I want to create triangles to render my PNG texture, but only want one point coordinate to go through at a time, allowing the shader to do the rest?


For rendering with GL_TRIANGLES, you need to pass 6 vertices (two triangles) for each point. You can pass the same vertex coordinates for all 6, and have the shader move the vertex coordinates outwards according to the point size, but the shader can't turn a single vertex into multiple vertices (a geometry shader can do that, but OpenGL ES doesn't have those).
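Building that per-point data can be sketched like this (a hypothetical helper, not the asker's actual buffer code): each point contributes 6 vertices, all sharing the point's centre position, with the texture coordinates distinguishing the corners.

```java
// For each point, emit 6 vertices (two triangles). Every vertex carries
// the same centre (x, y); the texture coordinate tells the vertex
// shader which corner of the quad it is, so the shader can push it out.
public class PointQuadBuilder {
    // Corner order: (0,0) (1,0) (0,1)  then  (0,1) (1,0) (1,1)
    static final float[] CORNERS = {
        0f, 0f,  1f, 0f,  0f, 1f,
        0f, 1f,  1f, 0f,  1f, 1f,
    };

    // Layout per vertex: x, y, u, v  ->  6 vertices * 4 floats = 24.
    public static float[] buildQuad(float cx, float cy) {
        float[] data = new float[24];
        for (int i = 0; i < 6; i++) {
            data[i * 4]     = cx;                 // centre x, every vertex
            data[i * 4 + 1] = cy;                 // centre y, every vertex
            data[i * 4 + 2] = CORNERS[i * 2];     // u
            data[i * 4 + 3] = CORNERS[i * 2 + 1]; // v
        }
        return data;
    }
}
```

The resulting array would be appended to a FloatBuffer, one quad per point of interest.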

If you use glDrawElements() rather than glDrawArrays(), each point only needs 4 distinct vertices and 6 indices, rather than 6 complete vertices.
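With glDrawElements() the index pattern repeats per quad, so building the index buffer can be sketched as (hypothetical helper name; assumes 4 vertices per point in corner order (0,0), (1,0), (0,1), (1,1)):

```java
// Each point has 4 distinct vertices; 6 indices describe the two
// triangles, reusing two of the vertices.
public class PointIndexBuilder {
    public static short[] buildIndices(int numPoints) {
        short[] indices = new short[numPoints * 6];
        for (int p = 0; p < numPoints; p++) {
            int base = p * 4; // first vertex of this point's quad
            int i = p * 6;
            indices[i]     = (short) base;
            indices[i + 1] = (short) (base + 1);
            indices[i + 2] = (short) (base + 2);
            indices[i + 3] = (short) (base + 2);
            indices[i + 4] = (short) (base + 1);
            indices[i + 5] = (short) (base + 3);
        }
        return indices;
    }
}
```

With 6000 points this is 24000 vertices, which still fits within the 16-bit index range glDrawElements() uses with GL_UNSIGNED_SHORT.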

OK, it was simpler to create the six vertices, so I've gone down that track.

However, there must be something in drawTexturedPoint() that I'm either not setting or setting incorrectly. At the moment the lines are still rendering, but I cannot see any rendering for the points.