
But is it true that if the pps is 30000, THIS MEANS that
the DAC delivers the analog x,y,r,g,b to the laser for exactly 1/30 of a millisecond before giving it the next point?

In the ideal world, yes. But as mixedgas stated (and I alluded to), 30K pps is a sustainable rate IF successive points are very close together, but not when they are far apart and the jump between two points is a long one. It all depends on how the image's points are "put together". So think of it as a generality.

well, the DAC will definitely send the x,y,r,g,b for that duration (1/30 of a millisec).

But whether the galvos actually reach the next point (and aren't all freaked from the LAST point)
depends on optimizing the line segment length and dup dots so the galvos actually make it to the dot properly.

I mean, I've not found many links about optimizing those dots properly.

It is also possible to use a DAC with a fixed clock set to 48 kHz, aka a sound card.

It's all a matter of how the vector art is converted into something that can be properly scanned.

LaserBoy has extensive optimization functions that can be applied individually to a frame or frame set. So you can see what each one does. You can also do all of the optimization techniques at once. You can set values in LaserBoy value settings that control every aspect of how LaserBoy optimizes your vector frames and save the results in any format of ILDA. You can also save a frame set as a LaserBoy Formatted Wave file that can be played through a modified sound card to produce the (analog) signals.

Optimization is more of an art than a hard science. I figured out how to do it in a way that works. It's in my open source code.

Has it historically just been done manually i guess?

Being an ol' fart from ye ol' days, we just learned through trial and error how many "wait XY points" had to be added to give the galvos time to catch up to a point to maintain the shape, or to make sure the beam OFF or ON code wasn't too late or too early. It didn't take long before it was second nature to know how many duplicate XY points to add, based on the distance between points, to keep the shape or get the beam blanking timing just right. We had to do it manually because the computers we used didn't have the computing power to calculate this on the fly.

Well, you can certainly look at the code as it is now, but I've been grinding on it pretty much every day since the last release. I've fixed so much stuff it's like a version 2.0.

The basic idea is the core memory model. It's all based on the C++ STL container class template, std::vector. I wrote all the classes and figured out how to do all the math that makes it all work. Everything it does is in the code. The only thing libSDL does for me is give me the address of the first pixel of the display window and tell me what key just got hit, in any OS.

That's why there is so much code. A fairly large part of it is the text menu system!

I'm always glad to answer questions about any of it.

If you're really interested you can PM me for a daily build of the exe and a current copy of the code.

Some background on me - my main app is pianocheetah - a weird midi sequencer.

Once a year, my neighbors have "Hogmanay". That's what they call New Year's in Scotland.
One of us grew up there. We have it Feb 23, in between Christmas and Spring, when nobody
is doin NOTHin. It involves kind of what it involves in Scotland: a lot of drinking and going to people's houses.

We carry around a huge papier-mache herring so that people really know we can party.
Even at our age.

Anyways, it's 4 weeks away so I gotta show off my laser. At least I'm almost to the point of
rifling thru ILDA files to the beat of some dang MIDI files I find.
Eventually it'll tie in to the music a LOT better. But I only got 4 weeks.

Last year the theme was LASER OLYMPICS.
All i managed to do was some rotating shrinking triangles, random dots, random lines, and
other kiiiiiinda boring stuff. Lots of distortion going onto my ceiling.
Oh and the olympic rings.
I did get the optimising pretttttty ok. But imperfect.
I set off the smoke alarms in my house and a LOT of drunk people got VERY annoyed ha ha.

So things can only be better this year. We'll see how it goes.

The reason i wanna write this stuff is cuz i can.
And so it'll tie into my midi sequencer. soooome day.

i find myself a dinosaur. but i'm on Windows 10 using C++ and the Win32 API.

i am totally frustrated with what Microsoft and its OS have turned into.
but i won't go Mac (too elitist). So that leaves me Linux and ChromeOS.

i mean, chromeos is linux. but it's linux supported by google...

and you just can NOT do a proper user interface in Win32 anymore.
people have been spoiled by CSS on the web.
so since the web has taken over almost all front ends...
eventually, once WebAssembly fulfills its promise, i'll try to cram my whole midi sequencer onto the web.

the chrome webmidi api ain't perfect, but it does actually work...

And i bet there's a sockets api that could get my etherdream hooked up...

i dunno. Windows 10 is probably gonna be eaten by ChromeOS for the average home user.
Then the web will be all we have, plus all the APIs that Google jams into Chrome.

I dunno. I like to pontificate. But so far i'm still stuck dealing with Microsoft's slow, slow death.

Well, I've been around for a while too. I learned C and C++ in 95 and 96. I was immediately attracted to Linux because those languages are native to UNIX like operating systems. They do exist in Windows, but they can't really do everything they were designed to do, mostly because of the differences in file IO.

When I first started on LaserBoy I wrote the whole thing. I even wrote all of the code that directly communicates with the video card. The only thing I didn't write was the Linux kernel and the GCC compiler!

I think MS Windows has really gone downhill. Windows 10 just plain sucks. It's HUGE, bloated, and eats resources for no real benefit. The best version of Windows ever put out there was 2000. That was almost 20 years ago! I don't know what MS will do to stay relevant. I think most people will drift away from MS and not even notice it. People will probably move toward Android.