2 Answers

+1 vote

What exactly are the serial messages you are trying to send from one camera to another? Because unless that serial string is recognized by the camera as a command, it will register that string as an error.

hamac2003,
The messages we have been trying to send are a JSON string containing 4 sets of values.
You are correct that they are not commands, but the JeVois engine "should" therefore pass them to the parseSerial method. This does actually happen as desired, but we also then see an error that points to the line containing "inimg = inframe.getCvBGR()". That makes no sense to me, but at least it is consistent.

Just to clarify, do you get this error when trying to run your python module? Or does this error only manifest itself if you try to send data from another camera to the camera you are running the module on? I'd assume the former, seeing as most errors thrown by the python interpreter are syntax / usage errors. If you are willing to post your code, I can try to reproduce the error on my camera if I have time.

This error only shows up once streaming starts from the main tracking camera. If the main camera stops streaming, the error goes away and the normal video stream resumes.
Just to help understand this more, we substituted a simple text string of "Hahahahaha" in place of the JSON string and had exactly the same behavior.
The odd part is, we use parseSerial in several other scripts we use without issue.

Hmm...I'll do some looking and let you know what I find out. What are you trying to accomplish by wiring the two cameras together? If you are using two cameras on your robot (I'm also on an FRC team), you could just wire both cameras to a serial switch, and wire the output of the switch to your roboRIO's RS-232 port.

OK, in a nutshell: the primary tracking camera provides data to the roboRIO to auto-align the robot to the Hatch and/or Cargo ports. The same serial stream is also sent to the second camera, the live-stream camera, and will be used to generate visual cues on the driver station for the driver. This is somewhat redundant, but it gives the drive team extra info to use without having to search for it.

We had thought of doing something similar, only we were going to stream the x coordinate of our target to our dashboard, then draw a set of cross-hairs on the video from the USB camera displayed on the dashboard for driver vision. So basically, we would get the coordinates we want from our JeVois camera, then draw whatever visual cues we want on the driver station using whatever language your dashboard program runs on.

Adding to the other great replies, you should use a prefix so that the JeVois engine will indeed forward your string to parseSerial(). Anything that the engine can interpret, it will parse itself, and it will give you errors if it thought it could interpret it but in the end it cannot (e.g., a malformed message like "setpar param" when "setpar param value" would be needed).

So, on the sender I would send:

JSON { "name":"John" }

then in parseSerial() on the receiver you look for that "JSON " prefix and strip it out, then decode the rest of the string (your JSON data).

There is no limit on serial message size (until you max out on the 256MB of internal RAM). But make sure you do not have special chars, newlines, etc in your JSON data. It should be just one line with the whole data.
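A minimal sketch of that round trip in plain Python (the field names and values here are made up for illustration; on the JeVois itself the sender would emit the line with jevois.sendSerial, and the receiver logic would live inside the module's parseSerial method):

```python
import json

# Sender side: json.dumps emits a single line with no embedded newlines,
# so the whole payload fits on one serial line behind the "JSON " prefix.
values = {"x": 320, "y": 240, "width": 40, "height": 40}
line = "JSON " + json.dumps(values)
# On the JeVois: jevois.sendSerial(line)

# Receiver side, e.g. inside parseSerial(): check for the prefix,
# strip it, and decode the remainder.
def handle_line(msg):
    if msg.startswith("JSON "):
        return json.loads(msg[len("JSON "):])
    return None  # not ours; let the engine handle or report it
```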

Thanks for the suggestion. It did in fact work when I tried it from the console. I then adjusted the code on our sending camera to include the "JSON=....." prefix. This made it very simple to split the string on "=" and strip off the indicator.
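Assuming the split-on-"=" convention just described, the receiver side might look something like this (decode_message is a hypothetical helper name):

```python
import json

def decode_message(line):
    # Split on the first '=' only, so any '=' characters inside the
    # JSON payload itself are left intact, then parse the remainder.
    tag, sep, payload = line.partition("=")
    if sep and tag == "JSON":
        return json.loads(payload)
    return None  # not one of our messages
```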
The problem did not go away though. After several different adjustments to the code and FPS, I was finally able to get it running, but at a very low frame rate. We normally target at about 30 FPS and stream at 24 FPS.
The bottom line is, using the hardware serial link to receive the string, parse it into JSON, and then adjust the display parameters just takes too long. There just isn't enough bandwidth at 115200 baud for this to work. Our code is very light, but the link just isn't sufficient to keep up.
So, we will use USB to send targeting parameters to the robot controller and have the robot controller send single character packets back to the streaming camera to create the overlays.
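The back-of-the-envelope arithmetic supports this conclusion (assuming standard 8N1 framing, i.e. 10 bits on the wire per byte):

```python
baud = 115200
bits_per_byte = 10            # 8 data bits + start bit + stop bit (8N1)
fps = 30                      # target frame rate

bytes_per_sec = baud // bits_per_byte   # 11520 bytes/s
budget = bytes_per_sec // fps           # 384 bytes per frame
```

So each frame leaves only about 384 bytes of serial budget, which a verbose JSON line plus the parse-and-draw time on the receiving side can easily exhaust.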

All that said, I will see if I can get the hardware ports to run faster than 115200. If they can, we may revisit this again.