@Lazar-Demin as always, thanks for the response! I literally keep this open waiting for developments as they occur ;)
I am assuming that this information will come, but you seemed to point out something I was worried about:
I have a bunch of static and/or dynamic libraries I want to link against, but without control of the makefile, I can't ensure the resulting application is built and linked properly. It's a slight downside, but at that point I could just build anything that requires static libs myself, since I'd have more capability locally than cloud compile would actually give me.
To that end, I just realized that I could extend the Docker container I set up for Node.js compiling to use the toolchain and build arbitrary programs, given a directory filled with makefiles/sources/static libs.
Nominally, I could mount a project directory via the -v flag to pull any git repo into my toolchain container for building. A build hook/script inside the container (or in the git repo) could then, as a post-build step, POST the results to Onion Cloud for loading and installation onto the Omega. That would mean also mounting, alongside the project directory, the credentials directory for the specific API key I want to POST with. . .
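The mount-and-POST workflow above might look something like the following sketch. The image name, mount paths, credential layout, and upload endpoint are all placeholders I've invented for illustration; the real Onion Cloud upload URL and auth scheme would come from their API docs.

```shell
# Hedged sketch of the idea: mount a project checkout plus an
# API-credentials directory into a (hypothetical) toolchain image,
# build, then POST the artifact as a post-build step.
# "omega-toolchain", the paths, and the URL are NOT real names from
# the Onion docs -- substitute your own.
docker run --rm \
  -v "$PWD/my-project:/opt/project" \
  -v "$HOME/.onion-credentials:/opt/credentials:ro" \
  omega-toolchain \
  sh -c 'cd /opt/project && make && \
         curl -X POST \
              -H "Authorization: Bearer $(cat /opt/credentials/api-key)" \
              -F "package=@build/app.ipk" \
              https://cloud.example.invalid/upload'
```

Keeping the credentials in a separate read-only mount means the same container image can POST to different Omegas just by swapping which key directory you mount.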
Love it! Now that onion cloud/ubus/api integration is up, I feel the community can move at a much faster pace. I feel, though, that perhaps you should invite and open up your design meetings to members of the community (as much as you can without sacrificing your competitive edge) so that community advocates can do this sort of documentation and whiteboarding for you ;) I mean, I'm happy to spitball all my good ideas and all, but I feel you all should get to your core business and allow critical functions to be taken over by your dedicated and fanatical community.
I just realized that specific software/driver profile loadouts could be housed on small custom flash drives that ship all the packages needed (which opkg could later update once installed) for a specific hardware kit ordered from the store. Really, for the above to make sense, I believe we'd need to tweak the OS to allow running packages installed outside of root, so that the 16MB root partition houses only the absolutely essential bootstrapping tech, while the external USB storage holds the actual application/project software.
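For what it's worth, opkg (as used on OpenWrt, which the Omega firmware is based on) already supports alternate install destinations, so a rough sketch of the off-root idea could look like this. The mount point and package name are just examples, and runtime paths would still need wiring up:

```shell
# Hedged sketch: register an external USB destination with opkg so
# packages can live off the 16MB root partition.
mkdir -p /mnt/usb
mount /dev/sda1 /mnt/usb   # example device node; yours may differ

# Declare a second destination (OpenWrt opkg.conf convention):
echo "dest usb /mnt/usb" >> /etc/opkg.conf

# Install to the USB destination instead of the rootfs:
opkg update
opkg -d usb install motion   # files land under /mnt/usb

# Software installed off-root generally needs PATH/LD_LIBRARY_PATH
# extended (e.g. in /etc/profile) so binaries and libs are found:
# export PATH="$PATH:/mnt/usb/usr/bin"
# export LD_LIBRARY_PATH="/mnt/usb/usr/lib"
```

The bootstrapping tech on the root partition would then only need enough to mount the drive and extend those paths at boot.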
I have only used motion on the Raspberry Pi, but its function is to take a photo or short video when motion is detected. It is a much simpler algorithm than OpenCV's. You can actually use OpenCV to detect objects and faces. I haven't tried that yet, but I now have a book on the subject and intend to get it to recognize members of my family.
I was able to get the omega to stream video pretty easily (there are other threads). It was a pain to get the streaming library compiled on the pi.
The Logitech QuickCam® Sphere AF works using mjpg_streamer and the webcam app on the latest firmware. I plugged in an old one I had and it worked out of the box, but only when connected directly to the Omega (it didn't work on a non-powered USB hub). Now on to looking into controlling pan and tilt.
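In case it helps anyone following along: a rough sketch of streaming plus pan/tilt on a UVC camera like the Sphere. The resolution, port, control names, and step values here are assumptions; the controls a given camera/driver actually exposes vary, so list them first before setting anything.

```shell
# Hedged sketch: serve MJPEG over HTTP, then nudge pan/tilt via the
# camera's UVC controls. Parameters are examples, not known-good
# values for the Sphere AF specifically.
mjpg_streamer \
  -i "input_uvc.so -d /dev/video0 -r 640x480 -f 15" \
  -o "output_http.so -p 8080 -w /www/webcam" &

# See which controls this camera actually exposes (names differ
# between models and driver versions):
uvcdynctrl --device=/dev/video0 --clist

# Example relative moves -- the exact control names and value ranges
# must match what --clist reported:
uvcdynctrl --device=/dev/video0 -s "Pan (relative)" 500
uvcdynctrl --device=/dev/video0 -s "Tilt (relative)" -- -500
```

Note the `--` before the negative tilt value so it isn't parsed as an option flag. `v4l2-ctl -l` is an alternative way to enumerate the same controls if uvcdynctrl isn't packaged for the Omega.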