  #89  
Old December 11th 17, 05:10 PM posted to rec.bicycles.tech
Jeff Liebermann
external usenet poster
 
Posts: 4,018
Default New B&M 100lux headlight.

On Sun, 10 Dec 2017 23:46:48 -0800 (PST), Oculus Lights
wrote:

> @Jeff, do you want the original NASA Roverscape instead of what
> you pulled off the website? Might get more even false coloration.


Sure, I'll give it a try. Email address is in the message signature.
Using ImageJ is easy enough, and since it's written in Java, it should
run on Windoze, Mac, Linux, etc. The program came from the NIH
(National Institutes of Health) and was used primarily for analyzing
medical microscope
photos. There is a HUGE selection of plugins to do weird things with
photographs.
https://imagej.nih.gov/ij/plugins/index.html
False coloring is only one plugin in the program's bag of tricks. The
hard part will be calibrating the intensity to false-color levels, so
that the images make sense and can be compared. I have some ideas on
how to do it, but so far, have done nothing. Also, I tried to set up a
color LUT (lookup table) that would give a finer intensity resolution
in the area of interest, but the results were awful.
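For anyone who wants to play along without ImageJ, the LUT idea is
simple enough to sketch in a few lines of Python with numpy. This is
not my actual ImageJ workflow, just the general technique: build a
256-entry lookup table from a handful of color stops (the stop values
and colors below are made-up illustration choices), then index a
grayscale beam photo through it.

```python
import numpy as np

def make_lut(stops):
    """Build a 256-entry RGB lookup table by linearly interpolating
    between (intensity, (r, g, b)) color stops."""
    levels = np.array([s[0] for s in stops])
    colors = np.array([s[1] for s in stops], dtype=float)
    xs = np.arange(256)
    lut = np.stack(
        [np.interp(xs, levels, colors[:, c]) for c in range(3)], axis=1
    )
    return lut.astype(np.uint8)

def false_color(gray, lut):
    """Map an 8-bit grayscale image (H x W uint8) through the LUT,
    producing an H x W x 3 RGB image."""
    return lut[gray]

# Example LUT: spend most of the color range on the mid intensities
# (the area of interest in a beam shot), compressing the extremes.
lut = make_lut([
    (0,   (0, 0, 0)),       # black: no light
    (64,  (0, 0, 255)),     # blue: dim
    (128, (0, 255, 0)),     # green: mid
    (192, (255, 255, 0)),   # yellow: bright
    (255, (255, 0, 0)),     # red: hot spot
])
```

The calibration problem is picking those stops so the same color means
the same lux in every photo, which is exactly the part I haven't
solved yet.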

> The NASA setup is a special layout of multiple beams to create
> an even spread above a needed threshold, to fill a desired goal
> downrange and for a given width. The initial evenness of the
> Oculus beam allows them to be 'woven' together to get that
> level of evenness over a larger field. Combining multiple
> round beams leaves hot spots, and also needs much more power
> than the rover's power budget allows to create the needed
> brightness for that large a light field.


I can guess(tm) why they need a large even beam directly in front of
the rover. There are videos of how they "drive" the machine.
https://www.jpl.nasa.gov/edu/learn/video/mars-in-a-minute-how-do-rovers-drive-on-mars/
It's not like an automobile, where everything is done in real time.
The driver (or whatever he or she is called), plots the intended path
of the rover on a computah screen. The rover then follows the dotted
line. The idea is to avoid driving into any pitfalls or running over
any obstacles. Stereo cameras give them a 3D view of the ground,
which highlights these holes and rocks. That might explain why the
lighting footprint in your photo shows lots of even light directly in
front of the rover, while everything in the distance is barely
illuminated. They need to see terrain details. From the video, the
programming seems to be done when Mars is dark. If they want to
program the rover to go a longer distance, they need to see further
forward. My guess(tm) is that uniform lighting at a distance is what
they're asking for.

You might want to modify your description under the photo on your web
pile.
"To re-light where the moon has been dark for three billion
years, NASA's Rover can see 60% farther with Oculus optics."
is not what's happening. I suggest
"NASA's rover can see details and obstacles better with
Oculus optics uniform lighting".
or something like that. Lose the 60%, or someone like me might ask you
to explain: 60% farther than what unspecified lighting system?

As for power budget, you might ask NASA if they can synchronize their
cameras to the PWM (pulse width modulation) dimming of your light.
Instead of a 100% duty cycle, the light would be on full for perhaps
10% of the time, thus drawing 1/10th the power. Essentially, it would
act like a strobe flash. If the camera frame rate is synchronized
with the strobe, it sees full brightness for a short interval, and
then nothing until the next pulse. On a real-time TV it would look
awful, but on a system designed to capture single video frames, each
image would be complete and look like it was photographed at full
power.
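The arithmetic is back-of-the-envelope stuff. The 10% duty cycle is
the number from the paragraph above; the LED wattage and camera frame
rate below are made-up illustration values, not anything from NASA or
Oculus.

```python
# Strobing the headlight instead of running it continuously.
# DUTY_CYCLE is from the discussion; the other figures are assumed.

LED_POWER_W = 10.0      # assumed full-on draw of the light
DUTY_CYCLE = 0.10       # light on 10% of each period
FRAME_RATE_HZ = 5.0     # assumed camera frame rate, one strobe per frame

period_s = 1.0 / FRAME_RATE_HZ          # time between strobe pulses
pulse_s = period_s * DUTY_CYCLE         # how long the light is on
avg_power_w = LED_POWER_W * DUTY_CYCLE  # average draw over time

# For the camera to see "full power", its exposure has to fit entirely
# inside the pulse, and the shutter has to open while the light is on.
max_exposure_s = pulse_s

print(f"pulse length : {pulse_s * 1000:.0f} ms per frame")
print(f"average power: {avg_power_w:.1f} W vs {LED_POWER_W:.1f} W continuous")
print(f"max exposure : {max_exposure_s * 1000:.0f} ms for full brightness")
```

Whether the camera electronics can actually gate the exposure that
tightly is a question for NASA, not me.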

I hate to admit it, but this is my current idea of fun. Good luck.

--
Jeff Liebermann
150 Felker St #D
http://www.LearnByDestroying.com
Santa Cruz CA 95060 http://802.11junk.com
Skype: JeffLiebermann AE6KS 831-336-2558