Thanks so much for the response. I hesitated to ask you directly about
this because you have given so much time and effort already, but I'm
very happy you responded.
The description of the dome visualizations on the HubbleSOURCE site
says that the visualizations are supplied as a series of sequentially
numbered, polar-projection still frames. I imagine one would take this
series and turn it into a video. I think that to make a "warped"
fisheye video you would just need to warp each frame individually
first. I thought a batch process in Photoshop could do this easily; I
just had no idea how to do the individual warping. Real-time warping
would not be needed here. Anything you could do via custom software,
or a Photoshop plug-in, would be a great help.
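For what it's worth, here is a rough sketch of the per-frame approach in Python, just to show the shape of the problem: every output pixel is mapped backwards into the fisheye frame and sampled. The `toy_mirror_warp` function is a made-up radial distortion standing in for the real spherical-mirror geometry (which depends on the mirror and projector positions, as in Stellarium's spheric_mirror mode); the frame is assumed square, and sampling is plain nearest-neighbour.

```python
def warp_frame(src, size, warp):
    """Inverse-map each output pixel through `warp` and sample the source
    fisheye frame with nearest-neighbour lookup. `src` is a size x size
    grid of RGB tuples; `warp` maps normalized output coordinates
    (x, y) in [-1, 1] to normalized coordinates in the fisheye frame."""
    out = [[(0, 0, 0)] * size for _ in range(size)]
    for j in range(size):
        for i in range(size):
            x = 2.0 * i / (size - 1) - 1.0   # centre of frame = (0, 0)
            y = 2.0 * j / (size - 1) - 1.0
            u, v = warp(x, y)
            si = int(round((u + 1.0) * (size - 1) / 2.0))
            sj = int(round((v + 1.0) * (size - 1) / 2.0))
            if 0 <= si < size and 0 <= sj < size:
                out[j][i] = src[sj][si]      # pixels mapped outside stay black
    return out

def toy_mirror_warp(x, y):
    # Stand-in radial distortion; the real mapping depends on the
    # mirror/projector geometry and would replace this function.
    r2 = x * x + y * y
    k = 0.35                                 # made-up distortion strength
    return x * (1.0 + k * r2), y * (1.0 + k * r2)
```

Running the same function over every numbered frame, then assembling the warped frames into a video, would be exactly the batch process I had in mind for Photoshop.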
If I understand the other points correctly, video sequences could also
be converted, just not in real-time? No problem there. The feature
would be a great help.
About the real-time warping... I have not contacted anyone; I just
have no idea what to say to a programmer. Do you know anyone who might
have experience with mplayer? The real-time warping stays on my future
wish list, but being able to pre-convert individual frames and video
sequences would be just fine.
Thanks for everything you are doing for the community!
--- In firstname.lastname@example.org, Johannes Gajdosik wrote:
> Hello Greg,
> I just discovered this message again in my mail-program.
> I do not fully understand what a "frame stack" is,
> but distorting a single image from fisheye to spheric_mirror
> is fairly easy. It does not have to be realtime.
> So if you just want to have a program that converts one
> fisheye input image into a spheric mirror output image,
> then I can make this quite easily. For simplicity I would
> use the .ppm file format, which stores uncompressed RGB
> images. Converting from and to other image formats would
> be done with another program. This way entire video sequences
> could also be converted, for instance using mplayer/mencoder and lavtools.
> Making a player that converts the images during playback is harder,
> because it must be in realtime. So the OpenGL approach like in
> Stellarium 0.8.0 would be necessary. But I suppose that an
> mplayer-programmer could do this quite easily.
> Did you ever ask the mplayer people?