Re: Magic Lantern Raw 360 videos -- twin dslrs
Aug 2, 2013
Wow. The quality looks astonishing compared to what we get from other cameras with H.264 compression!
I'm not sure Smartblend will give you a good result: it will try to find the optimal seam in every frame, so you'll end up with a stitching line that moves around over time.
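One common workaround for that wandering seam is to lock the blend: compute a feathered mask once and reuse the same weights on every frame, so the stitch line cannot move over time. A minimal per-frame sketch with numpy (the seam position and feather width here are made-up illustration values, not anything from this thread):

```python
import numpy as np

def feather_mask(width, seam_x, feather):
    """Horizontal blend weights: 1.0 left of the seam, 0.0 right of it,
    with a linear ramp `feather` pixels wide centred on seam_x.
    (Illustrative helper -- seam_x and feather are assumptions.)"""
    x = np.arange(width)
    return np.clip((seam_x + feather / 2 - x) / feather, 0.0, 1.0)

def blend_frame(left_img, right_img, weights):
    """Blend two aligned frames with the SAME weights every frame,
    so the seam stays fixed for the whole clip."""
    w = weights[np.newaxis, :, np.newaxis]  # broadcast over rows and channels
    return w * left_img + (1.0 - w) * right_img

# Toy check on two flat 4x8 RGB frames (white vs black).
left = np.full((4, 8, 3), 1.0)
right = np.full((4, 8, 3), 0.0)
mask = feather_mask(8, seam_x=4, feather=2)
out = blend_frame(left, right, mask)
```

The same idea carries over to masking in PTGui or After Effects: the point is simply that the mask is fixed per shot rather than re-optimized per frame.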
And I think you'll likely never get perfectly accurate sync with this kind of camera; timecode is needed for that, but it is only available on professional cameras.
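Short of timecode, one way to at least measure how far apart two takes are is to cross-correlate the cameras' audio tracks (the start beeps mentioned below make a convenient marker). A sketch with numpy, using a synthetic beep rather than real footage -- the helper name and test values are mine, not from the thread:

```python
import numpy as np

def audio_sync_offset(track_b, track_a, rate):
    """Estimate, in seconds, how far track_b lags track_a by
    cross-correlating the two mono waveforms.
    (Hypothetical helper -- the post only judges sync by ear.)"""
    xcorr = np.correlate(track_b, track_a, mode="full")
    lag = int(np.argmax(xcorr)) - (len(track_a) - 1)
    return lag / rate

# Synthetic check: the same 100 Hz start beep, 0.2 s later in track_b.
rate = 1000
beep = np.sin(2 * np.pi * 100 * np.arange(200) / rate)
track_a = np.zeros(3 * rate)
track_b = np.zeros(3 * rate)
track_a[1000:1200] = beep
track_b[1200:1400] = beep
print(audio_sync_offset(track_b, track_a, rate))  # -> 0.2
```

A positive result means the first track starts late; you could then trim that many frames off the other clip before stitching.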
--- In PanoToolsNG@yahoogroups.com, "panovrx" <mediavr@...> wrote:
> I have been doing some ML raw tests lately with (level) twin 5D Mk IIs with 10.5mm Nikkors -- cameras horizontal (the bottom one inverted), base to base, e.g.
> http://www.youtube.com/watch?v=LJnzTun8a74&hd=1 (try the 1080p version -- although I think the original stitched video looks at least twice as sharp as this 1080p version). Before cropping there is vertical coverage across the 360 stitched area of roughly 115 degrees minimum, which means the footage works (level) in the Oculus Rift without visible vertical limits. The Oculus Rift view is 90 degrees wide and 110 degrees high in the default (level, straight-ahead) orientation.
> This rig has a few cm of vertical and horizontal offset from where the lenses ideally should be, but the 190+ degree coverage of the 10.5mm in full-circle terms means you have quite a bit of overlap for stitching. You get a clear vertical coverage of about 115 degrees across the 360 view with twin 5DIIs and those lenses.
> Here the cameras are only about 2.5m from the ground, so there is a lot of mis-stitching for close objects in the blending zone. The black lines are a remote cable I didn't move out of the view. With a high pole I think you can get very good stitching.
> Magic Lantern raw video on the 5D Mk II can be shot at 24, 25 or 30 (or <24) fps. These were shot at 24 fps at 1600 by 1280 pixels per frame, which actually uses more of the sensor extent (the image circle here) than the standard 1920x1080 non-raw 1080p setting -- i.e. you can get more vertical video coverage with ML than you can with the standard firmware. Processing of the DNGs from the ML raw was done in ACR, which as we know is great at adjusting raw files for CA, vignetting, etc.
> Sync is sometimes (often) problematic. I use radio remotes with a single trigger. My idea is that if the audio beeps for video start sound synced, then the videos will be in sync, so I sometimes start recording again if the start beeps sound distinctly unsynced (i.e. not totally simultaneous). But it could be wishful thinking.
> Stitching was with PTGui using the default blender. I will have to experiment with masking and Smartblend, or maybe with blending in After Effects using moving masks.
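The 1600x1280-vs-1080p coverage point above can be roughed out with arithmetic, if we assume (my assumption, not stated in the post) that both modes sample the sensor with the same vertical line pitch, so row counts compare directly:

```python
# Back-of-envelope for the claim that 1600x1280 ML raw covers more of
# the image circle vertically than the stock 1920x1080 video mode.
# Assumption (mine): both modes use the same vertical line pitch on the
# sensor, so the row counts are directly comparable.
raw_rows, stock_rows = 1280, 1080
gain = raw_rows / stock_rows
print(f"~{(gain - 1) * 100:.0f}% more vertical coverage")  # ~19% more
```

That extra ~19% of vertical extent is what lets the cropped 360 stitch keep roughly 115 degrees of clear vertical coverage.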