    Automotive top-view image generation using orthogonally diverging fisheye cameras

    View/Open
    PAN-MASTERSREPORT-2016.pdf (14.80 MB)
    Date
    2016-05
    Author
    Pan, Janice Shuay-ann
    ORCID: 0000-0003-1801-9896
    Abstract
    Advanced Driver Assistance Systems (ADAS) can greatly assist drivers by providing them with a quick and easy way to visualize their entire 360-degree surroundings. We introduce a new camera setup for a surround-view imaging system that may be part of an ADAS. This setup involves four wide-angle fisheye cameras with orthogonally diverging camera axes, which allows for capturing the entire 360 degrees around a vehicle in four images, captured from the lateral, front, and rear views. Simple perspective transforms can be used to convert these images into a synthesized top-view image, which displays the scene as viewed from above the vehicle. These transforms, however, are typically derived using a basic calibration procedure that is only capable of correctly mapping ground-plane points in captured images to their corresponding locations in the top-view image; consequently, all off-the-ground points appear distorted. We present a new method for calibrating a top-view image, in which objects and off-the-ground points are accurately represented. We also present a method for using specifically designed disparity search bands to segment the scene in the overlapping field-of-view (FOV) regions between adjacent cameras, each pair of which is effectively a stereo imaging system. Such wide-baseline stereo systems with orthogonally diverging camera axes make stereo matching difficult, and traditional correspondence algorithms cannot reliably generate the dense disparity maps that might be computed in a parallel stereo setup involving cameras that follow a rectilinear model. We segment the scene into the ground plane, objects of interest, and the background, and show that our new virtual camera calibration parameters can be applied to represent objects in the scene in a more realistic manner.
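    The ground-plane mapping the abstract mentions can be illustrated with a minimal sketch: a planar perspective transform (homography) fit to four image-to-top-view correspondences. This is a generic direct-linear-transform example, not the report's actual calibration procedure, and the pixel coordinates below are hypothetical.

```python
import numpy as np

def apply_homography(H, pts):
    """Map 2D points through a 3x3 homography (projective transform)."""
    pts = np.asarray(pts, dtype=float)
    homo = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coords
    mapped = homo @ H.T
    return mapped[:, :2] / mapped[:, 2:3]            # back to Cartesian

# Hypothetical correspondences: four ground-plane pixels in a fisheye-derived
# image (a trapezoid) and their desired top-view locations (a rectangle).
src = np.array([[100, 400], [540, 400], [620, 470], [20, 470]], dtype=float)
dst = np.array([[150, 100], [490, 100], [490, 300], [150, 300]], dtype=float)

# Solve for H with the direct linear transform (DLT): each pair contributes
# two linear equations in the nine entries of H; the SVD null vector is H.
A = []
for (x, y), (u, v) in zip(src, dst):
    A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
    A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
_, _, Vt = np.linalg.svd(np.array(A))
H = Vt[-1].reshape(3, 3)

# Ground-plane points map correctly; off-the-ground points would not,
# which is exactly the distortion the report's calibration addresses.
print(np.allclose(apply_homography(H, src), dst, atol=1e-3))
```

    Because a single homography is exact only for points on the calibration plane, anything above the ground (pedestrians, poles) is smeared outward in the synthesized top view, motivating the virtual-camera calibration described above.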
    Department
    Electrical and Computer Engineering
    Subject
    Fisheye
    Orthogonally diverging stereo
    Virtual view calibration
    Scene segmentation
    URI
    http://hdl.handle.net/2152/43606
    Collections
    • UT Electronic Theses and Dissertations
    University of Texas at Austin Libraries

    © The University of Texas at Austin
