
    Improving Visibility at Night with Cross Domain Image Translation for Advance Driver Assistance Systems

    View/Open
    Abstract_book_IRC_2022_T-16.pdf (86.88Kb)
    Date
    2022-09-29
    Author
    Lakmal, HKIS
    Dissanayake, MB
    Abstract
    The most difficult time for driving is at night because of the dreadful lighting conditions. It has been identified that 50% of traffic deaths happen at night, even though only one-quarter of our driving happens at night. Therefore, clear visibility is crucial for a safe drive at night. Most Advanced Driver Assistance Systems (ADAS) also fail at night due to poor lighting. Considering this matter, this study explores the possibility of translating night-time images into clear and detailed images with day-time lighting (i.e., equivalent daylight images). This can be identified as a cross-domain image translation problem between the day-time domain and the night-time domain. Even though many deep-learning-based techniques for transforming images between domains exist, most of them require pixel-to-pixel paired datasets for training. However, it is challenging to develop such a dataset in this scenario, since roads are dynamic and uncontrolled environments. As a solution, this study utilised the well-known Cycle-GAN model, which can be trained using an unsupervised training approach, to transform images between the day-time and night-time domains. The other challenging task of this study is to assess the quality of the Cycle-GAN-generated images, since there is no pixel-to-pixel paired image to compare against. Therefore, this study utilises a reference-less image quality evaluation technique called the Blind/Referenceless Image Spatial Quality Evaluator (BRISQUE). The day-time images synthesised by the trained Cycle-GAN achieved an average BRISQUE score of 28.0416, whereas the original day-time images scored 26.2156, a relative deviation of only about 7%. The dataset and source code used for this study are available at https://github.com/isurushanaka/GANresearch/tree/main/Night2Day/Experiments/Unpaired
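    The deviation between the two reported BRISQUE scores can be checked with a few lines of arithmetic; a minimal sketch using only the scores quoted in the abstract (note that lower BRISQUE indicates better perceived quality):

    ```python
    # BRISQUE scores reported in the abstract.
    synthesised = 28.0416  # average score of Cycle-GAN day-time outputs
    original = 26.2156     # average score of real day-time images

    # Relative deviation of the synthesised images from the originals.
    deviation_pct = abs(synthesised - original) / original * 100
    print(f"Relative deviation: {deviation_pct:.2f}%")  # ≈ 6.97%
    ```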
    URI
    http://ir.kdu.ac.lk/handle/345/6019
    Collections
    • Technology [13]

    Library copyright © 2017  General Sir John Kotelawala Defence University, Sri Lanka
    Contact Us | Send Feedback
     

     
