TransWikia.com

How to feed in the bounding box annotations to MobileNet (preferably v3) for training using keras?

Data Science Asked by t T s on March 17, 2021

I am trying to train an object detection model using transfer learning, with MobileNet as the base model, in AWS SageMaker. (The answer doesn't have to be related to SageMaker; I can adapt it myself if I learn how to do this in Keras/TensorFlow.)

With Keras's ImageDataGenerator, we can easily create the folder structure and feed images to a training job for a "classification" task; the flow_from_directory method works fine for that. But in my case I need to use bounding-box annotations. Is there a similar class or method that feeds image data along with annotations from a directory for my "object detection" task?
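The closest detection-side analogue to flow_from_directory that I know of is a hand-built tf.data pipeline. Below is a minimal sketch, not a definitive recipe: the (224, 224) MobileNet input size and the per-image (N, 4) box arrays in [xmin, ymin, xmax, ymax] order are my assumptions, and how the batches plug into the loss depends on the detection head you use.

```python
import numpy as np
import tensorflow as tf

def load_example(img_path, boxes):
    """Decode one image and pass its bounding boxes through unchanged."""
    img = tf.io.read_file(img_path)
    img = tf.image.decode_jpeg(img, channels=3)
    img = tf.image.resize(img, (224, 224)) / 255.0  # assumed MobileNet input size
    return img, boxes

def make_dataset(image_paths, box_arrays, batch_size=8):
    """image_paths: list of str; box_arrays: list of (N_i, 4) arrays,
    one [xmin, ymin, xmax, ymax] row per object (assumed layout)."""
    def gen():
        for path, boxes in zip(image_paths, box_arrays):
            yield path, np.asarray(boxes, dtype=np.float32)

    ds = tf.data.Dataset.from_generator(
        gen,
        output_signature=(
            tf.TensorSpec(shape=(), dtype=tf.string),
            tf.TensorSpec(shape=(None, 4), dtype=tf.float32),
        ),
    )
    ds = ds.map(load_example, num_parallel_calls=tf.data.AUTOTUNE)
    # padded_batch zero-pads boxes up to the largest box count in each batch,
    # since different images have different numbers of objects
    return ds.padded_batch(batch_size).prefetch(tf.data.AUTOTUNE)
```

The resulting dataset yields (images, boxes) batches, which can be passed straight to model.fit if the model and loss expect that layout.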

My dataset subfolders have the following structure:


    .
    │
    ├── train
    │   ├── 000001.jpg
    │   :   ...
    │   └── 07000.jpg
    │
    ├── train_annotation
    │   ├── 00001.json
    │   :   ...
    │   └── 07000.json
    │
    ├── validation
    │   ├── 01000.jpg
    │   :   ...
    │   └── 05000.jpg
    │
    └── validation_annotation
        ├── 01000.json
        :   ...
        └── 05000.json
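Given that layout, the file-pairing step at least can be done in plain Python before any TensorFlow is involved. A minimal sketch, assuming each JSON file shares its stem with a .jpg and stores a dict with a "boxes" list of [xmin, ymin, xmax, ymax] entries (a hypothetical schema; yours may differ):

```python
import json
from pathlib import Path

def make_pairs(image_dir, annotation_dir):
    """Match each .jpg with the annotation file sharing its stem."""
    pairs = []
    for img_path in sorted(Path(image_dir).glob("*.jpg")):
        ann_path = Path(annotation_dir) / f"{img_path.stem}.json"
        if ann_path.exists():
            pairs.append((str(img_path), str(ann_path)))
    return pairs

def load_boxes(ann_path):
    """Assumed schema: {"boxes": [[xmin, ymin, xmax, ymax], ...]}."""
    with open(ann_path) as f:
        return json.load(f)["boxes"]
```

The lists returned here are exactly what a generator or tf.data pipeline would consume.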

I saw a method where the annotations are put into a .csv file (basically a dataframe), and something involving TFRecords, but I am not sure how either is done.
Is there a convenient method, such as a generator, in TensorFlow or Keras?
Can someone please explain how to do this, or point me in the right direction?
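For the .csv route, the flattening step is just one row per bounding box. A sketch of converting JSON annotations into that shape, with the same assumed {"boxes": [[xmin, ymin, xmax, ymax], ...]} schema as above; the column names are my guess at a common convention, so check whatever tool will consume the file:

```python
import csv
import json
from pathlib import Path

def annotations_to_csv(annotation_dir, csv_path):
    """Flatten per-image JSON annotations into one CSV row per box.

    Assumes each JSON file is {"boxes": [[xmin, ymin, xmax, ymax], ...]}
    and shares its stem with a .jpg in the images folder (hypothetical schema).
    """
    with open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["filename", "xmin", "ymin", "xmax", "ymax"])
        for ann_path in sorted(Path(annotation_dir).glob("*.json")):
            with open(ann_path) as f:
                boxes = json.load(f)["boxes"]
            img_name = ann_path.stem + ".jpg"
            for xmin, ymin, xmax, ymax in boxes:
                writer.writerow([img_name, xmin, ymin, xmax, ymax])
```

A CSV in this shape can then be loaded with pandas, or fed to a tf.data pipeline, which is presumably what the tutorials I saw were doing.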
