batch inference supported?

#7
by chenkq - opened

Thanks for your amazing work! It's awesome!

I am wondering if batch inference is currently supported. I noticed there is a function called `model.generate_from_batch`, but the example code in the README only demonstrates inference on a single sample.

Specifically, I am not sure how to concatenate the `images`, `image_input_idx`, and `image_masks` fields of the `inputs` passed to the model when the batch contains multiple images of different sizes.

Batch inference is supported. You would need to run the processor on each input separately, then concatenate the fields together, padding them with -1 so they all have the same shape. We will look at adding automatic functionality for that.
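To illustrate, here is a minimal sketch of the padding step described above. The helper `pad_and_stack` is a hypothetical name (not part of the model's API): it takes the per-sample tensors for one field (e.g. the `images` tensors from several processed inputs), pads each with -1 up to the largest size along every dimension, and stacks them into a single batch tensor.

```python
import torch

def pad_and_stack(tensors, pad_value=-1):
    """Pad a list of per-sample tensors with pad_value so they share
    one shape, then stack them along a new batch dimension.

    Assumes all tensors have the same number of dimensions.
    """
    # Largest extent along each dimension across all samples.
    max_shape = [max(t.shape[d] for t in tensors)
                 for d in range(tensors[0].dim())]
    padded = []
    for t in tensors:
        out = torch.full(max_shape, pad_value, dtype=t.dtype)
        # Copy the original values into the top-left corner.
        out[tuple(slice(0, s) for s in t.shape)] = t
        padded.append(out)
    return torch.stack(padded)

# Example: two "images" tensors of different sizes.
a = torch.arange(6).reshape(2, 3)
b = torch.arange(2).reshape(1, 2)
batch = pad_and_stack([a, b])
# batch has shape (2, 2, 3); the missing entries of b are -1.
```

You would apply the same helper to each field (`images`, `image_input_idx`, `image_masks`) across the per-sample inputs before calling `model.generate_from_batch`.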
