This repository was archived by the owner on Feb 24, 2025. It is now read-only.

Make inference batch size configurable #3

Open
jasonrig opened this issue Jun 25, 2019 · 0 comments
Labels: enhancement (New feature or request)

@jasonrig (Owner) commented:

The inference batch size is currently hard-coded to 1; it should be exposed as a configurable parameter.

See: https://github.com/jasonrig/address-net/blob/master/addressnet/dataset.py#L499
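A minimal sketch of what a configurable batch size could look like. The helper name `batched` and its signature are assumptions for illustration, not the project's actual API; the real change would thread a `batch_size` argument through the input pipeline in `dataset.py`:

```python
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")


def batched(items: Iterable[T], batch_size: int = 1) -> Iterator[List[T]]:
    """Yield successive batches of up to `batch_size` items.

    The default of 1 reproduces the current fixed behaviour, so existing
    callers would be unaffected. (Hypothetical helper, not from address-net.)
    """
    if batch_size < 1:
        raise ValueError("batch_size must be >= 1")
    batch: List[T] = []
    for item in items:
        batch.append(item)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # emit any trailing partial batch
        yield batch
```

For example, `list(batched(["a", "b", "c"], batch_size=2))` groups the inputs into `[["a", "b"], ["c"]]`, while the default `batch_size=1` keeps the one-item-per-batch behaviour the issue describes.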

@jasonrig jasonrig added the enhancement New feature or request label Aug 29, 2019
Projects: none yet
Development: no branches or pull requests
1 participant