LayoutNet in TensorFlow 2.3

A reimplementation of LayoutNet in TensorFlow 2.3.

Reference:
Zheng, Xinru, et al. "Content-aware generative modeling of graphic design layouts." ACM Transactions on Graphics (TOG) 38.4 (2019): 1-15.

Introduction

LayoutNet is a content-aware model for generating graphic design layouts. It synthesizes layout designs conditioned on the visual and textual semantics of the user's inputs.

Example Results

LayoutNet on Colab

We have put a demo and the whole training pipeline on Colab; you can click here to try it.

Prepare Dataset

Download the following files from here and place them in the repository's root folder, at the same level as main.py.

  • checkpoints (~200M): the pre-trained LayoutNet model
  • dataset (~1G): the processed dataset in TFRecords format
  • sample (5.2M): the processed sample visual (visfea), textual (texfea), and semantic (semvec) features, plus some sample generation results

The folder structure should be:

├── README.md
├── checkpoints
│   ├── checkpoint
│   ├── ckpt-300.data-00000-of-00001
│   └── ckpt-300.index
├── config.py
├── dataset
│   └── layout_1205.tfrecords
├── dataset.py
├── demo.py
├── main.py
├── model.py
├── preprocessing.py
├── requirements.txt
└── sample
    ├── imgSel_128.txt
    ├── layout
    ...
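
If you want to confirm that everything landed in the right place, a quick sanity check like the sketch below (not part of the repository) can be run from the repo root; the paths come straight from the tree above:

# Sanity check: verify the downloaded files sit where this README expects.
from pathlib import Path

expected = [
    "checkpoints/checkpoint",
    "dataset/layout_1205.tfrecords",
    "sample/imgSel_128.txt",
]

missing = [p for p in expected if not Path(p).exists()]
print("all files in place" if not missing else f"missing: {missing}")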

The paths to the dataset and checkpoints are specified in config.py:

# configuration for the supervisor
logdir = "./log"
sampledir = "./example"
max_steps = 30000
sample_every_n_steps = 100
summary_every_n_steps = 1
save_model_secs = 120
checkpoint_basename = "layout"
checkpoint_dir = "./checkpoints"
filenamequeue = "./dataset/layout_1205.tfrecords"
min_after_dequeue = 5000
num_threads = 4
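
Although filenamequeue and min_after_dequeue are TF1-style names, in TensorFlow 2 a TFRecords file is typically read with tf.data. The sketch below only checks that the configured file is readable and counts its records; the per-record feature schema is defined in dataset.py and is not reproduced here:

import tensorflow as tf
import config

# Read the TFRecords file configured above and count the serialized
# examples; parsing the features themselves is handled by dataset.py.
raw_dataset = tf.data.TFRecordDataset(config.filenamequeue)
num_records = sum(1 for _ in raw_dataset)
print(f"{config.filenamequeue} contains {num_records} records")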

Getting Started

With Docker (GPU, Linux only)

With Docker, only the NVIDIA driver is required; there is no need to configure CUDA and cuDNN yourself.

  • First, install the NVIDIA Container Toolkit following the official guide.

  • Build your image using Dockerfile.gpu.

docker build -t layoutnet -f Dockerfile.gpu .
  • Run nvidia-smi in the container to check that the GPU is visible.
docker run --rm --gpus all layoutnet nvidia-smi

With Docker (CPU, Linux / Windows / macOS)

  • Build the image using Dockerfile.
docker build -t layoutnet .
  • Run the image, mounting the current directory into the container.
docker run -it --volume `pwd`:/LayoutNet layoutnet /bin/bash

On Local Machine

If you want to use a GPU on your local machine, make sure you have installed the correct versions of CUDA and cuDNN before running the following commands. You can check the version compatibility here.

python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
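
After installation, a quick check like the one below (assuming the TensorFlow version pinned in requirements.txt) confirms whether TensorFlow can see your GPU; an empty GPU list usually points to a CUDA/cuDNN mismatch:

import tensorflow as tf

# Print the TensorFlow build info and any GPUs visible to the runtime.
print("TensorFlow", tf.__version__)
print("Built with CUDA:", tf.test.is_built_with_cuda())
print("GPUs:", tf.config.list_physical_devices("GPU"))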

Train Model

python main.py --train

Test Model

python main.py --test
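
For reference, --train and --test are ordinary command-line switches. The sketch below is a hypothetical illustration of how such an entry point is commonly wired with argparse; the actual main.py in this repository may be organized differently:

import argparse

def main():
    # Hypothetical --train / --test entry point, for illustration only.
    parser = argparse.ArgumentParser(description="LayoutNet in TensorFlow 2.3")
    parser.add_argument("--train", action="store_true", help="train the model")
    parser.add_argument("--test", action="store_true", help="generate layouts from the sample features")
    args = parser.parse_args()

    if args.train:
        print("training would start here")
    elif args.test:
        print("testing would start here")
    else:
        parser.print_help()

if __name__ == "__main__":
    main()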
