### <a href="https://amirhertz.github.io/spaghetti">SPAGHETTI: Editing Implicit Shapes through Part Aware Generation</a>

![](./assets/readme_resources/teaser-01.png)


### Installation

```
git clone https://github.com/amirhertz/spaghetti && cd spaghetti
conda env create -f environment.yml
conda activate spaghetti
```

Install [PyTorch](https://pytorch.org/). Development and testing were done with pytorch==1.9.0 and cudatoolkit=11.1.
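
If you install PyTorch via conda, the command should look roughly like the sketch below, based on the archived install instructions for that version (adjust `cudatoolkit` to match your local CUDA setup):

```
conda install pytorch==1.9.0 cudatoolkit=11.1 -c pytorch -c nvidia
```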


### Demo
- Download the pre-trained models:
```
python download_weights.py
```
- Run the demo:
```
python demo.py --model_name chairs_large --shape_dir samples
```
or
```
python demo.py --model_name airplanes --shape_dir samples
```

- User controls
  - Select shapes from the collection at the bottom.
  - Right-click to select / deselect parts.
  - Click the pencil button to toggle between selection and deselection mode.
  - Transforming selected parts follows Blender-style shortcut keys:
  pressing 'G' / 'R' starts translation / rotation mode, 'X' / 'Y' / 'Z' toggles the axis, and 'Esc' cancels the transform.
  - Click the broom button to reset.
  

### Adding shapes to the demo
- From training data
```
python shape_inversion.py --model_name <model_name> --source training --num_samples <num_samples>
```
- Random generation
```
python shape_inversion.py --model_name <model_name> --source random --num_samples <num_samples>
```
- From an existing watertight mesh:
```
python shape_inversion.py --model_name <model_name> --mesh_path <mesh_path>
```
For example, to add the provided sample chair to the existing chairs in the demo:
```
python shape_inversion.py --model_name chairs_large --mesh_path ./assets/mesh/example.obj
```
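
Since inversion expects a watertight input, it can help to verify a mesh before adding it. A minimal sketch using the third-party trimesh library (assumed installed separately, e.g. `pip install trimesh`; it is not part of this repo's environment):

```
import trimesh

# Load the mesh; force="mesh" flattens a multi-object OBJ scene into a single mesh.
mesh = trimesh.load("./assets/mesh/example.obj", force="mesh")

if mesh.is_watertight:
    print("Mesh is watertight; safe to pass to shape_inversion.py.")
else:
    # Try a quick repair; fill_holes closes small gaps but cannot fix every defect.
    trimesh.repair.fill_holes(mesh)
    print("Watertight after hole filling:", mesh.is_watertight)
```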

### Training
Coming soon.