Hello. Cool project. I finally was able to get the right torch versions installed and tried to run, but I'm getting an issue I think is related to the CLI?
Running `python example_texture_nca.py --train_steps=500 run --lr=0.0005` as per the docs gives this error:

```
GPU available: True, used: True
TPU available: False, using: 0 TPU cores
IPU available: False, using: 0 IPUs
Traceback (most recent call last):
  File "example_texture_nca.py", line 250, in <module>
    fire.Fire(FireRunner, name='runner')
  File "C:\Users\Primary User\anaconda3\envs\na3\lib\site-packages\fire\core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "C:\Users\Primary User\anaconda3\envs\na3\lib\site-packages\fire\core.py", line 466, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "C:\Users\Primary User\anaconda3\envs\na3\lib\site-packages\fire\core.py", line 681, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
TypeError: run() takes 1 positional argument but 18 were given
```
Any thoughts?

Python 3.8.16
For a quick workaround I just did this:

**Additional note:** it looks like the tensors are of type `torch.int32`. I needed to change them using `.long()`.

```python
with torch.no_grad():
    self._organism_pool[idxs.long()] = y.to('cpu' if self.hparams.pool_on_cpu else self.device)
```

```python
with torch.no_grad():
    # reset runs in the pool
    if i % self.hparams.pool_reset_element_period == 0:
        self._organism_pool[idxs[0]] = self.nca.make_start_batch(batch_size=1, size=self.hparams.nca_img_size, device=self._organism_pool.device)[0]
    # load incomplete runs from the pool
    x = self._organism_pool[idxs.long()].to(self.device)
x_img = self.nca.extract_rgb(x)
```
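To illustrate the dtype issue in isolation: here is a minimal, self-contained sketch of the same pool-indexing pattern (the `pool`, `idxs`, and `y` names are stand-ins for the project's `self._organism_pool`, index batch, and updated organism batch, not its actual API). In some PyTorch versions, tensors used as indices must be int64 ("long"), so an `int32` index tensor can raise an error; casting with `.long()` is safe either way.

```python
import torch

pool = torch.zeros(8, 4)                         # stand-in for self._organism_pool
idxs = torch.tensor([1, 3], dtype=torch.int32)   # int32 indices, as observed in the issue
y = torch.ones(2, 4)                             # stand-in for the updated batch

with torch.no_grad():
    pool[idxs.long()] = y        # cast to int64 before writing back to the pool
x = pool[idxs.long()]            # read the same rows back with the same cast
```

After the write, `x` holds the two updated rows, so the round-trip through the pool preserves the batch.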