GPUs

https://d2l.ai/chapter_builders-guide/use-gpu.html

https://mxnet.apache.org/versions/1.6/api/python/docs/tutorials/getting-started/crash-course/6-use_gpus.html

Hi:
May I know whether MXNet by default stores all variables in main memory rather than GPU memory, even when the CUDA build of MXNet is installed, unless I explicitly assign ctx=gpu(0)?
Is there any way to make everything be stored in GPU memory by default, like TensorFlow's default behaviour?

Thanks
//Johnny

When I store variables on the GPU I get the following message:
C:\Jenkins\workspace\mxnet\mxnet\src\storage\storage.cc:199: Using Pooled (Naive) StorageManager for GPU
Is this normal, or is it something to worry about?
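That message is informational, not an error: MXNet pools freed GPU memory so it can reuse it instead of issuing repeated cudaMalloc/cudaFree calls, and the log just reports which pooling strategy is active. If you want a different strategy, it can be selected through an environment variable; the sketch below assumes MXNet 1.x (recognised values include Naive, Round, and Unpooled), and `train.py` is a hypothetical script name:

```shell
# Choose the GPU memory pool strategy before launching MXNet.
# "Round" rounds allocation sizes up so more requests hit the pool.
export MXNET_GPU_MEM_POOL_TYPE=Round
python train.py  # hypothetical training script
```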