Directory not found

Installing the torchrec lib is impossible, so I can’t use the notebook on my account. I’m blocked. Is there anything I can do?

We had some issues with the underlying storage infrastructure around that time (the “Stale file handle” error is a visible symptom of it here). That has been addressed and everything should be back to normal now. Are you still experiencing this issue?

I can’t test because even the free version does not allow me to install Torch, due to the memory limit. It’s very limited; I can’t test anything, and installing a simple library like torch is impossible.

Could you tell us your UUID? (You can find it between /user/ and /lab in the URL of your notebook instance.)
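For reference, assuming the URL follows that /user/…/lab pattern, the identifier can be pulled out with a quick regex (a minimal sketch; the example URL below is hypothetical):

```python
import re
from typing import Optional

def extract_uuid(url: str) -> Optional[str]:
    """Return the segment between /user/ and /lab in a notebook URL."""
    match = re.search(r"/user/([^/]+)/lab", url)
    return match.group(1) if match else None

# Hypothetical instance URL -- substitute your own notebook's address.
url = "https://notebooks.example.com/user/123e4567-e89b-12d3-a456-426614174000/lab"
print(extract_uuid(url))
```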


Thanks! According to what I see in the system, you’re using a basic account with 5 GB of storage, is that correct? If so, it’s unfortunately true that you would not be able to install torch due to storage limitations. We’re working on deploying an additional conda AI environment with packages like torch preinstalled. I hope that will solve your issue.
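In the meantime, it can help to check how much of the quota is actually free before attempting a large install (a minimal sketch using only the standard library):

```python
import shutil

def free_gb(path: str = ".") -> float:
    """Free disk space at `path`, in GiB."""
    return shutil.disk_usage(path).free / 2**30

# On a 5 GB quota, the default CUDA build of torch (several GB of wheel
# plus pip's cache) will not fit; check what headroom remains first.
print(f"{free_gb():.1f} GiB free")
```

PyTorch also publishes much smaller CPU-only wheels (installable with `pip install torch --index-url https://download.pytorch.org/whl/cpu`, ideally together with `--no-cache-dir`), which may be worth a try under a tight quota.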

Thanks. As I won’t be able to test your product with Torch, which is a common module, I will obviously use another tool than yours; it’s a pity. Let me know when you’re ready for the market. All the best.

Fair enough – we’ll ping you when it’s deployed.

I had this issue today, 2nd of May 2024. After running a model for several hours, the “Directory not found” error popped up (screenshot attached).
Any help would be much appreciated.

Hi @kbasu, thanks for reaching out and hope that you’re enjoying the tutorial!

It looks like the connection was lost or the kernel crashed at some stage, as the CPU usage at the top is still at 0, whereas if the model had been running for hours I would have expected some usage there.

I assume you’ve tried refreshing the page? Failing that, is there a chance the model you’re running is using more than 6 GB of RAM, as each kernel is limited to that? To test, maybe reduce the data size?
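To see whether the 6 GB kernel limit is the culprit, one rough check from inside the notebook is the process’s peak memory so far (a sketch using only the standard library; `resource` is Unix-only):

```python
import resource
import sys

def peak_rss_gb() -> float:
    """Peak resident set size of this process, in GiB."""
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    # ru_maxrss is reported in kilobytes on Linux, but in bytes on macOS.
    if sys.platform == "darwin":
        peak /= 1024
    return peak / 2**20  # KiB -> GiB

print(f"Peak memory so far: {peak_rss_gb():.2f} GiB")
```

Running this right after the data-loading step would show how close the workload already is to the limit before training even starts.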

Let us know how you get on; any additional screenshots are always useful as well!

Hi @JackEvans ,
Thanks for getting back. The CPU usage when I last checked before the crash was much higher, but maybe after the crash/lost connection it had been reset to 0 by the time I took the screenshot.
I was not aware of the 6 GB RAM restriction. The overall dataset has over 18,000 samples.
Is there any mechanism to visually check the CPU usage or progress while the notebook is running?
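Even without a monitoring widget, a lightweight fallback is to log progress from inside the loop itself, so something survives in the output cell if the connection drops. A hypothetical sketch (the `train` loop and its sample count are stand-ins, not the actual model code):

```python
import time

def train(samples: range, log_every: int = 5000) -> None:
    """Stand-in training loop that reports progress periodically."""
    start = time.monotonic()
    for i, _sample in enumerate(samples, 1):
        # ... the actual model update would go here ...
        if i % log_every == 0 or i == len(samples):
            elapsed = time.monotonic() - start
            print(f"{i}/{len(samples)} samples, {elapsed:.1f}s elapsed")

train(range(18_000))
```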