Very large CSV data file of 18 GB

Please, how do I load a very large 18 GB CSV dataset into Jupyter without crashing?

This is the typical use case for what is called 'data chunking', where you read the file into memory in chunks of n lines at a time, essentially in a loop. With pandas you can use the `chunksize` argument of `read_csv` to read, say, 100,000 lines at a time.
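Not from the linked post, but a minimal sketch of the chunked-read pattern with pandas; the file path, chunk size, and the sum aggregation are placeholders for your own processing:

```python
import pandas as pd

CSV_PATH = "large_file.csv"   # placeholder path to the 18 GB file
CHUNK_SIZE = 100_000          # rows held in memory at a time

totals = None
for chunk in pd.read_csv(CSV_PATH, chunksize=CHUNK_SIZE):
    # Each chunk is an ordinary DataFrame: aggregate, filter, or write it
    # out (e.g. to a database or Parquet), then let it be garbage-collected.
    partial = chunk.select_dtypes(include="number").sum()
    totals = partial if totals is None else totals.add(partial, fill_value=0)

print(totals)
```

The key point is that only one chunk lives in memory at a time, so the peak footprint is set by the chunk size, not by the 18 GB file.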

See this post for an example:


You will most likely also need to increase the memory available to your Jupyter notebook:
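For reference (this is not the linked post), one commonly suggested approach with the classic notebook server is raising `NotebookApp.max_buffer_size` in `jupyter_notebook_config.py`. Treat the option name and the 8 GB figure as assumptions to verify against your Jupyter version's docs:

```python
# jupyter_notebook_config.py -- assumes the classic Jupyter Notebook server
c = get_config()  # get_config() is provided by Jupyter when it loads this file

# Raise the server's buffer limit (value in bytes); 8 GB is an illustrative
# choice, and real memory is still bounded by what the machine has available.
c.NotebookApp.max_buffer_size = 8 * 1024 * 1024 * 1024
```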

Thank you, I will try it out.