Install pyspark for Python3 in Anaconda JupyterLab

Hi Anaconda community,

I have used the Anaconda platform for FastAPI development in the PyCharm IDE.
I am now learning how to run PySpark tasks in Anaconda JupyterLab with Python 3.

In the Udacity online course, the Jupyter Notebook configuration only needs Python 3 selected as the kernel; the PySpark libraries can then be imported in the ipynb cells as needed.
However, the default JupyterLab kernel in Anaconda is Python3 (ipykernel), and it cannot find the module pyspark. I could not replicate the environment of the Udacity workspace.
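Before reinstalling anything, a quick check (a minimal sketch using only the standard library) can show which interpreter the Python3 (ipykernel) kernel is actually running and whether pyspark is visible to it:

```python
import importlib.util
import sys

# Path of the interpreter this kernel runs on --
# it should point inside your Anaconda environment
print(sys.executable)

# True if pyspark is importable from this interpreter, False otherwise
print(importlib.util.find_spec("pyspark") is not None)
```

If this prints False, pyspark is installed into a different environment than the one the kernel uses, which would explain the "module not found" error.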

I found this guide and followed its steps, but it did not help: Install PySpark in Anaconda & Jupyter Notebook - Spark by {Examples}
It only provides additional kernels: Python
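For reference, the usual conda-based install (a sketch, assuming you run it in the same environment that the Python3 kernel uses) looks like:

```shell
# Install pyspark (pulls in the Py4J dependency) from conda-forge
conda install -c conda-forge pyspark

# Or, with pip inside the currently active environment:
# pip install pyspark

# Restart JupyterLab afterwards so the Python3 (ipykernel) kernel
# picks up the newly installed package
```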

The conda list output for my environment is below:
anaconda custom py39_1
anaconda-client 1.9.0 py39haa95532_0
anaconda-navigator 2.2.0 py39haa95532_0
anaconda-project 0.10.2 pyhd3eb1b0_0
ipykernel 6.9.1 py39haa95532_0
ipython 8.2.0 py39haa95532_0
ipython_genutils 0.2.0 pyhd3eb1b0_1
ipywidgets 7.6.5 pyhd3eb1b0_1

Does anyone know how to install pyspark in Anaconda JupyterLab so that pyspark appears "incorporated" into the Python3 kernel, as in the Udacity workspace?
Thanks in anticipation.