Key Takeaways
- JupyterLab is the modern interface: JupyterLab is the successor to classic Jupyter Notebook. It adds a file browser, multiple tabs, terminal, and extension ecosystem. Use JupyterLab for new work.
- Keyboard shortcuts save hours: Shift+Enter runs a cell and advances. ESC enters command mode. A/B insert cells. DD deletes. M converts to Markdown. These cover 90% of your interactions.
- Restart kernel before sharing: Always Restart Kernel & Run All Cells before sharing a notebook. This ensures linear execution from top to bottom without hidden state from out-of-order cell execution.
- VS Code notebooks are a strong alternative: VS Code's Jupyter extension provides a full notebook experience with better Git integration and IntelliSense. Many practitioners have switched for serious development work.
Jupyter Notebooks are where data science actually happens. Exploratory data analysis, prototyping, and communicating findings all happen in notebooks before code moves to production pipelines. This guide covers what you need to be productive.
Setup and Launch
```shell
# Install and launch
pip install jupyterlab ipykernel

# Register a virtual environment as a kernel
python -m ipykernel install --user --name myproject --display-name "My Project"

# Launch JupyterLab
jupyter lab  # opens at http://localhost:8888
```
Create a separate virtual environment for each data project. Register it as a named kernel so you can switch environments from the JupyterLab kernel selector without leaving the browser.
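To keep the kernel selector tidy, you can inspect and remove registered kernels from the command line. A minimal sketch (the kernel name `myproject` matches the registration example above):

```shell
# List registered kernels and where their specs live
jupyter kernelspec list

# Remove a kernel you no longer need
jupyter kernelspec uninstall myproject
```

Removing a kernelspec only unregisters it from Jupyter; the virtual environment itself is untouched.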
Keyboard Shortcuts
Running cells:
- Shift+Enter: Run cell and move to next
- Ctrl+Enter: Run cell in place
- Alt+Enter: Run cell and insert new cell below
Command mode (press ESC first):
- A: Insert cell above | B: Insert cell below
- DD: Delete cell | Z: Undo delete
- M: Convert to Markdown | Y: Convert to code
- Shift+L: Toggle line numbers
Edit mode (press Enter):
- Tab: Autocomplete | Shift+Tab: Show docstring
- Ctrl+/: Toggle comment
Magic Commands
```python
# Time a whole cell (place at the top of the cell)
%%time

# Time a single expression
%timeit df.groupby('category').sum()

# Shell commands
!pip install new-package
!ls -la data/

# Auto-reload imported modules (critical for development)
%load_ext autoreload
%autoreload 2

# Show plots inline
%matplotlib inline
```
Data Science Workflow Structure
Structure every notebook with these sections:
- Imports and configuration (all imports in cell 1)
- Data loading and inspection (shape, dtypes, head, describe)
- Data cleaning (nulls, types, duplicates)
- Exploratory Data Analysis (distributions, correlations, outliers)
- Feature engineering (new features, encoding)
- Modeling (train/test split, model, evaluation)
- Conclusions (Markdown summary of key findings)
Use Markdown cells between code sections to explain what you are doing and why. A notebook that is nothing but code is poor documentation, even for your future self.
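The first few workflow sections above can be sketched in code. This is a toy illustration assuming pandas and NumPy are installed; the inline synthetic DataFrame stands in for a real data-loading step:

```python
import numpy as np
import pandas as pd

# --- 1. Imports and configuration ---
pd.set_option("display.max_columns", 50)
rng = np.random.default_rng(42)  # seeded for reproducibility

# --- 2. Data loading and inspection (synthetic stand-in for pd.read_csv) ---
df = pd.DataFrame({
    "category": rng.choice(["a", "b", "c"], size=100),
    "value": rng.normal(10, 2, size=100),
})
print(df.shape)
print(df.dtypes)

# --- 3. Data cleaning: duplicates and nulls ---
df = df.drop_duplicates()
assert df.isna().sum().sum() == 0  # no missing values in this toy data

# --- 4. EDA: per-group summary statistics ---
summary = df.groupby("category")["value"].agg(["mean", "std", "count"])
print(summary)
```

In a real notebook each numbered section would be its own cell (or group of cells), separated by Markdown headers.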
Frequently Asked Questions
What is the difference between Jupyter Notebook and JupyterLab?
JupyterLab is the modern successor to the classic Jupyter Notebook interface. JupyterLab adds a file browser, multiple notebook tabs, split view, a text editor, a terminal, and an extension ecosystem. New users should start with JupyterLab.
How do I share a Jupyter Notebook?
Export as HTML (File > Export > HTML) for a static document. Share the .ipynb file directly for others with Jupyter. Use nbconvert for PDF or Markdown export. Host on GitHub (notebooks render automatically). Use Google Colab for cloud-hosted collaborative notebooks.
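The nbconvert exports mentioned above can be run from the command line. A sketch, with `analysis.ipynb` as a placeholder filename:

```shell
# Static HTML for sharing with non-Jupyter users
jupyter nbconvert --to html analysis.ipynb

# Markdown export (useful for docs sites)
jupyter nbconvert --to markdown analysis.ipynb

# PDF export (requires a LaTeX installation)
jupyter nbconvert --to pdf analysis.ipynb
```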
How do I use Jupyter with Git?
Jupyter notebooks store outputs (plots, print statements) in the .ipynb JSON, creating noisy diffs. Use nbstripout as a pre-commit hook to strip outputs before committing, or use Jupytext to save notebooks as Python files with cell markers that diff cleanly.
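Setting up either approach takes a couple of commands. A sketch, assuming you are inside a Git repository (`notebook.ipynb` is a placeholder):

```shell
# Option 1: nbstripout registers a Git filter that strips outputs on commit
pip install nbstripout
nbstripout --install     # enable for this repository
nbstripout --status      # verify the filter is active

# Option 2: Jupytext pairs the notebook with a clean-diffing .py file
pip install jupytext
jupytext --set-formats ipynb,py:percent notebook.ipynb
```

With nbstripout, outputs remain in your working copy but never reach the repository; with Jupytext, you can commit only the paired `.py` file and regenerate the notebook on demand.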
What is Google Colab?
Google Colab is a free cloud-hosted Jupyter environment with free GPU/TPU access, pre-installed ML libraries, and no local setup required. It is the best option for learning ML on a machine without a dedicated GPU. Colab Pro ($10/month) provides longer runtimes and more powerful GPUs.
Notebooks are where data science ideas become reality. Get the skills.
Join professionals from Denver, NYC, Dallas, LA, and Chicago for two days of hands-on AI and tech training. $1,490. October 2026. Seats are limited.
Reserve Your Seat
Note: Information reflects early 2026.