As the field of Generative AI (GenAI) continues to rapidly evolve, it's essential to have the right tools and setup to explore and experiment with these cutting-edge technologies. In this tutorial, we'll walk through the three key components you'll need to get started with GenAI on your local machine.
1. Installing Python

Python is a versatile and widely used programming language that serves as the backbone for many GenAI tools and frameworks. Before diving into GenAI, you'll need to ensure that Python is installed on your system. Head over to the official Python website (https://www.python.org/downloads/) and download the latest version compatible with your operating system. Follow the installation instructions carefully, and don't forget to add Python to your system's PATH variable so you can run it from the command line.
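Once the installer finishes, you can quickly confirm that Python is reachable from the command line (on some systems the command is python3 rather than python):

# Check that Python and pip are installed and on your PATH
python --version
pip --version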
2. Installing Jupyter Notebook (or using Google Colab)

Jupyter Notebook is an open-source web application that allows you to create and share documents containing live code, visualizations, and narrative text. It's a powerful tool for exploring GenAI models, as it provides an interactive environment for writing and executing code, as well as visualizing and analyzing the results. If you'd rather not install anything locally, Google Colab offers a similar hosted notebook environment in the browser. To install Jupyter Notebook locally, open your terminal or command prompt and run the following commands:
# Create a virtual environment
python -m venv env
# Activate the virtual environment on Linux or macOS
source ./env/bin/activate
# Activate the virtual environment on Windows
.\env\Scripts\activate
# Install Jupyter Notebook
pip install notebook
Once installed, you can start the Jupyter Notebook server by running jupyter notebook from the command line.
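If you created and activated the virtual environment as shown above, a quick sanity check in the first notebook cell can confirm which interpreter the kernel is using (a minimal sketch; the paths it prints will differ on your machine):

# Run in a notebook cell: confirm the kernel's Python interpreter and version
import sys

print(sys.executable)  # path to the Python running this notebook
print(sys.version)     # Python version string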
3. Installing a Local LLM Tool (e.g., jan.ai or LM Studio)

While there are several tools for running LLMs (Large Language Models) locally, we'll highlight two popular choices: jan.ai and LM Studio. Visit the website of whichever tool you prefer and download the installer for your operating system.
You might be wondering why you need to install any of these at all when you could simply use ChatGPT or Gemini in the browser. The reason is that we'll be using the OpenAI API in upcoming parts of this course, and if you don't want to spend money on API calls, a local LLM is an easy alternative.
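As a preview of how that substitution works, here is a minimal sketch that points the OpenAI Python client at a local server instead of OpenAI's servers. It assumes your local tool is running an OpenAI-compatible server (LM Studio, for example, defaults to http://localhost:1234/v1); the base_url, api_key, and model name below are placeholders to adjust for your setup.

# Minimal sketch: use the OpenAI Python client against a local LLM server.
# Assumes an OpenAI-compatible endpoint is running locally; adjust base_url
# and model to match the tool and model you actually installed.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # placeholder: your local server's address
    api_key="not-needed",                 # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder: the model loaded in jan.ai or LM Studio
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)

Because the client interface is the same, the notebooks you build later against the OpenAI API can be redirected to a local model by changing only the base_url and model values.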