Welcome to this comprehensive guide on using DeepSeek to create Nepali jokes through Ollama. In this tutorial, we’ll explore how to set up and use DeepSeek-R1, a powerful language model optimized for complex reasoning tasks, to generate Nepali jokes directly on your computer. The goal is to help you become familiar with Ollama, including downloading models and using the API as you would in a real-world application.
This tutorial walks you through installing Ollama, downloading the DeepSeek-R1 model, and building a small Python application on top of Ollama's local API.
By the end, you'll have a working application that generates custom jokes about Nepali film stars like Rajesh Hamal and Bhuwan KC, all running on your personal hardware.
Let’s understand the key components we’ll be using:
Ollama: An interface that simplifies running large language models locally. It manages model execution and provides an API for application integration.
DeepSeek-R1: A sophisticated language model designed for creative content generation and complex reasoning tasks. It’s perfect for our joke generator because it can understand cultural context and generate creative content.
First, let’s install Ollama:
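On Linux, Ollama's documented one-line install script is the quickest route; macOS and Windows users can download the installer from ollama.com instead. A sketch of the Linux path:

```shell
# Linux: run Ollama's official install script
curl -fsSL https://ollama.com/install.sh | sh

# Confirm the CLI is on your PATH
ollama --version
```

The exact output of `ollama --version` depends on the release you installed; any version string means the CLI is ready.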
With Ollama installed, let’s add the DeepSeek model:
# Download the DeepSeek-R1 model
ollama pull deepseek-r1:latest
# Verify installation
ollama list
The model is ready when you see deepseek-r1:latest in the list.
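If you'd rather verify from code than from the CLI, the same information is exposed by Ollama's /api/tags endpoint, which returns JSON containing a `models` list. Here's a minimal sketch of the check, run against an illustrative payload rather than a live response (a real one comes from `GET http://localhost:11434/api/tags`):

```python
import json

MODEL_NAME = "deepseek-r1"

# Illustrative /api/tags payload, not a live response
sample_tags = json.loads("""
{
  "models": [
    {"name": "deepseek-r1:latest"},
    {"name": "llama3:8b"}
  ]
}
""")

def model_available(tags: dict, name: str) -> bool:
    """Return True if any installed model name starts with `name`."""
    return any(m["name"].startswith(name) for m in tags.get("models", []))

print(model_available(sample_tags, MODEL_NAME))  # True
```

The `startswith` check matters because Ollama reports full tags like deepseek-r1:latest while our configuration only stores the bare model name.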
Now for the fun part! We’ll create a Python application that generates contextually relevant jokes about famous Nepali actors.
Create a file named nepali_jokes.py with this code:
import asyncio
import sys
from typing import Dict

import httpx

# Configure stdout for proper UTF-8 handling
sys.stdout.reconfigure(encoding="utf-8")

# Terminal formatting
BOLD = "\033[1m"
RESET = "\033[0m"
GREEN = "\033[32m"
RED = "\033[31m"

# Model configuration
MODEL_NAME = "deepseek-r1"

# Available joke categories
categories: Dict[str, str] = {
    "1": "Rajesh Hamal Jokes",
    "2": "Bhuwan KC Jokes",
}


async def get_joke(category: str) -> str:
    """Generate culturally relevant jokes about Nepali actors."""
    actor = "Rajesh Hamal" if category == "1" else "Bhuwan KC"

    # Actor-specific context provides cultural relevance
    context = {
        "Rajesh Hamal": {
            "style": "dramatic dialogue delivery and heroic poses",
            "famous_role": "Yug Dekhi Yug Samma",
            "trope": "invincible hero who can stop bullets with his mustache",
            "signature": "flamboyant scarves and intense eye contact",
        },
        "Bhuwan KC": {
            "style": "action-comedy mix with exaggerated expressions",
            "famous_role": "Kusume Rumal",
            "trope": "street-smart hero who fights with comic timing",
            "signature": "playing dual roles in family dramas",
        },
    }

    # Construct prompt with actor-specific context
    prompt = (
        f"Create a funny 2-sentence joke about {actor}, the Nepali film superstar. "
        f"Use their {context[actor]['style']} and reference {context[actor]['famous_role']}. "
        f"Play with their {context[actor]['trope']} and {context[actor]['signature']}. "
        "Keep it lighthearted and family-friendly. Avoid sensitive topics. "
        "Example structure: 'Why did [Actor]...? Because...'"
    )

    # Configure model parameters for joke generation
    request_data = {
        "model": MODEL_NAME,
        "prompt": prompt,
        "stream": False,
        "options": {
            "temperature": 0.8,  # Balances creativity and coherence
            "num_predict": 200,  # Ollama's name for the max-token limit
            "top_p": 0.95,
        },
    }

    # Send request to the local Ollama API; generation on CPU can be
    # slow, hence the generous timeout
    async with httpx.AsyncClient() as client:
        try:
            response = await client.post(
                "http://localhost:11434/api/generate",
                json=request_data,
                timeout=850.0,
            )
            if response.status_code == 200:
                data = response.json()
                return data.get("response", "Error: No response in data").strip()
            return f"Error: Server returned status {response.status_code}"
        except httpx.TimeoutException:
            return f"{RED}Error: Request timed out. Please try again.{RESET}"
        except Exception as e:
            return (
                f"{RED}Error: {e}\n"
                f"Make sure Ollama is running with the {MODEL_NAME} model installed.{RESET}"
            )


async def check_model_availability() -> bool:
    """Verify the DeepSeek model is available before attempting to use it."""
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get("http://localhost:11434/api/tags")
            if response.status_code == 200:
                models = response.json().get("models", [])
                return any(model["name"].startswith(MODEL_NAME) for model in models)
            return False
    except Exception:
        return False


async def main() -> None:
    """Interactive joke generation interface."""
    print(f"\n{BOLD}Nepali Actor Jokes Generator (Powered by {MODEL_NAME}){RESET}")

    # Check Ollama connection and model availability
    try:
        async with httpx.AsyncClient() as client:
            await client.get("http://localhost:11434/")
        print(f"{GREEN}Successfully connected to Ollama!{RESET}")
        if not await check_model_availability():
            print(f"{RED}Model {MODEL_NAME} not found. Please install it with:{RESET}")
            print(f"\nollama pull {MODEL_NAME}")
            return
    except Exception as e:
        print(f"{RED}Connection failed: {e}{RESET}")
        print("Ensure Ollama is running on http://localhost:11434")
        return

    # Main interaction loop
    while True:
        print("\nChoose an actor:")
        for num, category in categories.items():
            print(f"{num}. {category}")

        choice = input(f"\nSelect (1-2) or '{BOLD}exit{RESET}': ").strip()
        if choice.lower() == "exit":
            print(f"\n{GREEN}{BOLD}Thanks for laughing with us!{RESET}")
            break

        if choice in categories:
            print(f"\nGenerating {categories[choice]}...")
            joke = await get_joke(choice)
            print(f"\n{BOLD}😄 Joke:{RESET}\n{joke}")
        else:
            print(f"\n{RED}Invalid input. Choose 1 or 2.{RESET}")


if __name__ == "__main__":
    try:
        asyncio.run(main())
    except KeyboardInterrupt:
        print(f"\n{GREEN}Session ended.{RESET}")
What makes this application special isn’t just the technology—it’s the way we teach the AI about Nepali cinema culture. By feeding the model rich cultural context, we transform generic humor into jokes that feel authentically Nepali. Let’s see how this works:
context = {
    "Rajesh Hamal": {
        "style": "dramatic dialogue delivery and heroic poses",
        "famous_role": "Yug Dekhi Yug Samma",
        "trope": "invincible hero who can stop bullets with his mustache",
        "signature": "flamboyant scarves and intense eye contact",
    },
    "Bhuwan KC": {
        "style": "action-comedy mix with exaggerated expressions",
        "famous_role": "Kusume Rumal",
        "trope": "street-smart hero who fights with comic timing",
        "signature": "playing dual roles in family dramas",
    },
}
This dictionary functions as a cultural knowledge base, capturing each actor's trademark style, best-known role, screen persona, and signature traits, details that even people outside Nepal might not know.
The real innovation happens when we transform this cultural knowledge into precise instructions for the AI:
prompt = (
    f"Create a funny 2-sentence joke about {actor}, the Nepali film superstar. "
    f"Use their {context[actor]['style']} and reference {context[actor]['famous_role']}. "
    f"Play with their {context[actor]['trope']} and {context[actor]['signature']}. "
    "Keep it lighthearted and family-friendly. Avoid sensitive topics. "
    "Example structure: 'Why did [Actor]...? Because...'"
)
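To see exactly what text the model receives, you can resolve the f-strings by hand. This standalone sketch reuses just the Bhuwan KC entry from the dictionary above:

```python
context = {
    "Bhuwan KC": {
        "style": "action-comedy mix with exaggerated expressions",
        "famous_role": "Kusume Rumal",
        "trope": "street-smart hero who fights with comic timing",
        "signature": "playing dual roles in family dramas",
    }
}

actor = "Bhuwan KC"
prompt = (
    f"Create a funny 2-sentence joke about {actor}, the Nepali film superstar. "
    f"Use their {context[actor]['style']} and reference {context[actor]['famous_role']}. "
    f"Play with their {context[actor]['trope']} and {context[actor]['signature']}. "
    "Keep it lighthearted and family-friendly. Avoid sensitive topics. "
    "Example structure: 'Why did [Actor]...? Because...'"
)

print(prompt)
```

Printing the resolved prompt is a useful debugging habit: if a joke misses the mark, the first thing to check is whether the instructions the model actually saw say what you think they say.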
Without this cultural context, the AI might generate jokes that could apply to any actor from any country. With our approach, the system creates humor that references real films, plays off each actor's established screen persona, and lands with audiences who actually know Nepali cinema.
The result is jokes that feel like they were written by someone familiar with Nepali cinema, not by a generic AI system.
To try it yourself:
# Install the required HTTP client
pip install httpx
# Run the joke generator
python nepali_jokes.py
Follow the prompts to select an actor and watch as DeepSeek-R1 generates customized jokes.
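One quirk worth knowing: DeepSeek-R1 is a reasoning model, and depending on the version, Ollama may return its chain of thought wrapped in `<think>...</think>` tags at the start of the `response` text. If your jokes arrive with reasoning attached, a small filter like this sketch cleans them up:

```python
import re

def strip_think(text: str) -> str:
    """Remove any <think>...</think> reasoning block from R1 output."""
    return re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL).strip()

raw = "<think>A mustache joke fits the trope...</think>Why did Rajesh Hamal smile? His mustache did the acting."
print(strip_think(raw))
```

You could apply this to the return value of get_joke before printing; the `re.DOTALL` flag matters because the reasoning block usually spans multiple lines.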
Want to improve your jokes? Try these adjustments:
Fine-tune the temperature parameter: The temperature value (currently 0.8) controls creativity. Higher values (0.9-1.0) produce more unexpected jokes, while lower values (0.6-0.7) generate more predictable ones.
Expand the context dictionary: Add more specific details about the actors for richer, more accurate jokes.
Modify the prompt template: Experiment with different joke structures or humor styles by changing the prompt instructions.
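As a concrete example of the first tweak, the options block inside get_joke is the only thing you need to touch. A small helper (hypothetical, not part of the script above; note that num_predict is Ollama's option name for the token limit) makes it easy to compare settings:

```python
def build_options(temperature: float = 0.8,
                  num_predict: int = 200,
                  top_p: float = 0.95) -> dict:
    """Bundle Ollama sampling options; lower temperature = safer jokes."""
    return {"temperature": temperature, "num_predict": num_predict, "top_p": top_p}

conservative = build_options(temperature=0.6)  # predictable punchlines
wild = build_options(temperature=1.0)          # more unexpected jokes

print(conservative["temperature"], wild["temperature"])  # 0.6 1.0
```

Swap the returned dictionary in as the `"options"` value of request_data and generate a few jokes at each setting to feel the difference.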
Running AI locally offers several advantages: your prompts never leave your machine, there are no per-request API costs, and the application keeps working offline once the model is downloaded.
Remember that local AI models can be resource-intensive. For optimal performance, close unnecessary applications and ensure your system meets the minimum hardware requirements for running large language models.
At Adinovi, we specialize in helping Nepali businesses implement AI solutions that respect local culture and address regional challenges.
Ready to explore how AI can transform your business?
Visit Adinovi.com to learn more about our services.