Generate Nepali Jokes with AI - Running Deepseek on Your Computer
Artificial Intelligence · Adinovi · February 12, 2025

Welcome to this comprehensive guide on generating Nepali jokes with DeepSeek through Ollama. In this tutorial, we'll set up DeepSeek-R1, a language model optimized for complex reasoning tasks, and use it to generate Nepali jokes directly on your computer. Along the way, you'll get comfortable with Ollama itself: downloading models and consuming its API the way you would in a real-world application.

What You’ll Learn

This tutorial walks you through:

  1. Setting up DeepSeek-R1 locally - Run sophisticated AI without cloud dependencies
  2. Configuring Ollama - Manage local language models with ease
  3. Building a joke generator application - While running locally, Ollama exposes an HTTP API. We'll consume that API to create culturally relevant humor about Nepali celebrities.

By the end, you’ll have a working application that generates custom jokes about Nepali film stars like Rajesh Hamal and Bhuwan KC—all running on your personal hardware.

The Technology Stack

Let’s understand the key components we’ll be using:

  • Ollama: An interface that simplifies running large language models locally. It manages model execution and provides an API for application integration.

  • DeepSeek-R1: A sophisticated language model designed for creative content generation and complex reasoning tasks. It’s perfect for our joke generator because it can understand cultural context and generate creative content.

Installation: Getting Ollama Running

First, let’s install Ollama:

  1. Visit Ollama’s download page
  2. Select and download the version for your operating system
  3. Complete the installation process
  4. Verify the installation by opening http://localhost:11434 in your browser—you should see “Ollama is running”
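If you prefer to verify from code, here is a small sketch using only the Python standard library. It probes the same endpoint and checks for the default greeting text Ollama serves at its root URL:

```python
import urllib.request
import urllib.error

def ollama_is_running(url: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server answers at the given URL."""
    try:
        with urllib.request.urlopen(url, timeout=3) as resp:
            # A healthy server replies with the plain text "Ollama is running"
            return b"Ollama is running" in resp.read()
    except (urllib.error.URLError, OSError):
        return False

print(ollama_is_running())
```

This returns False rather than raising when the server is down, which makes it safe to call from a startup check.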

Setting Up DeepSeek-R1

With Ollama installed, let’s add the DeepSeek model:

# Download the DeepSeek-R1 model
ollama pull deepseek-r1:latest

# Verify installation
ollama list

The model is ready when you see deepseek-r1:latest in the list.

Building Our Nepali Joke Generator

Now for the fun part! We’ll create a Python application that generates contextually relevant jokes about famous Nepali actors.

Create a file named nepali_jokes.py with this code:

import asyncio
import httpx
import sys
from typing import Dict

# Configure stdout for proper UTF-8 handling
sys.stdout.reconfigure(encoding="utf-8")

# Terminal formatting
BOLD = "\033[1m"
RESET = "\033[0m"
GREEN = "\033[32m"
RED = "\033[31m"

# Model configuration
MODEL_NAME = "deepseek-r1"

# Available joke categories
categories: Dict[str, str] = {
    "1": "Rajesh Hamal Jokes",
    "2": "Bhuwan KC Jokes",
}

async def get_joke(category: str) -> str:
    """Generate culturally relevant jokes about Nepali actors."""
    actor = "Rajesh Hamal" if category == "1" else "Bhuwan KC"
    
    # Actor-specific context provides cultural relevance
    context = {
        "Rajesh Hamal": {
            "style": "dramatic dialogue delivery and heroic poses",
            "famous_role": "Yug Dekhi Yug Samma",
            "trope": "invincible hero who can stop bullets with his mustache",
            "signature": "flamboyant scarves and intense eye contact"
        },
        "Bhuwan KC": {
            "style": "action-comedy mix with exaggerated expressions",
            "famous_role": "Kusume Rumal",
            "trope": "street-smart hero who fights with comic timing",
            "signature": "playing dual roles in family dramas"
        }
    }
    
    # Construct prompt with actor-specific context
    prompt = (
        f"Create a funny 2-sentence joke about {actor}, the Nepali film superstar. "
        f"Use their {context[actor]['style']} and reference {context[actor]['famous_role']}. "
        f"Play with their {context[actor]['trope']} and {context[actor]['signature']}. "
        "Keep it lighthearted and family-friendly. Avoid sensitive topics. "
        "Example structure: 'Why did [Actor]...? Because...'"
    )

    # Configure model parameters for optimal joke generation
    request_data = {
        "model": MODEL_NAME,
        "prompt": prompt,
        "stream": False,
        "options": {
            "temperature": 0.8,  # Balances creativity and coherence
            "max_tokens": 200,
            "top_p": 0.95
        }
    }
    
    # Send request to local Ollama API
    async with httpx.AsyncClient() as client:
        try:
            response = await client.post(
                "http://localhost:11434/api/generate",
                json=request_data,
                timeout=850.0  # Generous: reasoning models can take a while on modest hardware
            )
            
            if response.status_code == 200:
                data = response.json()
                joke = data.get("response", "Error: No response in data")
                # DeepSeek-R1 wraps its chain-of-thought in <think> tags;
                # keep only the text that follows the reasoning block
                if "</think>" in joke:
                    joke = joke.split("</think>", 1)[1]
                return joke.strip()
            return f"Error: Server returned status {response.status_code}"
            
        except httpx.TimeoutException:
            return f"{RED}Error: Request timed out. Please try again.{RESET}"
        except Exception as e:
            return f"{RED}Error: {str(e)}\nMake sure Ollama is running with {MODEL_NAME} model installed.{RESET}"

async def check_model_availability() -> bool:
    """Verify DeepSeek model is available before attempting to use it."""
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get("http://localhost:11434/api/tags")
            if response.status_code == 200:
                models = response.json().get("models", [])
                return any(model["name"].startswith(MODEL_NAME) for model in models)
            return False
    except Exception:
        return False

async def main() -> None:
    """Interactive joke generation interface."""
    print(f"\n{BOLD}Nepali Actor Jokes Generator (Powered by {MODEL_NAME}) {RESET}")
    
    # Check Ollama connection and model availability
    try:
        async with httpx.AsyncClient() as client:
            await client.get("http://localhost:11434/")
            print(f"{GREEN}Successfully connected to Ollama!{RESET}")
            
            if not await check_model_availability():
                print(f"{RED}Model {MODEL_NAME} not found. Please install with:{RESET}")
                print(f"\nollama pull {MODEL_NAME}")
                return
            
    except Exception as e:
        print(f"{RED}Connection failed: {str(e)}{RESET}")
        print("Ensure Ollama is running on http://localhost:11434")
        return

    # Main interaction loop
    while True:
        print("\nChoose an actor:")
        for num, category in categories.items():
            print(f"{num}. {category}")
        
        choice = input(f"\nSelect (1-2) or '{BOLD}exit{RESET}': ").strip()
        
        if choice.lower() == "exit":
            print(f"\n{GREEN}{BOLD}Thanks for laughing with us!{RESET}")
            break
            
        if choice in categories:
            print(f"\nGenerating {categories[choice]}...")
            joke = await get_joke(choice)
            print(f"\n{BOLD}😄 Joke:{RESET}\n{joke}")
        else:
            print(f"\n{RED}Invalid input. Choose 1 or 2.{RESET}")

if __name__ == "__main__":
    try:
        asyncio.run(main())
    except KeyboardInterrupt:
        print(f"\n{GREEN}Session ended.{RESET}")

Enriching AI with Cultural Context: The Secret to Authentic Nepali Jokes

What makes this application special isn’t just the technology—it’s the way we teach the AI about Nepali cinema culture. By feeding the model rich cultural context, we transform generic humor into jokes that feel authentically Nepali. Let’s see how this works:

Building a Cultural Knowledge Base

context = {
    "Rajesh Hamal": {
        "style": "dramatic dialogue delivery and heroic poses",
        "famous_role": "Yug Dekhi Yug Samma",
        "trope": "invincible hero who can stop bullets with his mustache",
        "signature": "flamboyant scarves and intense eye contact"
    },
    "Bhuwan KC": {
        "style": "action-comedy mix with exaggerated expressions",
        "famous_role": "Kusume Rumal",
        "trope": "street-smart hero who fights with comic timing",
        "signature": "playing dual roles in family dramas"
    }
}

This dictionary functions as a cultural knowledge base, capturing details that people outside Nepal might not know:

  • Acting style - The distinctive performance approach that makes each actor recognizable
  • Famous roles - Authentic film titles that locals would immediately recognize
  • Character tropes - The recurring character types that made these actors famous
  • Signature mannerisms - Visual or behavioral traits that fans associate with each star
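Extending the knowledge base is just a matter of adding another entry with the same four keys. A minimal sketch with placeholder values (not real facts about any actor):

```python
context = {}  # in the app, this is the dictionary shown above

# Hypothetical entry: fill in real details for the actor you add
context["<Actor Name>"] = {
    "style": "<distinctive performance approach>",
    "famous_role": "<a film title locals recognize>",
    "trope": "<recurring character type>",
    "signature": "<visual or behavioral trait fans associate with them>",
}

# Every entry must carry the same keys the prompt template expects
assert set(context["<Actor Name>"]) == {"style", "famous_role", "trope", "signature"}
```

Keeping every entry to the same shape means the prompt template works unchanged for any actor you add.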

Crafting Culturally-Informed AI Instructions

The real innovation happens when we transform this cultural knowledge into precise instructions for the AI:

prompt = (
    f"Create a funny 2-sentence joke about {actor}, the Nepali film superstar. "
    f"Use their {context[actor]['style']} and reference {context[actor]['famous_role']}. "
    f"Play with their {context[actor]['trope']} and {context[actor]['signature']}. "
    "Keep it lighthearted and family-friendly. Avoid sensitive topics. "
    "Example structure: 'Why did [Actor]...? Because...'"
)

Without this cultural context, the AI might generate jokes that could apply to any actor from any country. With our approach, the system creates humor that:

  1. References actual Nepali films rather than made-up titles
  2. Incorporates authentic character traits recognized by local audiences
  3. Captures the essence of each actor’s unique performance style
  4. Includes cultural touchpoints that resonate with Nepali cinema fans

The result is jokes that feel like they were written by someone familiar with Nepali cinema, not by a generic AI system.

Running the Application

To try it yourself:

# Install the required HTTP client
pip install httpx

# Run the joke generator
python nepali_jokes.py

Follow the prompts to select an actor and watch as DeepSeek-R1 generates customized jokes.

Optimizing Your Results

Want to improve your jokes? Try these adjustments:

  1. Fine-tune the temperature parameter: The temperature value (currently 0.8) controls creativity. Higher values (0.9-1.0) produce more unexpected jokes, while lower values (0.6-0.7) generate more predictable ones.

  2. Expand the context dictionary: Add more specific details about the actors for richer, more accurate jokes.

  3. Modify the prompt template: Experiment with different joke structures or humor styles by changing the prompt instructions.
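All of these knobs live in the request payload. Here is a sketch of a hypothetical helper that makes them easy to vary (note that Ollama's option for capping output length is named num_predict):

```python
def build_request(prompt: str, temperature: float = 0.8, top_p: float = 0.95) -> dict:
    """Assemble an /api/generate payload with tunable sampling options."""
    return {
        "model": "deepseek-r1",
        "prompt": prompt,
        "stream": False,
        "options": {
            "temperature": temperature,  # higher = more unexpected jokes
            "num_predict": 200,          # cap on generated tokens
            "top_p": top_p,
        },
    }

predictable = build_request("Tell a joke.", temperature=0.6)
adventurous = build_request("Tell a joke.", temperature=1.0)
```

Passing the result straight to the POST call in get_joke lets you compare runs at different temperatures without editing the function body each time.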

Why Local AI Matters

Running AI locally offers several advantages:

  • Privacy: Your data never leaves your computer
  • No subscription costs: Once downloaded, the model is yours to use
  • Customization: You can fine-tune the model for specific Nepali cultural contexts
  • Offline operation: Generate content without internet connectivity

A Note on Performance

Remember that local AI models can be resource-intensive. For optimal performance, close unnecessary applications and make sure your system meets the minimum hardware requirements for running large language models.

About Adinovi: Your AI Partner in Nepal

At Adinovi, we specialize in helping Nepali businesses implement AI solutions that respect local culture and address regional challenges. Our services include:

  • AI Strategy & Consulting: Roadmaps for AI adoption tailored to Nepal’s business environment
  • Custom AI Development: Solutions designed for Nepali industries and languages
  • AI Compliance: Ensuring adherence to Nepal’s emerging AI regulations
  • Implementation Support: End-to-end guidance for successful AI deployment

Ready to explore how AI can transform your business?

Visit Adinovi.com to learn more about our services.