
feat: Add BrochureBot - AI-powered brochure builder with OpenAI, Ollama, and DeepSeek support

- Added web content fetching and link formatting
- Integrated OpenAI, Ollama (API/lib), and DeepSeek for AI-driven content generation
- Implemented brochure generation with markdown output
- Updated README with project details, installation, and usage instructions

chore: Move AI-Web-Summarizer project folder from week3 to week1

- Relocated the AI-Web-Summarizer project folder to week1 for better organization
- No functional changes made to the codebase
pull/134/head
arafat 3 months ago
parent commit: 2338e14cf0
  1. week1/community-contributions/ai-brochure-bot/.gitignore (+33)
  2. week1/community-contributions/ai-brochure-bot/README.md (+161)
  3. week1/community-contributions/ai-brochure-bot/main.py (+33)
  4. week1/community-contributions/ai-brochure-bot/requirements.txt (+6)
  5. week1/community-contributions/ai-brochure-bot/summarizer/__init__.py (+0)
  6. week1/community-contributions/ai-brochure-bot/summarizer/brochure.py (+20)
  7. week1/community-contributions/ai-brochure-bot/summarizer/fetcher.py (+34)
  8. week1/community-contributions/ai-brochure-bot/summarizer/llm_handler.py (+41)
  9. week1/community-contributions/ai-brochure-bot/summarizer/summarizer.py (+20)
  10. week1/community-contributions/ai-brochure-bot/utils/__init__.py (+0)
  11. week1/community-contributions/ai-brochure-bot/utils/config.py (+13)
  12. week1/community-contributions/ai-brochure-bot/utils/logger.py (+16)
  13. week1/community-contributions/ai-web-summarizer/.gitignore (+33)
  14. week1/community-contributions/ai-web-summarizer/README.md (+144)
  15. week1/community-contributions/ai-web-summarizer/main.py (+28)
  16. week1/community-contributions/ai-web-summarizer/requirements.txt (+4)
  17. week1/community-contributions/ai-web-summarizer/summarizer/__init__.py (+0)
  18. week1/community-contributions/ai-web-summarizer/summarizer/fetcher.py (+23)
  19. week1/community-contributions/ai-web-summarizer/summarizer/summarizer.py (+85)
  20. week1/community-contributions/ai-web-summarizer/utils/__init__.py (+0)
  21. week1/community-contributions/ai-web-summarizer/utils/config.py (+11)
  22. week1/community-contributions/ai-web-summarizer/utils/logger.py (+16)

33
week1/community-contributions/ai-brochure-bot/.gitignore

@@ -0,0 +1,33 @@
# Python
__pycache__/
*.py[cod]
*.pyo
*.pyd
.Python
env/
venv/
*.env
*.ini
*.log
# VSCode
.vscode/
# IDE files
.idea/
# System files
.DS_Store
Thumbs.db
# Environment variables
.env
# Jupyter notebook checkpoints
.ipynb_checkpoints
# Dependencies
*.egg-info/
dist/
build/

161
week1/community-contributions/ai-brochure-bot/README.md

@@ -0,0 +1,161 @@
# BrochureBot 🖥📄 – AI-Powered Brochure Builder
BrochureBot is an AI-powered tool that helps businesses create polished brochures for customers, investors, and recruits. With smart templates, AI-driven content generation, and a simple interface, BrochureBot exports professional brochures as PDFs or print-ready formats in seconds. Fast, simple, and professional.
## Features
- AI-Driven Content Generation: Automatically generate brochure content using advanced AI models like GPT-4 and DeepSeek.
- Flexible LLM Providers: Choose between OpenAI, Ollama (via API or library), and DeepSeek for content generation.
- Web Content Extraction: Fetch and format links from company websites to gather relevant information.
- Customizable Brochures: Create structured brochures with sections like overview, culture, customers, and career opportunities.
- Error Handling: Robust error handling for reliable performance.
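
The link extraction mentioned above boils down to prefixing relative paths with the base URL and filtering out boilerplate pages. A runnable sketch of the same logic found in `summarizer/fetcher.py`:

```python
def format_links(base_url, links):
    """Convert relative links to absolute URLs and drop contact/privacy
    pages (same filtering as summarizer/fetcher.py)."""
    out = []
    for link in links:
        if link.startswith("/"):
            link = base_url.rstrip("/") + link
        if "contact" not in link.lower() and "privacy" not in link.lower():
            out.append(link)
    return out

links = format_links("https://example.com/", ["/about", "/contact", "https://example.com/careers"])
# links == ["https://example.com/about", "https://example.com/careers"]
```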
## Project Structure
```
ai-brochure-bot/
│-- summarizer/
│ │-- __init__.py
│ │-- fetcher.py # Web content fetching logic
│ │-- summarizer.py # Main summarization logic
│ │-- brochure.py # Brochure generation logic
│ │-- llm_handler.py # Generic LLM handling logic
│-- utils/
│ │-- __init__.py
│ │-- config.py # Environment configuration
│ │-- logger.py # Logging configuration
│-- main.py # Entry point of the app
│-- .env # Environment variables
│-- requirements.txt # Python dependencies
│-- README.md # Project documentation
```
## Prerequisites
- Python 3.8 or higher
- OpenAI API Key (You can obtain it from [OpenAI](https://platform.openai.com/signup))
- Ollama installed locally ([Installation Guide](https://ollama.ai))
- `conda` for managing environments (optional)
## Installation
1. **Clone the repository:**
```bash
git clone https://github.com/your-username/ai-brochure-bot.git
cd ai-brochure-bot
```
2. **Create a virtual environment (optional but recommended):**
```bash
conda create --name ai-brochure-bot-env python=3.9
conda activate ai-brochure-bot-env
```
3. **Install dependencies:**
```bash
pip install -r requirements.txt
```
4. **Set up environment variables:**
Create a `.env` file in the project root and add your OpenAI API key (if using OpenAI):
```env
OPENAI_API_KEY=your-openai-api-key
OLLAMA_API_URL=http://127.0.0.1:11434/api/chat
DEEPSEEK_API_KEY=your-deepseek-api-key
```
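
At runtime, `utils/config.py` loads these values with `python-dotenv` and the lookup reduces to `os.getenv` with an optional fallback. A minimal sketch of the idiom (the `get_setting` helper and the fallback URL are illustrative, not part of the project):

```python
import os

def get_setting(name, default=None):
    """Read a configuration value from the environment, with a fallback.
    utils/config.py does the same via os.getenv after load_dotenv()."""
    return os.getenv(name) or default

# Hypothetical fallback; the real endpoint comes from OLLAMA_API_URL in .env
ollama_url = get_setting("OLLAMA_API_URL", "http://127.0.0.1:11434/api/chat")
```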
## Usage
1. **Run the BrochureBot:**
```bash
python main.py
```
2. **Sample Prompts:**
```shell
Enter the company name (default: "HuggingFace").
Enter the company website URL (default: "https://huggingface.co").
Choose the LLM model (default: "deepseek-r1:1.5B"; "gpt-4" is also supported).
Select the provider (default: "ollama_api").
```
3. **Sample Output:**
```shell
Enter company name: ABC
Enter company website: https://example.com
Enter LLM model (default: deepseek-r1:1.5B, gpt-4): gpt-4
Enter provider (openai/ollama(ollama_lib/ollama_api), default: ollama_api): openai
Generated Brochure:
# ABC Brochure
## Overview
ABC is a leading AI company specializing in natural language processing and transformer models.
## Culture
Our culture is built on collaboration, innovation, and a passion for AI.
## Customers
We serve a wide range of customers, from startups to Fortune 500 companies.
## Career Opportunities
Join our team and work on cutting-edge AI technologies.
```
## Configuration
You can modify the model, provider, and other settings in `main.py`:
```python
model_choice = input("Enter LLM model (default: deepseek-r1:1.5B, gpt-4): ") or "deepseek-r1:1.5B"
provider_choice = input("Enter provider (openai/ollama(ollama_lib/ollama_api), default: ollama_api): ") or "ollama_api"
```
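
The `or` fallback in these prompts means an empty reply selects the default. A tiny sketch of the idiom (the `choose` helper name is illustrative):

```python
def choose(reply, default):
    """The `input(...) or default` idiom from main.py: an empty reply
    (a falsy string) falls back to the default."""
    return reply or default

model = choose("", "deepseek-r1:1.5B")     # empty reply -> default
provider = choose("openai", "ollama_api")  # explicit reply wins
```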
## Error Handling
If any issues occur, the script will print an error message, for example:
```
Error: No links found. Exiting...
```
## Dependencies
The required dependencies are listed in `requirements.txt`:
```
openai
python-dotenv
requests
beautifulsoup4
ollama
```
Install them using:
```bash
pip install -r requirements.txt
```
## Contributing
Contributions are welcome! Feel free to fork the repository and submit pull requests.
## License
This project is licensed under the MIT License. See the `LICENSE` file for more details.
## Contact
For any inquiries, please reach out to:
- Linkedin: https://www.linkedin.com/in/khanarafat/
- GitHub: https://github.com/raoarafat

33
week1/community-contributions/ai-brochure-bot/main.py

@@ -0,0 +1,33 @@
from summarizer.fetcher import fetch_web_content, format_links
from summarizer.summarizer import get_relevant_links
from summarizer.brochure import generate_brochure
import logging

logger = logging.getLogger(__name__)

def main():
    company_name = input("Enter company name: ") or "HuggingFace"
    url = input("Enter company website: ") or "https://huggingface.co"
    model_choice = input("Enter LLM model (default: deepseek-r1:1.5B, gpt-4): ") or "deepseek-r1:1.5B"
    provider_choice = input("Enter provider (openai/ollama(ollama_lib/ollama_api), default: ollama_api): ") or "ollama_api"

    logger.info(f"Fetching links from {url}...")
    links = fetch_web_content(url)
    if not links:
        logger.error("No links found. Exiting...")
        return

    formatted_links = format_links(url, links)
    logger.info(f"Extracted and formatted {len(formatted_links)} links.")

    relevant_links = get_relevant_links(company_name, formatted_links, model=model_choice, provider=provider_choice)
    logger.info("Filtered relevant links.")

    brochure = generate_brochure(company_name, relevant_links, model=model_choice, provider=provider_choice)
    print("\nGenerated Brochure:\n")
    print(brochure)

if __name__ == "__main__":
    main()

6
week1/community-contributions/ai-brochure-bot/requirements.txt

@@ -0,0 +1,6 @@
openai
python-dotenv
requests
beautifulsoup4
ollama

0
week1/community-contributions/ai-brochure-bot/summarizer/__init__.py

20
week1/community-contributions/ai-brochure-bot/summarizer/brochure.py

@@ -0,0 +1,20 @@
from .llm_handler import call_llm
import logging

logger = logging.getLogger(__name__)

def generate_brochure(company_name, links, model="gpt-4", provider="openai"):
    """Creates a structured markdown brochure using the specified LLM model."""
    system_prompt = """You are an AI that generates a structured company brochure in markdown format. Include an overview, culture, customers, and career opportunities."""
    user_prompt = f"""
    Company: {company_name}
    Website Links: {links}
    """
    messages = [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt}
    ]
    return call_llm(messages, model=model, provider=provider)

34
week1/community-contributions/ai-brochure-bot/summarizer/fetcher.py

@@ -0,0 +1,34 @@
import requests
from bs4 import BeautifulSoup
import logging

# Logging setup
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

def fetch_web_content(url):
    """Fetches the webpage content and extracts links."""
    try:
        response = requests.get(url, timeout=10)
        response.raise_for_status()  # Raise error for failed requests
        soup = BeautifulSoup(response.text, 'html.parser')
        # Extract all links
        links = [a['href'] for a in soup.find_all('a', href=True)]
        logger.info(f"Fetched {len(links)} links from {url}")
        return links
    except requests.RequestException as e:
        logger.error(f"Failed to fetch content from {url}: {e}")
        return []

def format_links(base_url, links):
    """Converts relative links to absolute URLs and filters irrelevant ones."""
    filtered_links = []
    for link in links:
        if link.startswith("/"):
            link = base_url.rstrip("/") + link
        if "contact" not in link.lower() and "privacy" not in link.lower():
            filtered_links.append(link)
    return filtered_links

41
week1/community-contributions/ai-brochure-bot/summarizer/llm_handler.py

@@ -0,0 +1,41 @@
import openai  # type: ignore
import ollama  # type: ignore
import requests  # type: ignore
from utils.config import Config

# Initialize clients
openai_client = openai.Client(api_key=Config.OPENAI_API_KEY)
ollama_api_url = Config.OLLAMA_API_URL

def call_llm(messages, model="gpt-4", provider="openai"):
    """
    Generic function to call the appropriate LLM provider.
    Supports: openai, ollama_lib, ollama_api.
    """
    if provider == "openai":
        response = openai_client.chat.completions.create(
            model=model,
            messages=messages
        )
        return response.choices[0].message.content
    elif provider == "ollama_lib":
        response = ollama.chat(
            model=model,
            messages=messages
        )
        return response['message']['content']
    elif provider == "ollama_api":
        payload = {
            "model": model,
            "messages": messages,
            "stream": False  # Set to True for streaming responses
        }
        response = requests.post(ollama_api_url, json=payload)
        response_data = response.json()
        return response_data.get('message', {}).get('content', 'No summary generated')
    else:
        raise ValueError("Unsupported provider. Choose 'openai', 'ollama_lib', or 'ollama_api'.")

20
week1/community-contributions/ai-brochure-bot/summarizer/summarizer.py

@@ -0,0 +1,20 @@
from .llm_handler import call_llm
import logging

logger = logging.getLogger(__name__)

def get_relevant_links(website_name, links, model="gpt-4", provider="openai"):
    """Uses the specified LLM model to decide which links are relevant for a brochure."""
    system_prompt = "You are an AI assistant that selects the most relevant links for a company brochure."
    user_prompt = f"""
    Here are links found on {website_name}'s website. Identify the relevant ones:
    {links}
    """
    messages = [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt}
    ]
    return call_llm(messages, model=model, provider=provider)

0
week1/community-contributions/ai-brochure-bot/utils/__init__.py

13
week1/community-contributions/ai-brochure-bot/utils/config.py

@@ -0,0 +1,13 @@
import os
from dotenv import load_dotenv  # type: ignore

# Load environment variables from .env file
load_dotenv()

class Config:
    OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
    OLLAMA_API_URL = os.getenv("OLLAMA_API_URL")

if __name__ == "__main__":
    print("OpenAI Key is:", Config.OPENAI_API_KEY)
    print("Ollama Api Url is:", Config.OLLAMA_API_URL)

16
week1/community-contributions/ai-brochure-bot/utils/logger.py

@@ -0,0 +1,16 @@
import logging

# Setup logging configuration
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s - %(levelname)s - %(message)s",
    handlers=[
        logging.FileHandler("app.log"),
        logging.StreamHandler()
    ]
)

logger = logging.getLogger(__name__)

if __name__ == "__main__":
    logger.info("Logger is working correctly.")

33
week1/community-contributions/ai-web-summarizer/.gitignore

@@ -0,0 +1,33 @@
# Python
__pycache__/
*.py[cod]
*.pyo
*.pyd
.Python
env/
venv/
*.env
*.ini
*.log
# VSCode
.vscode/
# IDE files
.idea/
# System files
.DS_Store
Thumbs.db
# Environment variables
.env
# Jupyter notebook checkpoints
.ipynb_checkpoints
# Dependencies
*.egg-info/
dist/
build/

144
week1/community-contributions/ai-web-summarizer/README.md

@@ -0,0 +1,144 @@
# AI Web Page Summarizer
This project is a simple AI-powered web page summarizer that leverages OpenAI's GPT models and local inference with Ollama to generate concise summaries of given text. The goal is to create a "Reader's Digest of the Internet" by summarizing web content efficiently.
## Features
- Summarize text using OpenAI's GPT models or local Ollama models.
- Flexible summarization engine selection (OpenAI API, Ollama API, or Ollama library).
- Simple and modular code structure.
- Error handling for better reliability.
## Project Structure
```
ai-summarizer/
│-- summarizer/
│ │-- __init__.py
│ │-- fetcher.py # Web content fetching logic
│ │-- summarizer.py # Main summarization logic
│-- utils/
│ │-- __init__.py
│ │-- logger.py # Logging configuration
│ │-- config.py # env configuration
│-- main.py # Entry point of the app
│-- .env # Environment variables
│-- requirements.txt # Python dependencies
│-- README.md # Project documentation
```
## Prerequisites
- Python 3.8 or higher
- OpenAI API Key (You can obtain it from [OpenAI](https://platform.openai.com/signup))
- Ollama installed locally ([Installation Guide](https://ollama.ai))
- `conda` for managing environments (optional)
## Installation
1. **Clone the repository:**
```bash
git clone https://github.com/your-username/ai-summarizer.git
cd ai-summarizer
```
2. **Create a virtual environment (optional but recommended):**
```bash
conda create --name summarizer-env python=3.9
conda activate summarizer-env
```
3. **Install dependencies:**
```bash
pip install -r requirements.txt
```
4. **Set up environment variables:**
Create a `.env` file in the project root and add your OpenAI API key (if using OpenAI):
```env
OPENAI_API_KEY=your-api-key-here
```
## Usage
1. **Run the summarizer:**
```bash
python main.py
```
2. **Sample Output:**
```shell
Enter a URL to summarize: https://example.com
Summary of the page:
AI refers to machines demonstrating intelligence similar to humans and animals.
```
3. **Engine Selection:**
The summarizer supports multiple engines. Modify `main.py` to select your preferred model:
```python
summary = summarize_text(content, 'gpt-4o-mini', engine="openai")
summary = summarize_text(content, 'deepseek-r1:1.5B', engine="ollama-api")
summary = summarize_text(content, 'deepseek-r1:1.5B', engine="ollama-lib")
```
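
Under the hood, `summarize_text` is a plain dispatcher over the three backends. A self-contained sketch with the network calls stubbed out (the stub bodies are placeholders, not the real OpenAI/Ollama implementations):

```python
# Stub engines standing in for the real calls in summarizer/summarizer.py;
# the bodies here are placeholders only.
def summarize_with_openai(text, model):
    return f"[openai:{model}] summary"

def summarize_with_ollama_lib(text, model):
    return f"[ollama-lib:{model}] summary"

def summarize_with_ollama_api(text, model):
    return f"[ollama-api:{model}] summary"

def summarize_text(text, model, engine="openai"):
    """Dispatch to the engine selected by the `engine` argument."""
    engines = {
        "openai": summarize_with_openai,
        "ollama-lib": summarize_with_ollama_lib,
        "ollama-api": summarize_with_ollama_api,
    }
    if engine not in engines:
        raise ValueError("Use 'openai', 'ollama-lib', or 'ollama-api'.")
    return engines[engine](text, model)
```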
## Configuration
You can modify the model, max tokens, and temperature in `summarizer/summarizer.py`:
```python
response = client.chat.completions.create(
model="gpt-4o-mini",
messages=[...],
max_tokens=300,
temperature=0.7
)
```
## Error Handling
If any issues occur, the script will print an error message, for example:
```
Error during summarization: Invalid API key or Ollama not running.
```
## Dependencies
The required dependencies are listed in `requirements.txt`:
```
openai
requests
beautifulsoup4
python-dotenv
```
Install them using:
```bash
pip install -r requirements.txt
```
## Contributing
Contributions are welcome! Feel free to fork the repository and submit pull requests.
## License
This project is licensed under the MIT License. See the `LICENSE` file for more details.
## Contact
For any inquiries, please reach out to:
- Linkedin: https://www.linkedin.com/in/khanarafat/
- GitHub: https://github.com/raoarafat

28
week1/community-contributions/ai-web-summarizer/main.py

@@ -0,0 +1,28 @@
from summarizer.fetcher import fetch_web_content
from summarizer.summarizer import summarize_text
from utils.logger import logger

def main():
    url = input("Enter a URL to summarize: ")
    logger.info(f"Fetching content from: {url}")
    content = fetch_web_content(url)
    if content:
        logger.info("Content fetched successfully. Sending to the LLM for summarization...")
        # summary = summarize_text(content, 'gpt-4o-mini', engine="openai")
        # summary = summarize_text(content, 'deepseek-r1:1.5B', engine="ollama-lib")
        summary = summarize_text(content, 'deepseek-r1:1.5B', engine="ollama-api")
        if summary:
            logger.info("Summary generated successfully.")
            print("\nSummary of the page:\n")
            print(summary)
        else:
            logger.error("Failed to generate summary.")
    else:
        logger.error("Failed to fetch web content.")

if __name__ == "__main__":
    main()

4
week1/community-contributions/ai-web-summarizer/requirements.txt

@@ -0,0 +1,4 @@
openai
requests
beautifulsoup4
python-dotenv

0
week1/community-contributions/ai-web-summarizer/summarizer/__init__.py

23
week1/community-contributions/ai-web-summarizer/summarizer/fetcher.py

@@ -0,0 +1,23 @@
import requests
from bs4 import BeautifulSoup

def fetch_web_content(url):
    try:
        response = requests.get(url, timeout=10)
        response.raise_for_status()
        # Parse the HTML content
        soup = BeautifulSoup(response.text, 'html.parser')
        # Extract readable text from the web page (ignoring scripts, styles, etc.)
        page_text = soup.get_text(separator=' ', strip=True)
        return page_text[:5000]  # Limit to 5000 chars (API limitation)
    except requests.exceptions.RequestException as e:
        print(f"Error fetching the webpage: {e}")
        return None

if __name__ == "__main__":
    url = "https://en.wikipedia.org/wiki/Natural_language_processing"
    content = fetch_web_content(url)
    if content:
        print(content[:500])  # Print a sample of the content

85
week1/community-contributions/ai-web-summarizer/summarizer/summarizer.py

@@ -0,0 +1,85 @@
import openai  # type: ignore
import ollama
import requests
from utils.config import Config

# Local Ollama API endpoint
OLLAMA_API = "http://127.0.0.1:11434/api/chat"

# Initialize OpenAI client with API key
client = openai.Client(api_key=Config.OPENAI_API_KEY)

def summarize_with_openai(text, model):
    """Summarize text using OpenAI's GPT model."""
    try:
        response = client.chat.completions.create(
            model=model,
            messages=[
                {"role": "system", "content": "You are a helpful assistant that summarizes web pages."},
                {"role": "user", "content": f"Summarize the following text: {text}"}
            ],
            max_tokens=300,
            temperature=0.7
        )
        return response.choices[0].message.content
    except Exception as e:
        print(f"Error during OpenAI summarization: {e}")
        return None

def summarize_with_ollama_lib(text, model):
    """Summarize text using the Ollama Python library."""
    try:
        messages = [
            {"role": "system", "content": "You are a helpful assistant that summarizes web pages."},
            {"role": "user", "content": f"Summarize the following text: {text}"}
        ]
        response = ollama.chat(model=model, messages=messages)
        return response['message']['content']
    except Exception as e:
        print(f"Error during Ollama summarization: {e}")
        return None

def summarize_with_ollama_api(text, model):
    """Summarize text using the local Ollama API."""
    try:
        payload = {
            "model": model,
            "messages": [
                {"role": "system", "content": "You are a helpful assistant that summarizes web pages."},
                {"role": "user", "content": f"Summarize the following text: {text}"}
            ],
            "stream": False  # Set to True for streaming responses
        }
        response = requests.post(OLLAMA_API, json=payload)
        response_data = response.json()
        return response_data.get('message', {}).get('content', 'No summary generated')
    except Exception as e:
        print(f"Error during Ollama API summarization: {e}")
        return None

def summarize_text(text, model, engine="openai"):
    """Generic function to summarize text using the specified engine (openai/ollama-lib/ollama-api)."""
    if engine == "openai":
        return summarize_with_openai(text, model)
    elif engine == "ollama-lib":
        return summarize_with_ollama_lib(text, model)
    elif engine == "ollama-api":
        return summarize_with_ollama_api(text, model)
    else:
        print("Invalid engine specified. Use 'openai', 'ollama-lib', or 'ollama-api'.")
        return None

if __name__ == "__main__":
    sample_text = "Artificial intelligence (AI) is intelligence demonstrated by machines, as opposed to the natural intelligence displayed by animals and humans."

    # Summarize using OpenAI
    openai_summary = summarize_text(sample_text, model="gpt-3.5-turbo", engine="openai")
    print("OpenAI Summary:", openai_summary)

    # Summarize using the Ollama Python library
    ollama_lib_summary = summarize_text(sample_text, model="deepseek-r1:1.5B", engine="ollama-lib")
    print("Ollama Library Summary:", ollama_lib_summary)

    # Summarize using the local Ollama API
    ollama_api_summary = summarize_text(sample_text, model="deepseek-r1:1.5B", engine="ollama-api")
    print("Ollama API Summary:", ollama_api_summary)

0
week1/community-contributions/ai-web-summarizer/utils/__init__.py

11
week1/community-contributions/ai-web-summarizer/utils/config.py

@@ -0,0 +1,11 @@
import os
from dotenv import load_dotenv

# Load environment variables from .env file
load_dotenv()

class Config:
    OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")

if __name__ == "__main__":
    print("Your OpenAI Key is:", Config.OPENAI_API_KEY)

16
week1/community-contributions/ai-web-summarizer/utils/logger.py

@@ -0,0 +1,16 @@
import logging

# Setup logging configuration
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s - %(levelname)s - %(message)s",
    handlers=[
        logging.FileHandler("app.log"),
        logging.StreamHandler()
    ]
)

logger = logging.getLogger(__name__)

if __name__ == "__main__":
    logger.info("Logger is working correctly.")