# AiCore Examples
This directory contains practical examples demonstrating how to use AiCore in different scenarios. Each example showcases specific features and integration patterns.
## Available Examples
### FastAPI Integration
`fastapi/` - A complete FastAPI application demonstrating:

- Authentication and authorization
- Rate limiting middleware
- WebSocket support for streaming responses
- Production-ready LLM service integration
- Configuration management
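The rate-limiting idea above can be sketched independently of any framework as a token bucket. This is an illustrative sketch, not AiCore's actual middleware; the class and parameter names are invented for the example:

```python
import time


class TokenBucket:
    """Token-bucket rate limiter sketch (illustrative, not AiCore's implementation)."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate              # tokens refilled per second
        self.capacity = capacity      # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False


bucket = TokenBucket(rate=1, capacity=2)
print([bucket.allow() for _ in range(3)])  # a burst of 2 is allowed, then calls are throttled
```

In a real FastAPI app, a check like `bucket.allow()` would sit in a middleware or dependency and return HTTP 429 when it fails.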
### Chainlit Chat Interface
`chainlit/` - An interactive chat application featuring:

- Multiple LLM provider profiles
- Advanced reasoning capabilities
- Docker deployment setup
- Customizable UI components
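Provider profiles can be thought of as named bundles of provider settings. The table below is a hypothetical sketch; the profile names, providers, and fields are illustrative only and do not reflect the Chainlit example's actual configuration:

```python
# Hypothetical profile table; every name and field here is illustrative.
PROFILES = {
    "fast": {"provider": "openai", "temperature": 0.2},
    "reasoning": {"provider": "anthropic", "temperature": 0.0},
}


def select_profile(name: str) -> dict:
    # Fall back to a default profile when the requested one is unknown.
    return PROFILES.get(name, PROFILES["fast"])


print(select_profile("reasoning")["provider"])  # anthropic
```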
### Core Functionality
- `observability_dashboard.py` - Launch and interact with the observability dashboard
- `simple_llm_call.py` - Basic synchronous LLM call
- `simple_async_llm_call.py` - Basic asynchronous LLM call
- `reasoning_example.py` - Demonstration of advanced reasoning capabilities
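The difference between the synchronous and asynchronous call scripts comes down to the standard `asyncio` pattern. The `ask` stub below stands in for a real provider call and is purely illustrative, not AiCore's API:

```python
import asyncio


def ask(prompt: str) -> str:
    # Placeholder for a blocking LLM call; a real script would hit a provider API.
    return f"echo: {prompt}"


async def ask_async(prompt: str) -> str:
    # Placeholder for a non-blocking LLM call.
    await asyncio.sleep(0)  # yield control, as a real network call would
    return f"echo: {prompt}"


async def main():
    # Async calls can be issued concurrently and awaited together.
    answers = await asyncio.gather(ask_async("hi"), ask_async("there"))
    print(answers)


print(ask("hi"))  # synchronous: blocks until the call completes
asyncio.run(main())
```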
## Getting Started
### Prerequisites
- Install AiCore:

  ```bash
  pip install core-for-ai
  ```

- Install example-specific dependencies:

  ```bash
  pip install -r examples/<example_dir>/requirements.txt
  ```

### Configuration
- Set up environment variables:

  ```bash
  cp .env-example .env
  # Edit .env with your API keys
  ```

- Configure provider settings in the `config/` directory as needed
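For reference, loading a `.env` file amounts to reading `KEY=VALUE` pairs into the process environment. The minimal loader below is a sketch (in practice a library such as `python-dotenv` is the usual choice):

```python
import os


def load_env(path: str = ".env") -> None:
    """Minimal .env loader sketch; python-dotenv is the usual production choice."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blanks, comments, and malformed lines
            key, _, value = line.partition("=")
            # setdefault keeps any value already exported in the shell.
            os.environ.setdefault(key.strip(), value.strip())


# load_env()  # typically called once at startup, before reading os.environ
```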
### Running Examples
```bash
# For Python scripts
python examples/<script_name>.py

# For FastAPI
uvicorn examples.fastapi.main:app --reload

# For Chainlit
chainlit run examples/chainlit/app/app.py
```

## Example Structure
Each example directory contains:
- `README.md` - Specific instructions for that example
- `requirements.txt` - Python dependencies
- Source code demonstrating best practices
## Contributing
We welcome example contributions! Please:
- Maintain consistent structure
- Include proper documentation
- Add a `requirements.txt` if needed
- Submit via pull request
For more complex integrations, see our Built with AiCore showcase.