Gitea Setup
Configure Codecora for Gitea using a Personal Access Token. Perfect for self-hosted instances and homelabs.
Prerequisites
- A Gitea instance (gitea.com or self-hosted)
- A repository where you have admin permissions
- An API key for a supported AI provider (OpenAI, Anthropic, Groq, or any OpenAI-compatible endpoint such as Ollama)
Step 1: Generate Personal Access Token
Create a Personal Access Token (PAT) in Gitea with the required permissions:
- Log in to your Gitea instance
- Go to Settings → Applications
- Click "Generate New Token" (or "Add New Token")
- Name it something recognizable like "Codecora"
- Set the token scopes/permissions:
Required scopes:
- ✓ repository: Read and Write (code access)
- ✓ issue: Read and Write (issues/PRs)
- ✓ user: Read (user information)
Note: Scope names may vary slightly between Gitea versions.
- Click "Generate Token"
- Important: Copy the token immediately. You won't be able to see it again!
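Before moving on, you can confirm the token works by calling Gitea's standard /api/v1/user endpoint directly. A minimal sketch, assuming your instance URL and token (both placeholders below):

```shell
# Placeholders: substitute your own instance URL and token
GITEA_URL="https://your-gitea-instance.com"
GITEA_PAT="paste-your-token-here"

# A valid token returns your user record as JSON (HTTP 200);
# -f makes curl exit non-zero on a 401/403 so failures are obvious
curl -sf -H "Authorization: token $GITEA_PAT" \
  "$GITEA_URL/api/v1/user"
```

A 401 response means the token is invalid or missing the user read scope.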
Step 2: Add Gitea to Codecora
- Go to the Codecora dashboard at codecora.dev/dashboard
- Click "Add Repository" or "Add Integration"
- Select "Gitea" as the platform
- Enter your Gitea instance URL:
  - For gitea.com: https://gitea.com
  - For self-hosted: https://your-gitea-instance.com
- Paste your Personal Access Token
- Click "Test Connection" to verify the token works
- Click "Save" to add the integration
Step 3: Configure Webhook
For Codecora to automatically review your pull requests, configure a webhook in Gitea:
- Go to your Gitea repository → Settings → Webhooks
- Click "Add Webhook"
- Enter the webhook URL:
https://codecora.dev/webhooks/gitea
- Select trigger events:
- ✓ Pull Request events (opened, edited, synchronized)
- ✓ Push events (optional, for branch updates)
- Choose the content type: application/json
- Click "Add Webhook"
- Test the webhook by clicking the "Test Delivery" button
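If you also set a webhook secret, recent Gitea versions sign each delivery with an X-Gitea-Signature header: the hex-encoded HMAC-SHA256 of the raw request body under that secret. A minimal sketch of how a receiver recomputes it with openssl (the secret and body below are made-up examples):

```shell
SECRET="my-webhook-secret"     # example value; use your own secret
BODY='{"action":"opened"}'     # stand-in for the raw delivery body

# Recompute the signature the way the receiving end would,
# then compare it against the X-Gitea-Signature header value
SIG=$(printf '%s' "$BODY" | openssl dgst -sha256 -hmac "$SECRET" | awk '{print $NF}')
echo "$SIG"
```

Any byte-level change to the body or secret produces a different signature, which is what lets the receiver reject forged deliveries.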
Step 4: Configure AI Provider
Configure your AI provider in the Codecora dashboard:
- Go to Settings → AI Provider
- Choose your provider:
- OpenAI (gpt-4o, gpt-4o-mini)
- Anthropic (claude-3-5-sonnet)
- Groq (llama-3.3-70b)
- Ollama (self-hosted models)
- Any OpenAI-compatible API
- Enter your API key or endpoint URL
- Test the connection
- Save your settings
🏠 Self-Hosted AI?
You can use Ollama or other self-hosted AI models with Codecora. Just enter the endpoint URL (e.g., http://localhost:11434) when configuring the provider. Note that a localhost URL is only reachable by a Codecora worker running on the same network, so local models usually mean running the self-hosted worker. Perfect for air-gapped environments!
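Ollama exposes an OpenAI-compatible chat endpoint, so a quick way to sanity-check it before pointing Codecora at it is a direct chat-completion request. This assumes Ollama is running locally and that the model named below (llama3, as an example) has already been pulled:

```shell
# Direct request to Ollama's OpenAI-compatible endpoint;
# substitute whichever model you have pulled locally
curl -s http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```

If this returns a JSON completion, the same base URL should work as the provider endpoint.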
Docker Compose Example
Here's an example docker-compose setup for running Gitea with Codecora:
version: "3.8"
services:
  gitea:
    image: gitea/gitea:latest
    environment:
      - GITEA__server__DOMAIN=git.example.com
      - GITEA__server__ROOT_URL=https://git.example.com
      - GITEA__server__SSH_DOMAIN=git.example.com
    ports:
      - "3000:3000"
      - "222:22"
    volumes:
      - gitea_data:/data
    restart: unless-stopped

  codecora-worker:
    image: codecora/worker:latest
    environment:
      - GITEA_URL=https://git.example.com
      - GITEA_PAT=${GITEA_PAT}
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    restart: unless-stopped

volumes:
  gitea_data:

Self-Hosted Gitea
Codecora is perfect for self-hosted Gitea instances:
- Works with any Gitea version (1.18+ recommended)
- No complex OAuth setup needed (just use PAT)
- Perfect for homelabs and air-gapped environments
- Can use self-hosted AI models (Ollama) for fully local setup
🔒 Fully Local Setup
For maximum privacy, run both Gitea and Codecora worker on your own infrastructure. Use Ollama for local AI models. Your code never leaves your network.
Troubleshooting
Webhook not triggering?
- Verify the webhook URL:
https://codecora.dev/webhooks/gitea
- Check Gitea webhook delivery logs for error messages
- Ensure the webhook secret (if you configured one) matches the one saved in Codecora
- Confirm "Pull Request" events are enabled
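Gitea's API can also trigger a test delivery without clicking through the UI, which is handy when checking delivery logs repeatedly. A sketch using the standard repo hooks endpoints (URL, token, owner, repo, and hook ID are all placeholders):

```shell
GITEA_URL="https://your-gitea-instance.com"   # placeholders throughout
GITEA_PAT="paste-your-token-here"
OWNER="you"
REPO="your-repo"

# List the repository's webhooks to find the hook ID
curl -sf -H "Authorization: token $GITEA_PAT" \
  "$GITEA_URL/api/v1/repos/$OWNER/$REPO/hooks"

# Fire a test delivery for hook ID 1 (substitute the real ID from above)
curl -sf -X POST -H "Authorization: token $GITEA_PAT" \
  "$GITEA_URL/api/v1/repos/$OWNER/$REPO/hooks/1/tests"
```

Each call will show up in the hook's delivery log, along with the response Codecora returned.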
PAT authentication failed?
- Verify the token has the required scopes
- Check the token hasn't expired
- Ensure you're using the correct Gitea instance URL
- Try regenerating the token
Can't access repositories?
- Ensure your PAT has read/write access to the repository and issue scopes
- Verify you have admin access to the repository
- Check you're using the correct instance URL (including https://)
Self-hosted AI not working?
- Ensure Ollama or your AI server is running
- Check the endpoint URL is accessible from Codecora's servers
- For local AI, you may need self-hosted Codecora worker
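A quick way to check whether Ollama is up and reachable, assuming the default local address:

```shell
# If this fails, Ollama isn't running or isn't reachable at this address;
# /api/tags lists the models available locally
curl -sf http://localhost:11434/api/tags

# Cross-check from the Ollama CLI
ollama list
```

If this works on the machine running the Codecora worker but reviews still fail, recheck the endpoint URL saved in the provider settings.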
Homelab & Self-Hosting Community
Gitea + Codecora is a popular combination for homelab enthusiasts:
- Full control: Your code, your infrastructure, your AI models
- Privacy: Nothing leaves your network
- Cost-effective: No monthly fees for Gitea, pay only for AI (or use free local models)
- Flexible: Use any OpenAI-compatible API or self-hosted models
💬 Join the Community
Share your setup and learn from others in our GitHub Discussions.