
# Building AI API Services with FastAPI - From Development to Deployment

FastAPI is one of the strongest choices for building AI API services: it is fast, quick to develop with, and generates documentation automatically. This article walks through building a production-grade AI API from scratch.

---

## 1. Introduction to FastAPI

FastAPI is a modern, high-performance Python web framework that is particularly well suited to building AI API services.

### Core Advantages

  • High performance: built on Starlette and Pydantic, with throughput comparable to Node.js
  • Fast development: automatically generated API docs and minimal boilerplate noticeably shorten development time
  • Type checking: driven by Python type annotations, so IDEs can autocomplete and catch errors early
  • Async support: native async/await

### Installation

pip install fastapi uvicorn

---

## 2. Your First API

Create main.py:

from fastapi import FastAPI

app = FastAPI(
    title="AI API Service",
    description="AI endpoints built with FastAPI",
    version="1.0.0"
)

@app.get("/")
async def root():
    return {"message": "Hello AI World"}

@app.get("/health")
async def health_check():
    return {"status": "healthy"}

Start the server:

uvicorn main:app --reload

Then visit:

  • API root: http://127.0.0.1:8000
  • Interactive docs (Swagger UI): http://127.0.0.1:8000/docs
  • Alternative docs (ReDoc): http://127.0.0.1:8000/redoc

---

## 3. Requests and Responses

### Path Parameters

@app.get("/users/{user_id}")
async def get_user(user_id: int):
    return {"user_id": user_id}

### Query Parameters

@app.get("/items/")
async def list_items(skip: int = 0, limit: int = 10):
    return {"skip": skip, "limit": limit}
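The type-annotation-driven conversion above (`skip: int = 0`) is exactly what FastAPI, via Pydantic, performs for every query parameter: raw strings from the URL are converted to the annotated types, with defaults filled in. A stdlib-only toy version, purely to illustrate the mechanism (the real implementation also validates, reports errors, and handles far more types):

```python
import inspect

def coerce_args(func, raw: dict) -> dict:
    # Convert raw string inputs (as query params arrive) using the handler's annotations
    sig = inspect.signature(func)
    out = {}
    for name, param in sig.parameters.items():
        value = raw.get(name, param.default)
        if param.annotation is not inspect.Parameter.empty and isinstance(value, str):
            value = param.annotation(value)  # e.g. int("5") -> 5
        out[name] = value
    return out

def list_items(skip: int = 0, limit: int = 10):
    return {"skip": skip, "limit": limit}

parsed = coerce_args(list_items, {"skip": "5"})  # "5" becomes the int 5; limit falls back to 10
```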

### Request Body

from pydantic import BaseModel

class UserCreate(BaseModel):
    name: str
    email: str
    age: int | None = None

@app.post("/users/")
async def create_user(user: UserCreate):
    return {"created": user.name}

---

## 4. Integrating AI Models

### OpenAI API Integration

from openai import OpenAI
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()
client = OpenAI(api_key="your-api-key")

class ChatRequest(BaseModel):
    message: str
    model: str = "gpt-4o-mini"

class ChatResponse(BaseModel):
    reply: str
    model: str

@app.post("/chat", response_model=ChatResponse)
async def chat(request: ChatRequest):
    try:
        response = client.chat.completions.create(
            model=request.model,
            messages=[{"role": "user", "content": request.message}]
        )
        return ChatResponse(
            reply=response.choices[0].message.content,
            model=request.model
        )
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

### Local Model Integration

from transformers import pipeline

# Load the model once, at startup
sentiment_analyzer = pipeline("sentiment-analysis")

class SentimentRequest(BaseModel):
    text: str

class SentimentResponse(BaseModel):
    label: str
    score: float

# Note: a plain def (not async def) lets FastAPI run this blocking
# pipeline call in its threadpool instead of stalling the event loop.
@app.post("/sentiment", response_model=SentimentResponse)
def analyze_sentiment(request: SentimentRequest):
    result = sentiment_analyzer(request.text)[0]
    return SentimentResponse(label=result["label"], score=result["score"])

---

## 5. Async Processing

### Background Tasks

from fastapi import BackgroundTasks

def send_email(email: str, message: str):
    # Simulate sending an email
    print(f"Sending email to {email}: {message}")

@app.post("/send-notification")
async def send_notification(
    email: str,
    message: str,
    background_tasks: BackgroundTasks
):
    background_tasks.add_task(send_email, email, message)
    return {"status": "sending"}
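The key property of BackgroundTasks is ordering: the handler returns its response first, and the queued tasks run afterwards. A stdlib-only sketch of that idea (FastAPI's actual implementation runs the tasks at the ASGI level after the response has been sent; this toy class only illustrates the collect-then-run pattern):

```python
# A stdlib-only sketch of the BackgroundTasks idea: collect work during the
# request, produce the response first, run the queued work afterwards.
class SketchBackgroundTasks:
    def __init__(self):
        self._tasks = []

    def add_task(self, func, *args, **kwargs):
        self._tasks.append((func, args, kwargs))

    def run_all(self):
        # In FastAPI this happens after the response has been sent
        for func, args, kwargs in self._tasks:
            func(*args, **kwargs)

sent = []

def send_email(email: str, message: str):
    sent.append((email, message))

tasks = SketchBackgroundTasks()
tasks.add_task(send_email, "user@example.com", "hello")
response = {"status": "sending"}  # the handler returns immediately...
tasks.run_all()                   # ...and the email goes out afterwards
```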

### Streaming Responses

from fastapi.responses import StreamingResponse
import asyncio

async def generate_stream():
    for i in range(10):
        yield f"data: Message {i}\n\n"
        await asyncio.sleep(0.5)

@app.get("/stream")
async def stream_data():
    return StreamingResponse(generate_stream(), media_type="text/event-stream")
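The media_type="text/event-stream" and the `data: ...\n\n` strings follow the Server-Sent Events convention: each event is a `data:` line terminated by a blank line, which is why browsers' EventSource can consume this endpoint directly. A small stdlib sketch of producing and parsing such frames:

```python
# Server-Sent Events framing: each event is a "data: ..." line followed by a blank line.
def sse_frame(payload: str) -> str:
    return f"data: {payload}\n\n"

def parse_sse(stream: str) -> list[str]:
    # Split the stream on blank lines and strip the "data: " prefix
    return [block[len("data: "):] for block in stream.split("\n\n") if block.startswith("data: ")]

stream = "".join(sse_frame(f"Message {i}") for i in range(3))
events = parse_sse(stream)  # recovers the three payloads in order
```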

---

## 6. Security and Authentication

### API Key Authentication

from fastapi import Security, HTTPException
from fastapi.security import APIKeyHeader

API_KEYS = {"key1", "key2", "key3"}
api_key_header = APIKeyHeader(name="X-API-Key")

async def verify_api_key(api_key: str = Security(api_key_header)):
    if api_key not in API_KEYS:
        raise HTTPException(status_code=403, detail="Invalid API Key")
    return api_key

@app.get("/protected")
async def protected_endpoint(api_key: str = Security(verify_api_key)):
    return {"message": "Access granted"}
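One hardening detail worth knowing: naive string comparison can leak timing information to an attacker probing keys. The stdlib's secrets.compare_digest compares in constant time; a sketch of applying it to the key check (the API_KEYS set is the same placeholder as in the example above):

```python
import secrets

API_KEYS = {"key1", "key2", "key3"}  # same placeholder keys as above

def key_is_valid(candidate: str) -> bool:
    # compare_digest takes the same time whether strings differ early or late
    return any(secrets.compare_digest(candidate, key) for key in API_KEYS)

valid = key_is_valid("key2")
invalid = key_is_valid("wrong-key")
```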

### JWT Authentication

from fastapi import Depends, HTTPException
from fastapi.security import OAuth2PasswordBearer
from jose import JWTError, jwt

oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")
SECRET_KEY = "your-secret-key"

async def get_current_user(token: str = Depends(oauth2_scheme)):
    try:
        payload = jwt.decode(token, SECRET_KEY, algorithms=["HS256"])
        username: str = payload.get("sub")
        if username is None:
            raise HTTPException(status_code=401)
        return username
    except JWTError:
        raise HTTPException(status_code=401)
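To demystify what jwt.decode is verifying: an HS256 JWT is `base64url(header).base64url(payload).base64url(HMAC-SHA256(header.payload, secret))`. A stdlib-only sketch of signing and verifying (use a real library such as python-jose or PyJWT in production; this omits exp/aud/iss checks entirely):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> bytes:
    # Unpadded base64url, as used by JWTs
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def sign_hs256(payload: dict, secret: str) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = header + b"." + body
    sig = b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    return (signing_input + b"." + sig).decode()

def verify_hs256(token: str, secret: str) -> dict:
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()).decode()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid signature")
    padded = body + "=" * (-len(body) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(padded))

token = sign_hs256({"sub": "alice"}, "your-secret-key")
claims = verify_hs256(token, "your-secret-key")  # succeeds only with the right secret
```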

---

## 7. Production Deployment

### Gunicorn + Uvicorn

pip install gunicorn "uvicorn[standard]"

# Launch command
gunicorn main:app \
    --workers 4 \
    --worker-class uvicorn.workers.UvicornWorker \
    --bind 0.0.0.0:8000

### Docker Deployment

Create a Dockerfile:

FROM python:3.11-slim

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 8000

CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]

Build and run:

docker build -t ai-api .
docker run -d -p 8000:8000 ai-api

### Nginx Reverse Proxy

server {
    listen 80;
    server_name api.example.com;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}

---

## 8. Performance Optimization

### Caching

from redis import asyncio as aioredis
from fastapi_cache import FastAPICache
from fastapi_cache.backends.redis import RedisBackend
from fastapi_cache.decorator import cache

@app.on_event("startup")
async def init_cache():
    # The cache backend must be initialized before @cache is used
    FastAPICache.init(RedisBackend(aioredis.from_url("redis://localhost")), prefix="api-cache")

@app.get("/expensive-operation")
@cache(expire=60)  # cache for 60 seconds
async def expensive_operation():
    # expensive computation here
    return {"result": "computed"}
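The expire=60 semantics can be illustrated without Redis: keep a timestamp next to each cached value and recompute once it is older than the TTL. A stdlib-only sketch of such a decorator (fastapi-cache does the same bookkeeping against a shared backend instead of process memory, so it also works across workers):

```python
import functools
import time

def ttl_cache(expire: float):
    # Cache results per-argument for `expire` seconds, then recompute
    def decorator(func):
        store = {}  # args -> (timestamp, value)

        @functools.wraps(func)
        def wrapper(*args):
            now = time.monotonic()
            if args in store and now - store[args][0] < expire:
                return store[args][1]
            value = func(*args)
            store[args] = (now, value)
            return value

        return wrapper
    return decorator

calls = []

@ttl_cache(expire=60)
def expensive(x):
    calls.append(x)  # track how often the real work runs
    return x * 2

first = expensive(3)
second = expensive(3)  # within the TTL: served from the cache
```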

### Database Connection Pooling

from databases import Database

DATABASE_URL = "postgresql://user:pass@localhost/db"
database = Database(DATABASE_URL)

@app.on_event("startup")
async def startup():
    await database.connect()

@app.on_event("shutdown")
async def shutdown():
    await database.disconnect()
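What a connection pool buys you: connections are established once, up front, and reused across requests instead of being re-opened each time. A toy stdlib sketch of the acquire/release cycle (real pools add sizing limits, timeouts, and health checks; this only shows the reuse):

```python
import queue

class ConnectionPool:
    # Toy pool: create all "connections" up front, hand them out and take them back
    def __init__(self, size, factory):
        self._pool = queue.Queue()
        for _ in range(size):
            self._pool.put(factory())

    def acquire(self):
        return self._pool.get()

    def release(self, conn):
        self._pool.put(conn)

created = []

def make_conn():
    conn = object()  # stands in for a real DB connection
    created.append(conn)
    return conn

pool = ConnectionPool(size=2, factory=make_conn)
c1 = pool.acquire()
c2 = pool.acquire()
pool.release(c1)
c3 = pool.acquire()  # reuses c1 instead of opening a new connection
```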

---

## 9. Monitoring and Logging

### Structured Logging

import logging
import json

class JSONFormatter(logging.Formatter):
    def format(self, record):
        log_entry = {
            "time": self.formatTime(record),
            "level": record.levelname,
            "message": record.getMessage()
        }
        return json.dumps(log_entry)

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
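Wiring the formatter into a handler shows the output shape; here a StringIO buffer stands in for stdout so the emitted JSON line can be inspected (the formatter is repeated so the snippet is self-contained):

```python
import io
import json
import logging

class JSONFormatter(logging.Formatter):
    def format(self, record):
        return json.dumps({
            "time": self.formatTime(record),
            "level": record.levelname,
            "message": record.getMessage(),
        })

buffer = io.StringIO()  # stands in for stdout so the output can be inspected
handler = logging.StreamHandler(buffer)
handler.setFormatter(JSONFormatter())

logger = logging.getLogger("json-demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("request handled")
entry = json.loads(buffer.getvalue())  # one JSON object per log line
```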

### Health Check Endpoint

from datetime import datetime

@app.get("/health")
async def health():
    return {
        "status": "healthy",
        "version": "1.0.0",
        "timestamp": datetime.now().isoformat()
    }

---

## 10. Complete Project Example

Project structure:

ai-api/
├── main.py
├── routers/
│   ├── chat.py
│   ├── image.py
│   └── auth.py
├── models/
│   └── schemas.py
├── services/
│   └── ai_service.py
├── requirements.txt
└── Dockerfile

---

FastAPI makes serving AI models simple and efficient, and is a strong default choice for production AI APIs.

Coming next: Redis Caching in Practice - Boost Application Performance 10x

---

Author: 廖万里 · Published: March 16, 2026

Permalink: https://www.kkkliao.cn/?id=654 · Reproduction requires authorization.
