🚧 Prototype Notice

This project (sufast) is currently an early prototype.
Only static routes work at the moment.
Dynamic routing and the full feature set are under development.
Thank you for understanding! 🙏


Performance Optimization

Learn how to optimize your Sufast applications for maximum performance and efficiency.

Application-Level Optimizations
Optimize your Sufast application code for better performance

Use Async/Await for I/O Operations

Async Optimization (Python)
import asyncio
import time

import aiohttp
from sufast import App

app = App()

# ❌ Blocking I/O - blocks the entire thread
@app.get("/slow-sync")
def slow_sync_endpoint():
    time.sleep(1)  # Simulates a slow, blocking database query
    return {"message": "Done"}

# ✅ Non-blocking I/O - allows other requests to be processed
@app.get("/fast-async")
async def fast_async_endpoint():
    await asyncio.sleep(1)  # Simulates async database query
    return {"message": "Done"}

# ✅ Async HTTP requests
@app.get("/external-api")
async def call_external_api():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://api.example.com/data") as response:
            data = await response.json()
            return {"external_data": data}

Minimize Response Size

Response Optimization (Python)
from sufast import App, Response
import gzip
import json

app = App()

# ✅ Return only necessary data
@app.get("/users/{user_id}")
def get_user(user_id: str):
    user = get_user_from_db(user_id)
    
    # Return only public fields
    return {
        "id": user.id,
        "name": user.name,
        "email": user.email
        # Don't include sensitive or unnecessary fields
    }

# ✅ Implement compression for large responses
@app.get("/large-data")
def get_large_data():
    data = get_large_dataset()
    
    # Compress response if it's large
    json_data = json.dumps(data)
    if len(json_data) > 1024:  # 1KB threshold
        compressed = gzip.compress(json_data.encode())
        return Response(
            content=compressed,
            headers={
                "Content-Type": "application/json",
                "Content-Encoding": "gzip"
            }
        )
    
    return data
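
One caveat with the snippet above: it compresses unconditionally, and a client that never advertised gzip support may be unable to decode the body. Below is a minimal sketch of a negotiation-aware helper; maybe_compress is a hypothetical name, and how a handler reads the Accept-Encoding request header depends on the framework.

Content Negotiation (Python)
import gzip
import json

# ✅ Compress only when the client advertises gzip support
def maybe_compress(data, accept_encoding: str, threshold: int = 1024):
    json_bytes = json.dumps(data).encode()
    if "gzip" in accept_encoding and len(json_bytes) > threshold:
        # Client can decode gzip and the payload is large enough to benefit
        return gzip.compress(json_bytes), {"Content-Encoding": "gzip"}
    # Small payload or no gzip support: send plain JSON
    return json_bytes, {}

The handler would then pass the returned body and extra headers into Response, exactly as in the large-data example above.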

Efficient Data Processing

Data Processing (Python)
from sufast import App
import asyncio

app = App()

# ✅ Process data in batches
@app.get("/process-batch")
async def process_batch():
    items = get_items_to_process()
    
    # Process in batches to avoid memory issues
    batch_size = 100
    results = []
    
    for i in range(0, len(items), batch_size):
        batch = items[i:i + batch_size]
        batch_results = await process_batch_async(batch)
        results.extend(batch_results)
    
    return {"processed": len(results)}

# ✅ Use generators for large datasets
@app.get("/stream-data")
async def stream_large_data():
    def generate_data():
        for i in range(10000):
            yield {"id": i, "value": f"item_{i}"}
    
    # Note: list() still materializes all 10,000 items before the response
    # is serialized; the generator only keeps per-item construction lazy.
    # For truly large payloads, pair the generator with a streaming response
    # or paginate, as sketched below.
    return {"data": list(generate_data())}