I noticed that too — outages seem to happen during peak usage or when there’s a big rollout happening in the background.
A few quick alternatives I fall back on:
Claude.ai – better at handling longer context, feels “calmer.”
Perplexity.ai – solid for quick factual Q&A.
Local models (like Llama 3 or Mistral via Ollama) – handy if you’re into running things offline.
Honestly, moments like this remind me why it’s worth having multiple tools in the workflow, not just one.