Ask HN: How are you preventing LLM hallucinations in production systems?