Insiders LLM Benchmarking May 2025

With Insiders LLM Benchmarking, our AI experts keep an eye on the LLM world, compare the most powerful models, and offer our customers reliable guidance through the fast-paced LLM jungle.
Insiders LLM Benchmarking is entering the next round: building on our first comprehensive performance comparison, we have refined our approach and introduced new dimensions. While the first benchmarking focused primarily on raw performance in information classification and extraction, we now also consider speed, data protection, and relative cost structure: decisive criteria for productive use in the IDP (Intelligent Document Processing) environment.

What does LLM benchmarking at Insiders mean?

The benchmarking is based on a standardized IDP data set with real documents from the insurance and finance world, including a new use case: claims invoices. This ensures that the results transfer directly to our customers' requirements. Our AI experts regularly analyze and evaluate the most powerful models on the rapidly changing global technology market and identify those LLMs that are best suited for the data-to-process area.

Insiders LLM Benchmarking is a continuous process that drives our best-of-breed approach. It allows Insiders to track the performance of the latest LLMs and, through the flexible LLM integration of the Insiders OvAItion Engine, to ensure that customers always use the best possible solution for their needs. This enables AI to be used sensibly and securely in the enterprise.

The new benchmarking also shows that the question of "the best LLM" is not a black-and-white issue. Performance alone is not enough: in highly regulated industries such as insurance and finance, reliability, data protection, and integration capabilities are equally decisive.
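To illustrate why "the best LLM" depends on more than raw performance, here is a minimal scoring sketch in Python. The dimensions, weights, and model figures are purely illustrative assumptions, not Insiders' actual methodology or benchmark results; they simply show how a weighted multi-criteria view can rank a less accurate but faster, cheaper, and more compliant model first.

```python
# Illustrative multi-criteria LLM ranking. Dimensions, weights, and
# numbers are hypothetical; they are NOT Insiders' real methodology.
from dataclasses import dataclass

@dataclass
class ModelResult:
    name: str
    accuracy: float         # classification/extraction quality, 0..1
    speed: float            # normalized throughput, 0..1 (higher = faster)
    data_protection: float  # compliance/deployment fit, 0..1
    cost: float             # normalized cost efficiency, 0..1 (higher = cheaper)

# Example weights: accuracy matters most, but not exclusively.
WEIGHTS = {"accuracy": 0.40, "speed": 0.20, "data_protection": 0.25, "cost": 0.15}

def weighted_score(m: ModelResult) -> float:
    """Combine the four criteria into a single weighted score."""
    return (WEIGHTS["accuracy"] * m.accuracy
            + WEIGHTS["speed"] * m.speed
            + WEIGHTS["data_protection"] * m.data_protection
            + WEIGHTS["cost"] * m.cost)

# Two made-up models: A leads on accuracy, B on everything else.
models = [
    ModelResult("model-a", accuracy=0.95, speed=0.60, data_protection=0.70, cost=0.50),
    ModelResult("model-b", accuracy=0.88, speed=0.85, data_protection=0.95, cost=0.80),
]

ranked = sorted(models, key=weighted_score, reverse=True)
for m in ranked:
    print(f"{m.name}: {weighted_score(m):.3f}")
# Here model-b ranks first despite lower accuracy: performance alone
# does not decide suitability for regulated IDP workloads.
```

In practice such weights would be set per use case and industry, which is exactly why a continuous, use-case-specific benchmarking process is needed rather than a single global ranking.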
For individual use cases, Insiders' AI experts offer sound advice for your company. We would also be happy to include your data in an upcoming industry-specific benchmarking round. Simply contact our Insiders AI experts to find out more.