Vext v1.5: New Models and Improvements

Exciting news – we’ve rolled out v1.5! It’s jam-packed with the latest LLMs from the biggest names in AI. Now, you've got the power of high-end AI tech right at your fingertips, all without the hassle of managing complex AI systems. Let’s check out what’s new and how it can help you do more.

New LLM Models

Vext is all about giving you a buffet of top-tier managed LLM models – think of it as AI on tap. With our latest update, we've expanded your menu with an even broader selection of models. This means you get to craft an even smarter LLM pipeline, simple or complex, without any of the setup stress.

Simply create a new project (or navigate to an existing one), add a "Generate a Response" action, and you can find the new models in the "Model" section dropdown menu.
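Once a pipeline is configured, it is typically queried over HTTP. The sketch below is purely illustrative: the endpoint URL shape, the `Apikey` header format, and the `payload` field name are assumptions for demonstration, not documented Vext API details — check the official API reference for the real values.

```python
# Illustrative sketch only: URL shape, auth header, and payload field
# are assumptions, not Vext's documented API.
import json
from urllib import request

API_KEY = "YOUR_API_KEY"          # placeholder credential
ENDPOINT_ID = "your-endpoint-id"  # placeholder project endpoint ID

def build_query(message: str) -> request.Request:
    """Build (but don't send) an HTTP request carrying `message` to a pipeline."""
    url = f"https://payload.vextapp.com/hook/{ENDPOINT_ID}/catch/demo"  # assumed
    body = json.dumps({"payload": message}).encode("utf-8")
    return request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Apikey": f"Api-Key {API_KEY}",  # assumed header format
        },
        method="POST",
    )

req = build_query("Summarize this quarter's support tickets.")
print(req.get_method())  # POST
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) would then return whatever response the "Generate a Response" action produces with your chosen model.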

Claude 3 Haiku

First up, we have Claude 3 Haiku from Anthropic, a model designed for speed without sacrificing quality. Haiku is your go-to when you need near-instant responsiveness for simple queries. It's perfect for crafting AI experiences that feel as natural as chatting with a friend.

Claude 3 Sonnet

Next, we introduce Claude 3 Sonnet, which finds the sweet spot between performance and efficiency. Tailored for businesses that demand a lot from their AI, Sonnet delivers consistent, reliable performance, especially when handling large volumes of interactions.

Cohere Command

This model is a powerhouse, designed to understand and execute complex instructions with a high degree of accuracy, making it ideal for more intricate tasks.

Cohere Command Light

For those seeking efficiency and speed, Command Light offers a nimble alternative that still delivers impressive results without the heavy resource requirements.

Llama 2 (13B and 70B)

Last but certainly not least, Meta's Llama 2 models are joining the fray in their 13B and 70B versions. Whether you're a startup or a large enterprise, these models are adaptable to a variety of tasks and workloads. They're perfect for those who need a robust model that can scale with their ambitions.

Other Improvements

The "Execute a Function" action interface now includes sample inputs and outputs, so you can see at a glance how a function behaves and what results to expect, streamlining your development process.