Predictions for AI in 2024
Entering 2024 feels like living in a future we used to dream about as kids – you know, with all those sci-fi fantasies about moon bases and robots doing our chores. While we might not be there yet, the leaps we've seen in AI, especially since the advent of GPT technologies, are nothing short of astounding. Here at Vext, we're buzzing with excitement about what AI has in store for us this year.
So, let's dive into our predictions for 2024, where the future might not have flying cars, but it's definitely thrilling.
Open Source vs. Proprietary: The Friendly Rivalry
At Vext, we're genuinely excited for the future of open source LLMs, but let's be real, the big players with their proprietary models are still leading the charge.
However, here are some of the models that we are really bullish on:
Launched in July 2023, Llama 2 is a robust generative text model available in sizes from 7 to 70 billion parameters and enhanced through Reinforcement Learning from Human Feedback (RLHF). Adaptable to a wide range of natural language generation tasks, from chatbots to programming, it marks Meta's commitment to open and customizable AI, as seen in its specialized versions like Llama Chat and Code Llama.
Another advanced open source model has surpassed both Llama 2 and GPT-3.5 in various natural language processing tasks, according to Hugging Face.
Mistral AI launched Mistral 7B in September 2023, a compact 7-billion-parameter open source LLM designed for speed and efficiency.
Mistral 7B uses techniques like grouped-query attention and sliding window attention for faster, more cost-effective processing of long texts. It's more agile than bulkier LLMs, posting impressive scores on various benchmarks, outshining counterparts like Llama 2 7B and even rivaling Llama 1 34B in areas like coding and reasoning.
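The sliding window idea is simple enough to sketch: each token attends only to the previous W positions rather than the entire history, so attention cost grows roughly as O(n·W) instead of O(n²). Here's a toy illustration of the masking pattern (this is our own sketch, not Mistral's actual implementation; the `sliding_window_mask` helper and the window size of 3 are ours):

```python
import numpy as np

def sliding_window_mask(seq_len: int, window: int) -> np.ndarray:
    """Boolean causal mask where each token may only attend to
    itself and the `window - 1` tokens before it."""
    i = np.arange(seq_len)[:, None]  # query positions (rows)
    j = np.arange(seq_len)[None, :]  # key positions (columns)
    # Allowed iff causal (j <= i) AND within the window (i - j < window).
    return (j <= i) & (i - j < window)

mask = sliding_window_mask(seq_len=6, window=3)
print(mask.astype(int))
# [[1 0 0 0 0 0]
#  [1 1 0 0 0 0]
#  [1 1 1 0 0 0]
#  [0 1 1 1 0 0]
#  [0 0 1 1 1 0]
#  [0 0 0 1 1 1]]
```

Notice the banded structure: token 5 can no longer see tokens 0-2, which is what keeps memory and compute bounded as sequences grow. (Information from earlier tokens can still propagate forward indirectly through the stacked layers.)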
Volunteers from more than 70 countries, alongside the experts at Hugging Face, embarked on a mission to create BLOOM. This LLM is trained on a vast array of text data with considerable computing power; it's not only impressive with its 176 billion parameters, but it's also versatile, handling 46 languages and 13 programming languages.
Vicuna-13B was born from the LLaMA 13B model and fine-tuned on real conversations shared by users via ShareGPT. Compared against prominent services like ChatGPT and Google Bard, Vicuna-13B holds its own, achieving over 90% of their quality, and it outperforms its cousins LLaMA and Alpaca in most scenarios.
Our Take On This...
We're anticipating significant advances in open source language models, but we believe they might still fall short of the more sophisticated proprietary models, particularly those from OpenAI, which seems to hold an edge in both capability and efficiency. However, a crucial challenge for OpenAI, as we see it, will be convincingly assuring businesses about its data handling policies, especially concerning the use of conversation data for AI training. That assurance is key to maintaining trust and staying ahead in the field.
The AI Startup Shakeout
This year, expect to see the AI startup landscape doing some serious gymnastics. Big players like OpenAI keep upping their game, and it's tough for the little guys to keep pace.
Most of the thin wrappers will be the first to go: the "Custom GPT" or "My GPT" startups are likely heading toward a rough patch. These companies, which mainly repackage existing GPT technology with a slight twist, may soon find themselves in serious trouble. It's a clear signal that in the bustling AI market, innovation and distinct value are not just nice-to-haves but essential for survival and growth.
A classic example is OpenAI going full Thanos by announcing "GPTs" at its DevDay, which immediately wiped out most of the "XYZ GPT" companies on the market. For Vext, however, it was a huge boost; you can read more over here.
Multi-Modality... Cool, But Let's Chill
Multi-modal technology isn't a recent innovation; it's been around for quite some time. However, it only really started gaining traction on platforms like LinkedIn after ChatGPT expanded its capabilities to include vision support. I vividly recall the moment I came across this on LinkedIn; it was genuinely impressive.
It's great for consumers (or even prosumers). The question now is how to apply it to other use cases and generate real value. For the moment, it might be more buzz than bite: like that cool new gadget everyone talks about but doesn't quite know how to use yet.
Businesses Jumping on the AI Train
At Vext, we have a front-row seat to this whole movement. We're talking to many businesses trying to push for innovation with LLM technology, and their main hurdle right now is finding an easy, scalable way to incorporate AI/LLMs into their products.
Complicating matters, there are so many solutions (open source and proprietary) on the market that it's very difficult for businesses to choose one amid all the mixed information out there. The cost of trial and error is extremely high, and unlike hobbyists, businesses usually prefer a safer, proven solution.
This is also one of the reasons why we created the Vext platform: to provide businesses a solution to lego-block their LLM pipeline that's proven, simple to use, and reliable.
Cybersecurity For AI (LLM)
We're anticipating a surge in the AI cybersecurity sector, especially related to AI and LLMs, as more businesses start using these technologies. Right now, cybersecurity in this area isn't a hot topic yet, typically because security considerations often emerge later, once the technology matures and sees wider adoption by large corporations.
While we're not experts in cybersecurity, it's clear that any technology involving data, traffic, and storage is a potential target for security breaches. Therefore, it seems inevitable that focusing on cybersecurity in this context will become increasingly important.
Where Vext Stands
We are super excited about where 2024 will take us. We just launched our beta, and the platform is now truly capable of enabling businesses to lego-block their LLM pipeline, much like you can use Zapier to lego-block your automations.
We have more big plans in 2024 and we can't wait to see what you build with Vext! Try Vext for free and start building your AI applications.