
Open-source or closed-source AI? Data quality and adaptability matter more 

Clint Boulton is Senior Advisor, Portfolio Marketing, APEX at Dell Technologies. 

As has happened with technology revolutions before, there is much debate over whether organizations should deploy commercial large language models (LLMs) or turn to the open-source community as they build generative AI applications. 

The answer, of course, is that it depends on what you’re trying to accomplish. Picking the right model and infusing it with high-quality corporate data are more critical decisions. More on that in a bit. 

First, it’s important to understand the different model structures — closed source, open source and its many variations — at least at a high level, because such details get murky fast.

Closed-source, open-source and partially-open models 

Organizations building gen AI applications typically choose between closed LLMs and open-source LLMs, with the latter option including the choice to build one’s own model using open-source components. 

Closed-source LLMs are what they sound like: the source code is treated as proprietary information and is therefore closed to the public, often for security, intellectual property and other reasons.

Open source is also what you’d imagine. That is, software whose source code is available freely for people to access, use and modify. However, there are variations of open-source models in which only certain parts of the software are free to use while the remainder requires a commercial license. 

In such “open models,” as they are often referred to, the architecture, weights, parameters and other algorithmic specifics may be publicly available while source code may be kept proprietary. Regardless, collaboration fueled by open-source and open models can help expose bias and cultivate knowledge sharing. 

Data is the new…water? 

Licensing and terms of service matter in that they dictate how you use a particular model — and even what you can use it for. Even so, getting caught up in the closed vs. open zealotry is shortsighted at a time when 70% of CEOs surveyed expect gen AI to significantly alter the way their companies create, deliver and capture value over the next three years, according to PwC.

Rather, you should focus on the quality of your data. After all, data will be your competitive differentiator — not the model. 

In that vein, isn’t it time to begin thinking and talking about data differently? Metaphorically speaking, data has traditionally been described as the new oil. But really, isn’t it more like water? 

Just as water fuels and sustains us, data fuels gen AI. And just as humans require water to survive, your gen AI solutions require data to process and create content. Of course, as with water, accessing enough clean data is no easy task when it comes to nourishing gen AI models. 

Part of the challenge is funneling enough quality data to gen AI systems that are thirsty for it. And just as the world’s water must be cleaned before consumption, so too must data be cleansed before it is consumed. 

After defining your business use cases, you must follow a rigorous process in which you will collect, clean, preprocess, label and organize your data properly. Then you can proceed to model training, evaluating, monitoring and refining or fine-tuning, according to this playbook. 
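The pipeline above can be sketched in code. This is an illustrative toy, not a reference to any specific Dell or industry toolkit: the function names (`clean`, `preprocess`, `label_record`, `prepare`) and the one-line labeling rule are hypothetical stand-ins for the collect/clean/preprocess/label/organize steps the playbook describes.

```python
# Hypothetical sketch of the data-preparation steps: collect raw text,
# clean it, preprocess it, label it and organize it into records.
from dataclasses import dataclass


@dataclass
class Record:
    text: str
    label: str = ""


def clean(text: str) -> str:
    """Cleaning stand-in: strip edges and collapse runs of whitespace."""
    return " ".join(text.split())


def preprocess(text: str) -> str:
    """Preprocessing stand-in: normalize case."""
    return text.lower()


def label_record(text: str) -> str:
    """Toy labeling rule: tag records that mention a product."""
    return "product" if "product" in text else "general"


def prepare(raw_texts: list[str]) -> list[Record]:
    """Run each raw string through the pipeline, dropping empty rows."""
    records = []
    for raw in raw_texts:
        text = preprocess(clean(raw))
        if not text:  # cleaning removed everything; discard the row
            continue
        records.append(Record(text=text, label=label_record(text)))
    return records


corpus = prepare(["  Our Product  FAQ ", "", "General notes"])
```

In practice each stage would be far richer (deduplication, PII scrubbing, human-reviewed labels), but the shape — small composable steps feeding organized, labeled records into training — stays the same.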

Right-sizing your model is key 

Experimenting with different model types and sizes to suit your use cases is a critical part of the trial-and-error process. Right-sizing, or deploying the most appropriate model size for your business, is even more crucial.

Do you require a broad, boil-the-ocean approach that spans as much data as possible to build a digital assistant with encyclopedic knowledge? A large model trained on hundreds of billions of data points may work well.

Or do you require a domain-specific, fit-for-purpose model to help surface product information for customers? A small language model (SLM) filtering corporate data through retrieval-augmented generation (RAG) to refine results may suffice. 
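The RAG pattern mentioned here can be illustrated with a minimal sketch. This is an assumption-laden toy, not Dell's implementation: retrieval is done with simple word overlap, whereas production systems typically use vector embeddings and an actual SLM behind the prompt.

```python
# Toy retrieval-augmented generation (RAG): pick the corporate documents
# most relevant to a query, then prepend them as context to the prompt
# that would be sent to a small language model.


def _words(s: str) -> set[str]:
    """Lowercased word set, with trailing punctuation stripped."""
    return {w.strip(".,?!").lower() for w in s.split()}


def score(query: str, doc: str) -> int:
    """Relevance stand-in: count words shared by query and document."""
    return len(_words(query) & _words(doc))


def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the top-k documents by overlap score."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]


def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble the augmented prompt: retrieved context, then the question."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"


docs = [
    "Model X ships with a 2-year warranty",
    "Office hours are 9 to 5",
    "Model X supports USB-C charging",
]
prompt = build_prompt("Does Model X have a warranty?", docs)
```

Because the model only ever sees the retrieved slice of corporate data, a small model can answer domain questions without being retrained on the whole corpus — which is the appeal of the fit-for-purpose SLM approach.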

For some organizations, portability to power gen AI mobile applications for employees or customers may be a primary requirement. In such scenarios, tiny language models tailored for smartphones may be faster, cheaper and more energy efficient.

Choosing where to run these models is also critical. If you’re using your corporate data to build applications, an on-premises approach makes sense: bringing AI to your data is sound business practice.

The bottom line 

Of course, the gen AI model landscape is ever evolving. Future models will look and function differently than those of today. Regardless of your choices, with the right partner you can turn your data ocean into a wellspring of insights. 

Dell Technologies offers a broad portfolio of AI-optimized technologies, from servers and storage to client devices and more to provide your data the proper hydration and nourishment it requires. 

Also, Dell’s validated reference designs provide foundational blueprints for building your gen AI services, while its professional services team is ready to provide the white-glove treatment you need to flourish.

 

Content Courtesy – Dell AI Technologies