eCommerceNews Australia - Technology news for digital commerce decision-makers

The unseen challenge of AI storage


AI constantly attracts media attention, but much of that focus has been on advances in compute and model capability. An equally critical aspect that often goes unnoticed is the challenge of data storage. As AI adoption grows across industries, the demand for vast amounts of data storage is becoming a significant concern, particularly when it comes to cloud storage.

Avec's Data and AI Practice Lead, Jack Jorgensen, highlights, "Everyone's really focused on the compute side at the moment, and it's also very clear, more data means better results." The implementation of large-scale AI projects necessitates the storage of enormous volumes of data, which in turn requires fast and reliable access. This is where the challenge lies, as cloud storage, while convenient, comes with substantial costs.

One of the key concerns is data duplication. Jack explains, "We're already seeing significant duplication of data. For instance, in customer service AI, customer queries are logged in the CRM system, and the same data is also ingested into storage for the model to utilise as examples." While this duplication is manageable for now, it's expected to increase as more organisations transition from proof of concept (POC) phases to full-scale AI implementations.
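One common way to limit this kind of duplication is content-based deduplication: hash each record before ingesting it, and skip anything already stored. The sketch below is illustrative only, not part of Avec's described approach; the `ingest` function and the in-memory dictionary standing in for a storage layer are assumptions for the example.

```python
import hashlib

def content_key(record: str) -> str:
    """Hash a record's content so identical queries map to the same key."""
    return hashlib.sha256(record.encode("utf-8")).hexdigest()

def ingest(records, store):
    """Store only records not already present; return how many duplicates were skipped."""
    skipped = 0
    for rec in records:
        key = content_key(rec)
        if key in store:
            skipped += 1
        else:
            store[key] = rec
    return skipped

# A query logged in the CRM and re-ingested for model examples is stored once.
queries = ["Where is my order?", "Reset my password", "Where is my order?"]
store = {}
ingest(queries, store)  # skips the repeated query
```

In practice the "store" would be an object store or database keyed by the hash, but the principle is the same: pay for each unique record once.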

The implications of this are multi-layered. Firstly, there are the direct costs associated with storing more data. Jack notes, "The primary concern is cost: more storage, more data, and more backups all come with additional expenses. On top of that, there are management costs associated with keeping the data clean and usable, as well as compliance costs related to data security and regulations like GDPR."

Moreover, the network bandwidth required to transfer and maintain this data adds another layer of complexity and expense. Jack offers an analogy: "Imagine ordering packages online. If everything is shipped individually instead of being neatly packed into boxes, you're constantly going to the door to retrieve each package, making it difficult to get anything done."

How do we fix it? 

To tackle these challenges, Jack emphasises the need for a greater focus on developing effective solutions. One potential approach is implementing data compression and decompression pipelines. "Data is heavily compressed as it moves into storage, and then it's decompressed when fetched; however, this method also introduces additional compute costs."
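The trade-off Jack describes can be seen in a few lines: compression spends CPU on the write and read paths to save stored bytes. This minimal sketch uses Python's standard `gzip` module; the function names and the sample record are assumptions for illustration, not a description of any specific pipeline.

```python
import gzip

def to_storage(text: str) -> bytes:
    """Compress on the way into storage (costs compute, saves bytes)."""
    return gzip.compress(text.encode("utf-8"))

def from_storage(blob: bytes) -> str:
    """Decompress on fetch (costs compute again)."""
    return gzip.decompress(blob).decode("utf-8")

# Repetitive data, such as logged customer queries, compresses well.
record = "customer query: where is my order? " * 100
stored = to_storage(record)
assert from_storage(stored) == record
ratio = len(stored) / len(record.encode("utf-8"))  # well under 1.0 here
```

The saving depends heavily on how repetitive the data is, which is why compression pays off more for logs and text than for already-compressed media.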

Another possible solution is for organisations to adopt a hybrid storage approach, combining both cloud and on-premises storage. "Organisations may begin to move some storage back on-premises or use glacial storage in the cloud for slower-access data," Jack suggests. This strategy would allow for the immediate availability of critical data while enabling a more cost-effective way to store less frequently accessed information.
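A tiering policy like the one Jack suggests often comes down to a simple routing rule based on how recently data was accessed. The sketch below is a hypothetical illustration: the tier names, the 90-day threshold, and the `choose_tier` function are all assumptions, and real cloud archive ("glacial") storage classes vary by provider.

```python
from datetime import datetime, timedelta

# Hypothetical tier labels for the example; actual storage classes differ by provider.
HOT, COLD = "cloud-hot", "cloud-archive"

def choose_tier(last_accessed: datetime, now: datetime,
                cold_after_days: int = 90) -> str:
    """Route data untouched for `cold_after_days` to cheaper, slower storage."""
    if now - last_accessed > timedelta(days=cold_after_days):
        return COLD
    return HOT

now = datetime(2025, 1, 1)
choose_tier(datetime(2024, 12, 20), now)  # recently used -> "cloud-hot"
choose_tier(datetime(2024, 6, 1), now)    # rarely used -> "cloud-archive"
```

The same rule extends naturally to a third on-premises tier; the hard part in practice is tracking access patterns and the retrieval latency and fees of archive classes, not the routing logic itself.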

So, where does Avec come in?

As AI continues to permeate various sectors, the issue of data storage will become increasingly prominent. The industry must prioritise the design of efficient storage solutions to manage the growing volumes of data. As Jack puts it, "There's a lot more design and system architecture which needs to be done around how that's going to be managed in the future." By addressing these storage challenges head-on, we can ensure that AI remains a powerful and sustainable tool for innovation. Reach out to Jack or the Avec team to ensure your AI initiatives are supported by sustainable and innovative storage solutions at https://www.avecglobal.com/contact/
