Latest AWS data management features target cost control
As the volume and complexity of enterprise data estates increase, and the size of data workloads grows due to AI development, the tech giant aims to help users reduce spending.
As data volume and complexity continue to increase, and AI development demands massive data workloads, cloud computing costs are a growing concern for many enterprises. In fact, a 2024 Gartner survey of more than 300 CIOs found that managing costs limits their organizations' ability to develop and deploy AI tools.
AWS on Tuesday unveiled numerous new data management capabilities at its annual re:Invent user conference in Las Vegas, many of which aim to help customers better control costs related to their data workloads.
For example, a new database pricing model reduces database costs by up to 35% when customers commit to a certain amount of usage over one year. In addition, scale and performance capabilities for certain AWS databases aim to cut costs in half, while Amazon S3 Vectors -- a new feature that lets customers store vectorized data and access it through semantic search -- was built to substantially lower the cost of storing and searching vectors.
Given that many enterprises are concerned about the rising cost of data management, AWS is addressing a need with its latest additions, according to Stephen Catanzano, an analyst at Omdia, a division of Informa TechTarget.
"Organizations are under pressure to optimize budgets while scaling their operations, especially in uncertain economic climates," he said. "AWS's focus on lowering costs is significant because it directly addresses one of the biggest pain points for enterprises, managing costs without sacrificing performance or scalability."
Beyond cost control, new AWS data management capabilities include increased storage capacity and performance in Amazon Simple Storage Service (S3) and database integrations designed to accelerate application development.
In 2010, two zettabytes of data were generated worldwide, according to Statista. By 2020, that had grown to 64.2 zettabytes. This year, the total is expected to reach 181 zettabytes. Beyond sheer volume, data is also more complex than it used to be, with unstructured data, such as text in PDFs and emails as well as images, now making up the majority.
OpenAI's November 2022 launch of ChatGPT marked a significant improvement in generative AI (GenAI) technology. Given GenAI's potential to make employees better informed and more efficient, many enterprises have steadily increased their investments in developing AI tools such as chatbots and agents. Those tools, meanwhile, depend on large amounts of high-quality data to be accurate, so their development entails massive data workloads.
Together, exploding data volume, rising data complexity and increasing AI development have all led to soaring data management costs.
AWS's effort to address the rising cost of data management is therefore noteworthy, according to William McKnight, president of McKnight Consulting.
"The significance of this focus … are closely tied to the growing and massive scale of modern data, the complexity of managing resources and the demands of AI," he said. "Computationally intense applications such as semantic search and context understanding require managing vast numbers of vectors, which can be complex and costly."
AWS competitors, including Google Cloud, Microsoft and Oracle, have also made efforts to help customers control spending, but not to the same extent as AWS demonstrates with its latest features, McKnight continued.
"Other major clouds are making serious efforts to address cost control, but no one has this level of focus on it," he said.
Catanzano similarly noted that while AWS's competitors have also addressed cost control, AWS's new features demonstrate a heightened focus on helping customers reduce spending.
"AWS appears to be taking a more aggressive and complete approach," he said. "If AWS continues to innovate in this area, it could strengthen its position as a cost-effective choice, especially for enterprises managing large-scale data and AI workloads."
To help customers control costs related to data management and AI development, AWS introduced the following:
Vector storage and search capabilities in Amazon S3 Vectors that enable developers to discover relevant data for AI pipelines at lower cost by storing up to two billion vectors and delivering fast queries, as illustrated in the sketch that follows this list.
Automatic cost optimization in S3 through Intelligent-Tiering.
Automatic table replication in S3 to eliminate manual updates and complex synchronization projects.
Automatic scaling in Amazon EMR Serverless, a cloud-based big data platform, to reduce costs by eliminating the need to manually configure disk types and sizes or manage workload storage capacity.
Database Savings Plans, flexible pricing models designed to lower database spending when users commit to a certain level of use.
Increased storage and performance for Amazon Relational Database Service (RDS) for SQL Server and RDS for Oracle databases to help customers lower costs.
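To make the S3 Vectors item above concrete, the following Python sketch shows what storing and querying vectors might look like through the AWS SDK. It is a minimal illustration, not AWS reference code: it assumes boto3 exposes an s3vectors client with create_index, put_vectors and query_vectors operations shaped roughly as shown, and the bucket name, index name and embedding values are hypothetical placeholders.

```python
# Minimal sketch of writing and querying vectors with Amazon S3 Vectors via boto3.
# The "s3vectors" client and operation shapes are assumptions based on the SDK's
# general conventions; bucket, index and embeddings are hypothetical placeholders.
import boto3

s3vectors = boto3.client("s3vectors", region_name="us-east-1")

BUCKET = "demo-vector-bucket"   # hypothetical vector bucket, assumed to exist
INDEX = "product-docs"          # hypothetical index name

# Create an index for 4-dimensional float32 vectors compared by cosine distance.
# (Real embeddings typically have hundreds or thousands of dimensions.)
s3vectors.create_index(
    vectorBucketName=BUCKET,
    indexName=INDEX,
    dataType="float32",
    dimension=4,
    distanceMetric="cosine",
)

# Store two vectors along with metadata that can be returned with query results.
s3vectors.put_vectors(
    vectorBucketName=BUCKET,
    indexName=INDEX,
    vectors=[
        {"key": "doc-1", "data": {"float32": [0.1, 0.2, 0.3, 0.4]},
         "metadata": {"title": "Pricing overview"}},
        {"key": "doc-2", "data": {"float32": [0.4, 0.3, 0.2, 0.1]},
         "metadata": {"title": "Migration guide"}},
    ],
)

# Semantic search: return the stored vector closest to a query embedding.
response = s3vectors.query_vectors(
    vectorBucketName=BUCKET,
    indexName=INDEX,
    queryVector={"float32": [0.1, 0.2, 0.25, 0.45]},
    topK=1,
    returnMetadata=True,
    returnDistance=True,
)
for match in response["vectors"]:
    print(match["key"], match.get("distance"), match.get("metadata"))
```

Because the vectors sit in S3-backed storage rather than a dedicated vector database cluster, this is the kind of workflow AWS says should substantially lower the cost of storing and searching vectors.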
While not specifically addressing AWS's focus on cost control, Ganapathy "G2" Krishnamoorthy, AWS's vice president of databases, noted that customer feedback plays a primary role in the tech giant's product development plans for data management and AI development.
"All of this is inspired by what we hear from you, our customers," he said during a session at re:Invent.
Collectively, the new features are more pragmatic than innovative, according to Catanzano. However, because they efficiently address a need, they are significant.
"These announcements are incremental but highly practical," he said. "They focus on improving scalability, performance and cost-efficiency rather than introducing groundbreaking innovations. These updates are meaningful because they simplify operations, reduce costs, and enhance the usability of AWS services, but they don't necessarily redefine the industry."
Particularly noteworthy are the Database Savings Plans and S3 Vectors, Catanzano continued.
"The Database Savings Plans are significant because they offer a straightforward way to reduce database costs," he said. "S3 Vectors, on the other hand, are a game-changer for AI and machine learning workloads."
Beyond new features addressing cost control, AWS increased the maximum object size in S3 from 5 terabytes to 50 terabytes, improved the performance of S3 Batch Operations to dramatically speed up large workloads and unveiled native integrations, now in preview, between the Amazon Aurora and Amazon DynamoDB databases and the Vercel development platform.
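For a sense of how such large objects are handled in practice, the sketch below uses boto3's managed transfer, which splits an upload into parts and sends them in parallel. It is a generic illustration with hypothetical bucket and file names, not code specific to the new 50-terabyte limit, and the part size and concurrency values are arbitrary examples.

```python
# Illustrative upload of a very large file to S3 using boto3's managed transfer,
# which breaks the object into parts and uploads them concurrently.
# Bucket and file names are hypothetical placeholders.
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# Larger parts and more threads help very large objects, such as AI training
# datasets or high-resolution video, move efficiently.
config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,   # switch to multipart above 64 MB
    multipart_chunksize=512 * 1024 * 1024,  # 512 MB parts
    max_concurrency=16,                     # parallel part uploads
)

s3.upload_file(
    Filename="training-data.tar",        # hypothetical local file
    Bucket="demo-training-datasets",     # hypothetical bucket
    Key="datasets/training-data.tar",
    Config=config,
)
```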
Like Catanzano, McKnight highlighted S3 Vectors. However, he also noted the significance of markedly increased object storage.
"Increasing the maximum S3 object size is a crucial necessity given the surge in data volumes," he said. "This allows customers to store full AI training datasets, high-resolution videos and seismic data as single objects in their original form, simplifying workflows."
Meanwhile, although Catanzano termed AWS's updates incremental, McKnight called them innovative.
"AWS is making major technological advancements by focusing on extreme scaling, integrating AI capabilities directly into core services and automating complex operations," he said.
Looking ahead
Although AWS's new features focusing on cost control improve the way the tech giant responds to one need, there are other customer needs that also have to be addressed, according to Catanzano.
In particular, he noted that AWS needs to improve the interoperability of its numerous data management, storage, analytics and AI capabilities. Some enterprises deploy all their data and AI operations in the cloud and use the capabilities of a single provider, such as AWS, but most have hybrid deployments that include on-premises systems and integrate tools from various vendors.
"I'd like to see AWS focus on further simplifying multi-cloud and hybrid cloud integrations," Catanzano said. "While AWS has made strides in cost optimization and scalability, enabling seamless interoperability with other cloud providers and on-premises systems would address a growing need for flexibility and help enterprises avoid vendor lock-in."
McKnight similarly suggested that AWS focus on interoperability. In particular, he advised the tech giant to integrate its new data management features with external development platforms.
"The next logical step for AWS to continue serving the needs of its users would be to further enhance the seamless integration and utilization of these new, specialized capabilities across a wider range of development environments and services," McKnight said.
Eric Avidon is a senior news writer for Informa TechTarget and a journalist with more than 25 years of experience. He covers analytics and data management.