Data Management, Analytics & AI

  • Data Analytics & Artificial Intelligence

    Companies must quickly, easily, reliably, and securely process, merge, transform, and transact with data to derive value and insights at the speed of business.

    Organizations must improve data management efficiency to effectively leverage their data.

    According to our research, two-thirds of organizations are turning to automation to gain efficiency, and business intelligence and analytics is the second-most researched automation-related topic among the IT buyers whose purchase research we observe. Our Data Analytics & AI analysts and demand-side research help tech vendors succeed in this rapidly evolving market by covering essential topics that include:

    • Analytics
    • Artificial intelligence hardware
    • BI & data visualization
    • Data platforms
    • Data science & machine learning platforms
    • Databases, warehouses, lakes & platforms
    • DataOps & MLOps
    • Generative AI
    • Speech, NLP & computer vision

    Research Brief

    Data Complexity Inhibits Large Midmarket Organizations Along Their AI Journey

    Although teams across industries are eager to unlock the potential of AI within their businesses and operations, they regularly run into roadblocks in integration complexity, data source utilization, and data quality and trust. Recent research by Enterprise Strategy Group, now part of Omdia, revealed that some of these issues might be correlated to organizational size […]


    Analysts Covering Data Analytics & Artificial Intelligence

    Mark Beccue

    Principal Analyst, Artificial Intelligence

    Areas of Expertise

    • AI Development Platforms & Tools
    • AI Lifecycle Management
    • AI Risk Management
    • Foundation Models/LLMs
    • Legacy and Generative AI
    • Natural Language AI


    Mike Leone

    Practice Director, Data Management, Analytics & AI

    Areas of Expertise

    • Analytics
    • Artificial Intelligence (AI)
    • BI & Data Visualization
    • Data Platforms
    • Data Science/Machine Learning
    • Data/AI Lifecycle


  • In today’s data-driven world, the power of artificial intelligence (AI) and advanced analytics is undeniable. They have the potential to revolutionize industries, drive innovation, and unlock valuable insights. However, behind every successful AI and analytics initiative lies a crucial foundation: excellent data management capabilities.

    In a recent research survey from TechTarget’s Enterprise Strategy Group, Data Platforms: The Path to Achieving Data-driven Empowerment, 31% of organizations ranked data management, including databases, as the most important area of their data platform. In the same survey, 50% of participants identified faster business decision-making as the leading driver and goal for their modern data platform strategies, and 19% focused on creating competitive advantage. Database performance is critical to reaching the data-driven outcomes organizations desire.

    Oracle has introduced its latest addition to the Exadata database machine family, the Exadata X10M, which is powered by AMD’s EPYC server processors. This release marks a significant milestone for Oracle and its database performance capabilities. The Exadata platform is known for its co-engineered hardware and software, specifically designed to support enterprise data management. It offers a complete stack system with optimized processors, memory, and storage, as well as system software for efficient data storage, indexing, and movement. With the Exadata X10M, Oracle continues its commitment to delivering high-performance database servers by leveraging AMD’s EPYC CPUs, incorporating up to 96 multithreaded cores, DDR5 memory, and RDMA over Converged Ethernet (RoCE) for low-latency, high-bandwidth connectivity.

    Oracle’s optimization efforts extend beyond hardware to its software stack, ensuring linear scalability and maximum performance across multiple cores. The Exadata X10M stands out with its high performance, surpassing its predecessor, the X9M, in terms of OLTP performance, analytics, and database consolidation. Oracle’s decades-long expertise in both enterprise database software and hardware solutions enables the company to provide a tailored data management platform that combines performance, value, and cost-effectiveness.

    Customers can deploy Exadata on-premises, in the cloud, or as a hybrid solution, benefiting from Oracle’s collaboration with public cloud providers and high-speed interconnectivity options. Overall, Oracle’s focus on innovation and customer-centric solutions positions Exadata X10M as a compelling choice for organizations seeking high performance in their data management and analytics initiatives.

  • Aviv Kaufmann

  • Bill Lundell

  • Enterprise Strategy Group

  • Tony Palmer

  • Stephen Catanzano

    About

    Senior Analyst Stephen Catanzano brings more than 20 years of industry experience and deep expertise in data management to his role at Enterprise Strategy Group, where he focuses on how organizations can build a strong data foundation for generative AI and operationalize data at scale to maximize organizational value.

    Alongside his operational roles in larger enterprises, Stephen has successfully created and grown multiple SaaS startups, led M&A transactions, and raised both venture and private equity capital.

    Stephen holds a BS in Finance and Economics from Bentley University in Waltham, Massachusetts.

    Areas of Expertise

    • AI Tools for Data
    • Data Foundations for AI
    • Data Lakes & Warehouses
    • Data Management
    • Data Observability
    • Data Platforms
    • Data Quality and Governance
    • Data Streaming
    • Data Transformation
    • Data Visualization
    • Databases
    • DataOps


    “Maximizing the value of data is about building data-driven organizations by turning data at scale into intelligence for faster corporate and consumer decision-making, quicker innovation, competitive advantage, and operational efficiency.”

    Stephen Catanzano
    Senior Analyst, Data Management

    Survey Results

    Data Governance in the Age of AI

    This Complete Survey Results presentation focuses on data user trends, data classification and lineage, drivers behind data governance strategies, challenges with data governance adoption, the impact of AI on data governance, the importance of data governance elements, and investment plans. Already an Enterprise Strategy Group client? Log in to read the full report. If you are […]



  • Mike Leone

    About

    Principal Analyst Mike Leone leads ESG’s analyst team focused on data management, analytics, and artificial intelligence.

    Mike draws upon his enthusiasm for bleeding-edge technology and his engineering and marketing background to provide a unique perspective to enterprise technology vendors. At Enterprise Strategy Group, Mike focuses on all things data, analytics, and AI. His passion shines through in helping organizations improve everything from go-to-market strategies and messaging to product development and content creation.

    Mike has a strong technical background, with early roles in software and hardware engineering focused on future product feasibility, modeling, and performance. He gradually moved into roles that interfaced with marketing by helping translate deep technical concepts into understandable business benefits.

    Mike holds a BS in Computer Science from Stonehill College in Easton, Massachusetts.

    Areas of Expertise

    • Analytics
    • Artificial Intelligence (AI)
    • BI & Data Visualization
    • Data Platforms
    • Data Science/Machine Learning
    • Data/AI Lifecycle
    • Databases, Warehouses, Lakes & Platforms
    • DataOps/MLOps
    • Generative AI


    “While data has the power to transform the business in radical ways, achieving data excellence requires an equal balance of tightly integrated technology, well-defined processes, and the empowerment of all stakeholders to confidently bring the right data to every decision.”

    Mike Leone
    Practice Director, Data Management, Analytics & AI

    Survey Results

    Evaluating the Pillars of Responsible AI

    This Complete Survey Results presentation focuses on responsible AI maturity, prioritization, best practices, terms, outcomes, challenges, responses to issues, bias mitigation, measurements, acceptable fairness levels, data and analytics impact, and stakeholders. Already an Enterprise Strategy Group client? Log in to read the full report. If you are not yet a Subscription Client but would like to […]



  • Aaron Tan

    About

    As Enterprise Strategy Group’s regional director of analyst services for APAC, Aaron Tan helps clients identify and quantify key market trends on a wide range of technology topics, including cloud infrastructure, DevOps, business applications, and cybersecurity in the Asia-Pacific region.

    Aaron also serves as Editor in Chief, APAC at ESG’s parent company, TechTarget.

    Aaron is a seasoned media and information professional who has been involved in technology implementations in the public sector and has nearly two decades of experience covering B2B technology for leading media companies. He has held various managerial roles in the Singapore public sector, including the National Library Board of Singapore and the Infocomm Media Development Authority.

    Aaron holds a bachelor’s degree in Communications from Nanyang Technological University Singapore and a master’s degree in Information Science from Syracuse University in Syracuse, New York.

    Areas of Expertise

    • Application Modernization & DevOps
    • Business Applications
    • Cloud Computing
    • Cybersecurity
    • DevOps
    • IaaS/Cloud
    • IoT
    • Networking & 5G
    • Storage


    “The growing use of public cloud services in Asia-Pacific has not only drawn cloud suppliers to the region but also integration platform players who provide the glue that ties cloud services together in a multi-cloud and hybrid IT environment.”

    Aaron Tan
    Regional Director Analyst Services, APAC
  • Alex Arcilla

  • Pure Accelerate: Focus on Cyber-resilience

    Photo: Charlie Giancarlo (by CB)

    As I wrote in a previous blog, in-person events are coming back! Pure is holding its user conference in Las Vegas this week. My colleague Scott Sinclair, who is also attending, covers some of the announcements in a recent blog. For my part, I will focus on the cyber-resilience announcements at the event, in particular, the ransomware recovery SLA.

    Pure’s CEO, Charlie Giancarlo, kicked off his keynote with some interesting metrics supporting Pure’s claims around power and space efficiency, reliability, labor requirements, and TCO, among others.

    According to Charlie, Pure essentially differentiates itself in the market in four areas: direct-to-flash management (which is key at scale), a cloud operating model (run like the cloud, run in the cloud, build for the cloud, and power the cloud), an evergreen program to minimize obsolescence, and a coherent, consistent portfolio of platforms that rely on common technologies and software. I think a fifth should be added to the list: cyber-resilience!

    Ransomware Recovery SLA Program

    What it is:

    On the cyber-resilience front, Pure announced the Evergreen//One Ransomware Recovery SLA program, which is sold as an add-on subscription. Existing and new customers can now purchase an add-on service guarantee for a clean storage environment with bundled technical and professional services to recover from an attack.

    Many things can happen when ransomware hits: systems are essentially taken out of production, can be seized by law enforcement, and/or can be used to run forensics, for example. It could be weeks before you regain access to your own systems for production. At the end of the day, it’s about recovering as quickly and cleanly as possible in order to resume business operations. Of course, this assumes your data was properly protected in the first place.

    A customer can initiate a recovery via Pure Technical Services at any time. When a customer calls with their request following the incident, Pure immediately starts working with the customer on a recovery strategy and plan, which includes Pure shipping a clean array within 24 hours (for North America) with a professional services engineer onsite to help. The idea is to have you fully recovered and ready to resume production within 48 hours on this “loaner” array. Transfer those immutable snapshots back onto the loaner and you are back in business. You have 180 days to return the array.

    To maximize your chances of recovery and to qualify for the SLA, end users must turn SafeMode on for all volumes and set retention to 14 days. This is a must-have best practice, in my opinion, whether or not you subscribe. The management software, Pure1, has a strong set of capabilities for data protection assessment and anomaly detection. It can assess a customer’s whole fleet of arrays and benchmark them against best practices, such as checking whether SafeMode or snapshots are turned on. The protection can be very granular, down to the volume level. In addition, the software can perform anomaly detection by watching for signals such as abnormal deduplication ratios: when data is encrypted, it becomes less unique and therefore less dedupable, so a sharp drop from the “normal” deduplication ratio is a key indicator. Pure hinted that it will add more signals in the future, such as latency and file name changes.
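    The deduplication-ratio signal can be illustrated with a minimal sketch. This is purely hypothetical code, not Pure1’s actual implementation; the function name, baseline window, and drop threshold are all assumptions chosen for illustration.

```python
# Hypothetical sketch of the dedup-ratio heuristic described above.
# The threshold and baseline logic are illustrative assumptions,
# not Pure1's real detection algorithm.

def dedup_anomaly(history, current, drop_threshold=0.5):
    """Flag a volume if its deduplication ratio falls sharply below its
    recent baseline -- a possible sign of ransomware encryption, since
    encrypted data is effectively random and deduplicates poorly."""
    if not history:
        return False  # no baseline yet, nothing to compare against
    baseline = sum(history) / len(history)
    return current < baseline * drop_threshold

# A volume that normally dedupes ~4:1 suddenly reports ~1.1:1.
print(dedup_anomaly([4.2, 3.9, 4.1, 4.0], 1.1))  # True
print(dedup_anomaly([4.2, 3.9, 4.1, 4.0], 3.8))  # False
```

    A production system would of course combine this with other signals (latency, file-name churn) to reduce false positives, which is exactly the direction Pure hinted at.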

    Why This Matters

    To be clear, this is not a “marketing” guarantee (“we’ll pay you X if you can’t recover data”…followed by many exclusions and requirements). This is a practical, customer-focused, and outcome-driven service. If an array has questionable data, it will not go back into production. Even if you have protected your environment, you will still need to recover the latest good copy of the data (which can take a long time if you don’t use high-performance snapshots) onto a “clean” system. All the while, everyone is in full crisis mode, adding tremendous stress to teams and processes. This is not only differentiated; it is smart and focused on what matters: resuming business as soon as possible.

    Christophe Bertrand (left) and Andy Stone (right) – photo by Scott Sinclair

    Panel: Building a Data-resilient Infrastructure

    I also had the pleasure of participating in a breakout session on building a data-resilient infrastructure with Andy Stone, Pure’s Field CTO, and a cyber-resilience expert. I shared some of the findings of our state of ransomware preparedness research and discussed “hot” topics such as budgeting and funding for ransomware preparedness, the reality of recovery service levels, best practices, cyber insurance, etc.

    The level of interest in the topic was clearly very high, and many attendees shared their concerns and challenges. Andy reminded the group that no one can do it alone: it’s teamwork, and no vendor can solve the whole problem on its own. More importantly, we discussed how it’s not just the data that needs protection; it’s also the infrastructure, the “Tier 0” and first line of defense. The ransomware SLA program was also mentioned and triggered many questions and a lot of interest.

    I have the strongest suspicion Andy’s schedule will be booked solid for the next few weeks with client visits and calls.

    A Big Surprise

    Look who came to say Hi on stage at the end of the keynote!

    Shaquille O’Neal and Charlie Giancarlo (photo by me)

  • The Strategic and Evolving Role of Data Governance

    Research Objectives

    • Determine the amount and value of data for a typical organization, and how this impacts data management activities like availability, usability, and security.
    • Connect the dots between the important elements of data governance like classification, placement, and compliance as ecosystems evolve and become more distributed.
    • Help overwhelmed IT organizations find the right combination of process and technology to solve their unique data governance challenges.
    • Identify data governance process and technology gaps that need to be addressed in vendor solutions.
