Insight

  • As AI continues its meteoric rise into business and IT environments, organizations are rapidly assembling or accelerating strategies to support AI technologies across every applicable area. While many organizations are consistent in their efforts to build AI strategies, the components and direction of those strategies often vary. TechTarget’s Enterprise Strategy Group recently surveyed data and IT professionals responsible for the infrastructure supporting AI initiatives at their organization to gain insights into these trends.

    To learn more about these trends, download the free infographic, Navigating the Evolving AI Infrastructure Landscape.

  • As AI continues its meteoric rise into business and IT environments, organizations are rapidly assembling or accelerating strategies to support AI technologies across every applicable area. Unlike niche technologies that impact only certain processes or personnel, AI has wide-ranging potential to transform entire businesses, IT environments, and associated teams. In turn, AI strategies must be multi-pronged efforts that properly align business objectives with AI initiatives and expectations, which requires thorough participation from stakeholders across the organization. The underlying infrastructure and other supportive elements must be fully capable of supporting that tandem strategy.

    While many organizations are consistent in their efforts to build AI strategies, the components and direction of those strategies often vary. To assess the evolving AI landscape and the infrastructure that supports it, TechTarget’s Enterprise Strategy Group surveyed 375 data and IT professionals in North America (US and Canada) responsible for strategizing, evaluating, purchasing, and/or managing infrastructure specifically supporting AI initiatives for their organization. This study sought to answer the following questions:

    • What are the primary business objectives for implementing AI? How long does it take for organizations to start seeing value from AI initiatives?
    • What are the top challenges organizations encounter when implementing AI?
    • What individuals or teams influence decision making related to infrastructure used to support AI initiatives? Which of these has the most influence on final decisions?
    • How are organizations planning to address skills gaps related to the selection, implementation, and management of infrastructure supporting AI initiatives?
    • In which physical locations do organizations primarily deploy their AI infrastructure? What are the top factors that influence the choice of these locations? Are AI environments mostly centralized, mostly decentralized, or an even mix of both?
    • What capabilities of AI infrastructure are most important?
    • Are organizations using internal resources, third-party resources, or both to manage their AI infrastructure?
    • How important is sustainability and environmental responsibility when selecting AI infrastructure? How important is a vendor’s stance on these factors when making purchase decisions for AI infrastructure?
    • What types of data do organizations use to build and train AI models and algorithms? What steps do organizations take to ensure accuracy in the data used for building and training these models?
    • How do organizations handle the movement of the large amounts of data required to support AI initiatives? What challenges are involved with this process?
    • How are organizations using synthetic and third-party data to support AI model training?
    • How are organizations using generative AI (GenAI)? What challenges are they encountering?
    • To what extent are developers leveraging AI infrastructure resources? How do developers access these resources?
    • How do organizations measure the success and effectiveness of AI initiatives?
    • What is AI’s impact on employee productivity, processes, workflows, competitiveness, and other factors?

    Survey participants represented a wide range of industries, including financial, manufacturing, retail/wholesale, and healthcare, among others. For more details, please see the Research Methodology and Respondent Demographics sections of this report.

    Already an Enterprise Strategy Group client? Log in to read the full report.
    If you are not yet a Subscription Client but would like to learn more about accessing this report, please contact us.
  • Cloud Data Protection Strategies at a Crossroads

    The broad adoption of public cloud services and containers as sources and repositories of business-critical data puts the onus on data owners to deliver on data protection SLAs for cloud-resident and container-based applications and data. As vendors and the cloud ecosystem evolve and add as-a-service consumption options, end-users are making incorrect comparisons and assumptions, leading to lasting challenges and a market at a crossroads.

    Learn more about these trends with this free infographic.

  • The need for observability in IT operations management is driven by organizations’ desire to reduce downtime, increase operational security, and improve customer, digital, and employee experiences. At the same time, the shift to distributed, multi-cloud, and cloud-native development and architectures has made infrastructure far more complex. Against this backdrop, IT and DevOps teams are embracing observability and, to a lesser extent, AIOps to help them instrument and monitor their infrastructure and applications.

    Learn more about these trends with this free infographic.

  • Women in Cybersecurity: Fleur Chapman

    For this episode, I had the opportunity to interview Fleur Chapman, chief operating officer for ITC Secure. Although she joined the cybersecurity field relatively recently, her analytical and project management skills and her background in economics, finance, and even speech and drama have all contributed to her current role running operations at a global cybersecurity services company. Be sure to watch the full video to hear the entire interview. Below are some highlights and resources that she shared.

    Jump to video >>

    Using Her Analytics and Project Management Background

    Chapman came into cybersecurity when she was recruited for her current role on the strength of her experience as a technical project manager and consultant. With an education in finance and economics, and a teacher’s diploma in speech and drama, she started her career in the public sector, working for the Ministry of Education in New Zealand.

    She described one of her roles for the Ministry of Education, where she led the implementation of the first births data matching programme in New Zealand. Analytical by nature, she enjoyed interfacing with software developers, third-party vendors, and internal and external stakeholders, and she realized her passion for quality and delivery.

    She earned IT service management certifications and ran projects overseeing operational teams and transactional data systems, delivering strategy and policy initiatives, transformational change, data privacy, identity and access management (IAM) solutions, and regulatory compliance programs.

    At one point, she managed the technical assurance program for Transport. She said it was a huge challenge, but she still wasn’t satisfied: she wanted to go further than project or program management and take on a more strategic role in the private sector.

    She was a contractor delivering managed service projects when she started working with ITC Secure. She worked on several projects, including implementing governance, risk, and compliance (GRC) for a large customer and working on incident response services for a remote security operations center (SOC). That experience helped her move into an internal role as programme director and then head of compliance. She also set up the company’s risk and compliance function, GDPR program, and governance model before becoming its Chief Operating Officer.

    Security Challenges and Goals

    Her role is broad – she helps global customers with risk and compliance, and she works on internal information security, legal, and operations. Operations includes 24×7 managed services, the SOC, the network operations center (NOC), governance, and the platform team.

    Her biggest challenges: the skills shortage, the evolving threat landscape, ever-changing technologies, and third-party risk.

    “If you miss something, you’re remembered for what you missed, not the constant good stuff that you’ve caught. You have to make sure you have robust processes in place, as well as ever-changing technologies, keeping up to date with technological advancements, offering more to customers for less.”

    For third-party risk, she said they are constantly reassessing their third parties, direct and indirect suppliers. “We are judged by the suppliers we use and the threats we expose.”

    People, Process and Technology

    “We have to think about confidentiality, integrity and availability of the information – it’s vital. The key thing for our business is to prevent sensitive info from falling into the wrong hands.”

    Her background in the public sector, grounded in process and procedure, helps with fundamentals such as change management, incident management, access control, and authorization.

    “Procedures underpin how to provide your best services. We can guarantee services with confidentiality and integrity,” she said. “Humans make mistakes, so minimizing risk comes back to standardizing, documenting, managing and monitoring. It’s a constant challenge. There is no blame culture. People will make mistakes, but how do you identify them and learn? You can identify opportunities to improve services and operational security.”

    Advice and Resources

    Chapman has experience stepping into roles to rescue projects that were failing due to constraints around timeframes, costs, or mistakes people had made. Her advice:

    “Be confident in yourself and your ability. There is no challenge too great; never be afraid to ask for guidance from your leaders or your peers,” she said. “I’ve had situations where I had to tell myself to step back, take a breath, pause, and think before your next course of action – never respond straightaway to the email that angered you. Don’t sweat the small stuff; know what to let go of and what to push.”

    While this is useful advice earned from experience that you can’t learn from a book, Chapman mentioned that she is an avid reader for advice and inspiration. Two recent books she recommends:

    To listen to the full interview, click here.

    Be sure to also visit our Women in Cybersecurity page, where you can view past episodes and connect with us to hear more inspiring stories in future shows!

  • 6 Reasons Cisco Acquired Splunk

  • Time for an Identity Security Revolution

    Over the course of the long, slow evolution of cyberdefenses, we’ve lost focus on what the attacker really wants: access to your data via an identity.

    Read my blog to learn more about why I think identity needs to be the foundational component of the cybersecurity stack.

  • Overcoming Threats Within Encrypted Traffic

    Organizations of all sizes recognize the risk that cybersecurity threats pose to their businesses. That is why recent Technology Spending Intentions research from TechTarget’s Enterprise Strategy Group (ESG) [1] shows organizations continuing to invest in improving cybersecurity: strengthening cybersecurity was both respondents’ most-cited business initiative driving IT spending in 2023 and the most-cited area in which they expected to increase investment in 2023. Why? Because the threat landscape is always changing and evolving. Like a never-ending game of whack-a-mole, cybersecurity professionals must constantly adapt to new threats and tactics.

    Over the last few years, the amount of encrypted traffic on the network has expanded significantly. While encryption is important for privacy and security, it can represent a blind spot for security tools. ESG research [2] highlights that 83% of organizations have either a notable or significant concern about scanning encrypted traffic for threats. In fact, 68% of respondents reported being impacted by multiple attacks hidden within encrypted traffic.

    Why is this happening? Well, because most organizations lack visibility into all their encrypted traffic, with only 34% stating that they have complete visibility. There are a number of reasons organizations forgo decryption, some of the most common of which include:

    • Selective decryption to avoid looking at sensitive employee information (56%).
    • Performance issues with decryption (47%).
    • Too expensive to decrypt all traffic (32%).
    • Don’t have a lot of encrypted traffic (32%).

    It will only get harder to gain visibility into encrypted traffic as newer protocols such as TLS 1.3 (over TCP) and QUIC (over UDP) roll out. As these standards are adopted, it will become increasingly difficult to decrypt and analyze traffic.

    So where does that leave organizations trying to stay ahead of the cyber threat curve? Well, one company we just spoke to believes it has a solution. Gigamon, a pioneer in network visibility and intelligence that now focuses on providing deep observability pipelines, announced its Precryption technology. It gives organizations deep visibility into encrypted traffic across virtual machine and container workloads so that traffic can be delivered to threat detection, investigation, and response tools in hybrid cloud environments. Gigamon believes this technology will eliminate blind spots and ensure organizations can inspect all encrypted traffic.

    It accomplishes this by using eBPF technology within the Linux kernel to capture traffic prior to encryption or just after decryption – eliminating the need for costly decryption technology or key sniffing. Gigamon claims that it runs independently of the application, unlike approaches that require agents. By collecting traffic prior to encryption, this approach removes the need to manage keys on a separate decryption tool, as well as the performance impact of decrypting network traffic.
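
    Gigamon hasn’t shared implementation details here, so the snippet below is only a generic sketch of the pre-encryption capture idea rather than Precryption itself. It assumes a Linux host with the bcc toolkit installed and hooks OpenSSL’s SSL_write() with a user-space eBPF probe (uprobe) to observe payloads while they are still plaintext; the hook point, struct layout, and buffer size are illustrative assumptions.

    ```python
    #!/usr/bin/env python3
    # Generic illustration (not Gigamon's code): observe plaintext handed to
    # OpenSSL's SSL_write() before it is encrypted, using a BCC uprobe.
    # Requires root and the bcc toolkit; the hook point is an assumption.
    from bcc import BPF

    bpf_text = r"""
    #include <uapi/linux/ptrace.h>

    struct event_t {
        u32 pid;
        u32 len;
        char data[64];          // first bytes of the plaintext buffer
    };
    BPF_PERF_OUTPUT(events);

    int probe_ssl_write(struct pt_regs *ctx, void *ssl, const void *buf, int num) {
        struct event_t ev = {};
        ev.pid = bpf_get_current_pid_tgid() >> 32;
        ev.len = num;
        // On kernels older than 5.5, use bpf_probe_read() instead.
        bpf_probe_read_user(&ev.data, sizeof(ev.data), buf);
        events.perf_submit(ctx, &ev, sizeof(ev));
        return 0;
    }
    """

    b = BPF(text=bpf_text)
    # "ssl" resolves to the system libssl; adjust the name/path for your distro.
    b.attach_uprobe(name="ssl", sym="SSL_write", fn_name="probe_ssl_write")

    def handle(cpu, data, size):
        ev = b["events"].event(data)
        print(f"pid={ev.pid} wrote {ev.len} plaintext bytes: {ev.data[:32]!r}")

    b["events"].open_perf_buffer(handle)
    print("Tracing SSL_write()... Ctrl-C to stop")
    while True:
        try:
            b.perf_buffer_poll()
        except KeyboardInterrupt:
            break
    ```

    The point of the sketch is the vantage point, not the tooling: because the payload is read before libssl encrypts it, nothing downstream has to manage session keys or spend cycles on decryption.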

    Clearly, organizations have to work hard to stay ahead of an ever-changing threat landscape, so it’s great to see vendors like Gigamon bringing innovative solutions to market that deliver full visibility into encrypted traffic.

    [1] Source: Enterprise Strategy Group Research Report, 2023 Technology Spending Intentions Survey, November 2022.

    [2] Source: Enterprise Strategy Group Complete Survey Results, Network Threat Detection Response Trends, April 2023.

  • Passwordless in the Enterprise

    Research Objectives

    Traditional authentication methods aren’t working. With cheap cloud GPUs available to crack passwords and tens of billions of known account/password combinations already in circulation, it’s clear that passwords aren’t secure. MFA hasn’t been a viable replacement: it’s susceptible to social engineering, phishing, and other attacks while introducing friction and degrading the user experience.

    Successful attacks are fueling the need for a new authentication approach. Recent prominent MFA-based breaches and friction in the end-user experience have reached the ears of app developers, IT, and cybersecurity leadership. Organizations are now searching for alternative methods that address the risks and challenges of MFA and password-based authentication.

    IAM vendors need to demystify passwordless authentication. While the concept has received tremendous publicity as a panacea, organizations struggle to understand which passwordless methods are the best fit for different use cases. Passwordless vendors are jockeying to differentiate themselves in this crowded space to demonstrate they’re the best fit for prospective customers.
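
    To make the distinction concrete, here is a minimal conceptual sketch, in Python with the cryptography package, of the public-key, challenge-response pattern that most passwordless methods (for example, passkeys/FIDO2) build on. It is deliberately simplified and is not a WebAuthn implementation; the names and flow are illustrative. The point is that the server stores no shared secret that can be cracked with GPUs or phished from a user.

    ```python
    # Conceptual sketch only: the challenge-response pattern behind most
    # passwordless methods (e.g., passkeys/FIDO2), simplified for illustration.
    import os

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Enrollment: the user's device generates a key pair. Only the PUBLIC key is
    # sent to the server, so there is no password hash to steal, crack, or phish.
    device_private_key = Ed25519PrivateKey.generate()   # never leaves the device
    server_stored_public_key = device_private_key.public_key()

    # Login: the server issues a fresh, random challenge...
    challenge = os.urandom(32)

    # ...the device signs it locally (gated by a biometric or PIN in practice)...
    signature = device_private_key.sign(challenge)

    # ...and the server verifies the signature against the enrolled public key.
    try:
        server_stored_public_key.verify(signature, challenge)
        print("authentication succeeded")
    except InvalidSignature:
        print("authentication failed")
    ```

    Real passwordless deployments layer origin binding, device attestation, and credential lifecycle management on top of this basic exchange, which is where much of the vendor differentiation mentioned above comes in.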

    To gain insights into the authentication landscape generally and the evolution of passwordless technology specifically, TechTarget’s Enterprise Strategy Group surveyed 377 IT, cybersecurity, and application development professionals responsible for identity and access management programs, projects, processes, solutions, and services in North America.

    This study sought to answer the following questions:

    • What priority level do organizations assign to their practices for authenticating workforce and customer identities?
    • Approximately what percentage of organizations’ workforce and customer identities are believed to be insufficiently secured?
    • Do organizations make multifactor authentication mandatory for their workforce?
    • How are organizations prioritizing the use of passwordless authentication methods for their workforce and customers relative to other areas of identity?
    • What types of passwordless solutions do organizations currently use for their customers?
    • How confident are organizations in their ability to detect a session with an attacker using a compromised account versus a session with a real user?
    • Have organizations experienced any account or credential compromises in the last 12 months? Approximately how many times has this happened?
    • What contributed to the compromise of organizations’ accounts or credentials? Have any of the compromised accounts or credentials over the last 12 months led to a successful cybersecurity attack?
    • Relative to other areas of identity and access management, how do organizations expect their spending on authentication to change, if at all, over the next 12 months?
    • With respect to any increase in spending on authentication, in which areas do organizations expect most of this investment to go to in the next 12 months?

    Survey participants represented a wide range of industries, including manufacturing, technology, financial services, and retail/wholesale. For more details, please see the Research Methodology and Respondent Demographics sections of this report.

    Already an Enterprise Strategy Group client? Log in to read the full report.

    If you are not yet a Subscription Client but would like to learn more about accessing this report, please contact us.