It’s November 2024, and I think it’s safe to say AI is officially everywhere. People are talking about it in cafés worldwide. So are your kids. Or your parents. Or grandparents. It may have been integral to your last car-buying experience, and it powers the recommendation engines in the connected devices at home that listen to you and suggest products they think you need. It is pervasive and getting more so every day. That’s why I wanted to write this blog: first, as a reminder of where AI came from; second, to explain how it is being used today; and finally, to share insights on where the market may be heading based on the research available to me.
I’ll start by exploring the cutting-edge data management and infrastructure innovations that underpin Artificial Intelligence (AI). Over the past two decades, transformative technologies have reshaped how businesses store, process, analyze, and create value from data. Then I’ll dive into the forefront of these advancements, including the rise of generative AI (GenAI) and the critical role of graphics processing units (GPUs) in powering these technologies.
Whether you’re a business or IT leader, or just curious about the future, this will provide the insights you need to navigate the evolving AI landscape.
The Breakthrough of AI
Long a buzzword, AI is now realizing its commercial potential through GenAI – a field focused on creating content that mimics human creativity at an unprecedented scale. This revolution is driving new levels of innovation, transforming how businesses operate and unlocking creative solutions to complex challenges.
These advancements are only possible thanks to the evolution of hardware, particularly GPUs. Once primarily used for graphics, GPUs are now essential for AI, enabling efficient model training and inference with their powerful computational capabilities.
AI and generative AI are the latest disruptive technologies, following in the footsteps of others like network-attached storage (NAS), virtualization and solid-state drives (SSDs). Each of these innovations redefined how we manage data and IT operations. AI is poised to do the same, leading us into a new era of productivity and creativity.
Where It All Began…
Alan Turing, a pioneer in theoretical computer science and artificial intelligence, laid the foundation for modern computing in the 1930s with his concept of a "universal machine," now known as the Turing Machine. His groundbreaking 1950 paper introduced the Turing Test, which continues to be a fundamental tool in evaluating machine intelligence and shaping the ethical considerations of AI today.
AI as a scientific field took shape in 1956 at the Dartmouth Conference, where John McCarthy and pioneers like Marvin Minsky and Herbert A. Simon officially introduced the term Artificial Intelligence. Early AI research focused on symbolic methods for problem solving, but the late 20th century saw a shift to machine learning, driven by advancements in computing power and data availability.
The recent explosion of deep learning – a subset of machine learning using complex neural networks – has revolutionized AI, making strides in image and speech recognition, natural language processing (NLP), and autonomous systems. Thanks to these and other developments, AI has quickly moved beyond academia to become a transformative force across industries, driving innovation and efficiency.
The Rise of the GPU in AI
The evolution of AI has been closely linked with the rise of GPUs. In the early 2000s, Stanford researchers Ian Buck and Pat Hanrahan pioneered using GPUs for general-purpose computing with BrookGPU. This work laid the foundation for NVIDIA’s CUDA platform in 2006, transforming GPUs from graphics-focused tools into powerful devices capable of accelerating a wide range of scientific and technical tasks.
A pivotal moment came in 2012 when Alex Krizhevsky, Ilya Sutskever and Geoffrey Hinton used GPUs to train AlexNet, a deep neural network that dominated the ImageNet competition. This victory highlighted the immense potential of GPUs in deep learning, leading to widespread adoption in AI research and applications.
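To make the GPU’s role concrete, here is a minimal training-step sketch in PyTorch. It is illustrative only: the tiny model and random batch are stand-ins (not AlexNet or ImageNet), and it assumes PyTorch is installed, falling back to the CPU if no CUDA-capable GPU is present.

```python
# Minimal sketch: one GPU-accelerated training step with PyTorch.
# The small model and random batch below are placeholders, not AlexNet itself.
import torch
import torch.nn as nn

# Use the GPU if one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A tiny convolutional classifier, loosely in the spirit of early deep networks.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 10),
).to(device)  # moving the model onto the GPU is what unlocks the speedup

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Random stand-ins for a batch of 32 RGB images and their class labels.
images = torch.randn(32, 3, 224, 224, device=device)
labels = torch.randint(0, 10, (32,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(images), labels)  # forward pass runs on the GPU
loss.backward()                        # so does the gradient computation
optimizer.step()
print(f"loss: {loss.item():.4f}")
```

The same code runs on a CPU, just far more slowly at scale – which is exactly the gap the 2012 AlexNet result exposed.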
AI, machine learning (ML) and GenAI are three disruptive technologies rapidly helping to reshape our world. While the latter two are branches of AI, they each play unique roles in driving innovation.
- Artificial Intelligence (AI): AI is a broad field dedicated to creating systems that perform tasks requiring human intelligence, such as learning, decision-making and understanding natural language. It’s disruptive because it transforms industries by automating cognitive and manual tasks, enhancing efficiency, and introducing new problem-solving methods. In healthcare, for instance, AI can predict patient diagnoses faster and more accurately than traditional methods.
- Machine Learning (ML): A subset of AI, ML focuses on systems that learn from data and improve over time without explicit programming. Machine learning is particularly disruptive because of its breadth of applications, from predictive analytics in business to personalized recommendations on streaming services. Its ability to uncover insights from vast data sets enhances efficiency, accuracy and productivity across various sectors.
- Generative AI: GenAI systems create new content – text, images, music, even code – resembling human-generated output. Technologies like GPT for text and DALL·E for images are disruptive because they unlock new possibilities in creativity and automation. They reduce the time and cost of content creation, enable personalized content at scale, and drive innovation in entertainment, design and education. (A brief text-generation sketch follows this list.)
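To ground the GenAI idea, here is a minimal text-generation sketch in Python. It is a stand-in example, not a specific product recommendation: it assumes the open-source Hugging Face transformers library and the small, freely available GPT-2 model, which plays the role of the far larger commercial models mentioned above.

```python
# Minimal sketch of generative AI for text, using the Hugging Face
# transformers library. GPT-2 is a small open model standing in for
# larger commercial systems; assumes `pip install transformers torch`.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Generative AI is reshaping how businesses"
result = generator(prompt, max_new_tokens=40, num_return_sequences=1)
print(result[0]["generated_text"])
```

The same pattern – a prompt in, generated content out – is what sits behind the business applications described below.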
How GenAI is Impacting Business Today
The disruptive nature of these technologies comes from their potential to fundamentally change how businesses operate, impact labor markets, create new product categories and alter competitive landscapes. Their applications can lead to efficiency gains, cost reductions, new business models and even entirely new industries, challenging existing companies to adapt or risk obsolescence.
According to a recent study by Hitachi Vantara with the Enterprise Strategy Group (ESG) exploring AI buyer insights, 97% of organizations with a GenAI project in flight say it is a top-five priority, and 63% of organizations have identified at least one GenAI use case. These numbers point to the transformative opportunities this technology offers businesses looking to create a competitive edge, expand their service offerings, or make better, more impactful decisions with their data.
Insights from this research highlight how generative AI drives innovation and efficiency across multiple industries, offering scalable solutions that enhance personalization, decision-making and operational effectiveness.
Here’s a glimpse at five key areas where companies are finding success with GenAI as part of their new business model.
- Automated Content and Report Generation: Generative AI transforms how organizations handle content creation across various industries. Whether it’s generating financial reports, medical summaries or customer responses, AI can automate the production of accurate, compliant and personalized content at scale, significantly reducing manual effort and improving consistency.
- Personalization and Customer Engagement: Delivering personalized experiences is crucial in any market. GenAI enables organizations to tailor services, advice and communications to individual preferences and needs, enhancing customer satisfaction and loyalty.
- Synthetic Data Generation and Privacy Preservation: Generative AI is increasingly used to create synthetic data sets for training models, especially in sensitive industries like finance and healthcare. This allows organizations to maintain privacy while improving the accuracy and robustness of AI-driven solutions, such as fraud detection, diagnostics and customer/patient care services.
- Risk Management and Predictive Analytics: GenAI is a powerful modeling and scenario analysis tool that helps organizations anticipate and manage risks. In finance, it can simulate market conditions for multiple trading strategies. In healthcare, it aids in predicting patient outcomes, and in customer/patient care and advocacy, it helps preemptively address potential issues.
- Virtual Assistants and Automated Support: AI-powered virtual assistants and chatbots are revolutionizing customer interactions across sectors. These tools provide real-time support, handle routine inquiries and guide users through complex processes, freeing human agents to focus on higher-level tasks and improving overall service efficiency.
Implementation Challenges and Considerations
Despite the potential benefits, Fortune 2000 companies face real challenges in AI adoption, including data privacy concerns, ethical considerations, the need for skilled personnel, and the integration of AI into legacy systems. Successful deployment often requires a strategic approach, significant investment in talent and technology, and a culture that supports innovation and continuous learning. In fact, the same ESG research mentioned earlier found that security is the top concern (38%), followed by cost/technical debt (27%), data availability and quality (27%), and integration challenges (25%).
Get a Jumpstart with AI Discovery Service for Hitachi iQ
AI Discovery Service for Hitachi iQ is a consulting solution from Hitachi Vantara Professional Services, designed to help organizations smoothly integrate AI technologies into their operations and, more importantly, leverage AI to create value. As part of the Hitachi iQ portfolio, this service aids businesses in identifying key AI use cases, evaluating their current data infrastructure, and estimating potential ROI from AI initiatives.
Tailored to fit each organization's unique needs, the service offers a strategic roadmap for navigating the complexities of AI adoption. This includes assessing necessary technologies, running proof-of-concept (POC) trials, and planning for full-scale deployment. With flexible options ranging from short-term engagements to more in-depth advisory and implementation support, our AI Discovery Service is adaptable to different business requirements. Paired with the Hitachi iQ solution portfolio – which features AI-ready infrastructure that has achieved NVIDIA DGX BasePOD™ storage certification – this service provides the robust, scalable foundation needed to support advanced AI workloads and drive digital transformation.
Stay tuned over the coming weeks and months for more on Hitachi iQ, as well as a continuation of this primer series with helpful background and insight on the AI space. Next up: perspective on retrieval-augmented generation (RAG) – helpful and important information for anyone looking to leverage AI to build a competitive edge and achieve operational excellence. See you back here soon.
Read more
- News Release: Hitachi Vantara Announces General Availability of Hitachi iQ and New AI Discovery Service to Help Businesses Become AI-Ready
- Blog: Seeking Enterprise AI Business Outcomes? Introducing Hitachi iQ with NVIDIA DGX BasePOD
- Blog: Discovering the Potential of Your AI Initiatives and Applications to Achieve GenAI Success
David A. Chapa
David A. Chapa is a recognized thought leader in data storage management and AI, with over three decades of experience shaping the industry. A prolific author and speaker, he has contributed extensively to advancing data management, protection and security strategies. David was at the forefront of AI's expansion in HPC, while researching the next-generation filesystem to support exascale computing and high-demand AI innovation in data storage. Today, at Hitachi Vantara, David continues to drive transformative insights and solutions in data storage and AI.