This is a cache of https://developer.ibm.com/technologies/artificial-intelligence. It is a snapshot of the page as it appeared on 2026-02-02T13:29:41.979+0000.
Next generation resources for AI builders to create trusted solutions
Artificial intelligence is the science of building systems that mimic the problem-solving and decision-making capabilities of the human mind. It spans several disciplines, including machine learning, knowledge discovery, natural language processing, computer vision, and data science. Generative AI refers to deep-learning models that can generate novel, high-quality text, images, and other content based on the data on which they were trained.
Learn how fine‑tuning a small language model (Mistral 7B) and using a Granite 3.3 8B orchestrator solves the consistency challenges of large‑scale business document evaluation.
Learn how Model Context Protocol enables scalable multi-agent AI systems through client, server, and hybrid architecture patterns, LLM placement strategies, reusable AI agents, dynamic orchestration, enterprise design trade-offs, and real-world implementation guidance.
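As a rough illustration of the client–server pattern described above, the sketch below mimics an MCP-style server loop dispatching JSON-RPC 2.0 requests such as `tools/list` and `tools/call` (those method names come from the MCP specification; the `add` tool and the handler structure here are invented for illustration, not taken from the article):

```python
import json

# Toy tool registry standing in for reusable agent tools (illustrative only).
TOOLS = {
    "add": {
        "description": "Add two integers.",
        "handler": lambda args: args["a"] + args["b"],
    },
}

def handle_request(raw: str) -> str:
    """Dispatch a JSON-RPC 2.0 request the way an MCP server would."""
    req = json.loads(raw)
    if req["method"] == "tools/list":
        result = {"tools": [{"name": n, "description": t["description"]}
                            for n, t in TOOLS.items()]}
    elif req["method"] == "tools/call":
        params = req["params"]
        tool = TOOLS[params["name"]]
        result = {"content": [{"type": "text",
                               "text": str(tool["handler"](params["arguments"]))}]}
    else:
        return json.dumps({"jsonrpc": "2.0", "id": req.get("id"),
                           "error": {"code": -32601, "message": "Method not found"}})
    return json.dumps({"jsonrpc": "2.0", "id": req.get("id"), "result": result})

# A real client would send these over stdio or HTTP; here we call the handler directly.
listing = handle_request(json.dumps(
    {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}))
call = handle_request(json.dumps(
    {"jsonrpc": "2.0", "id": 2, "method": "tools/call",
     "params": {"name": "add", "arguments": {"a": 2, "b": 3}}}))
```

In the hybrid patterns the article covers, the same dispatch shape appears on both sides: an orchestrating LLM decides which server's tool to call, and each server exposes its tools through this uniform listing/calling interface.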
Boost machine learning performance on IBM Z with Snap ML preprocessing that uses scikit-learn pipelines, optimized C++ execution, efficient data transformations, graph feature engineering, and consistent training-to-production workflows for accurate, low-latency enterprise inference.
Learn to deploy MCP Composer on IBM Cloud Code Engine using Docker for scalable AI orchestration, FastAPI tool integration, and serverless container hosting.
In this tutorial, learn how to set up a local AI co-pilot in Visual Studio Code using IBM Granite 4, Ollama, and Continue, overcoming common enterprise challenges such as data privacy, licensing, and cost. The setup includes open-source LLMs, Ollama for model serving, and Continue for in-editor AI assistance.
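The model-serving side of that setup can be sketched against Ollama's REST API (the `/api/generate` endpoint and its `model`, `prompt`, and `stream` fields are part of Ollama's documented API; the `granite4` model tag and the helper names are assumptions for illustration, since the exact tag depends on which Granite build you pull):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local port

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for a non-streaming /api/generate call."""
    return {"model": model, "prompt": prompt, "stream": False}

def query_ollama(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server (requires `ollama serve`
    and the model already pulled); returns the generated text."""
    body = json.dumps(build_generate_request(model, prompt)).encode()
    req = urllib.request.Request(OLLAMA_URL, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Only works with Ollama running locally and a Granite model pulled, e.g.:
#   print(query_ollama("granite4", "Explain data privacy in one sentence."))
payload = build_generate_request("granite4", "hello")
```

Because everything stays on localhost, the data-privacy and cost concerns the tutorial raises are addressed by construction: prompts never leave the developer's machine.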
Discover the challenges of running AI in production at enterprise scale. Java alone is not enough, as AI workloads require a platform designed for scale. Red Hat OpenShift AI provides the operational backbone, extending Kubernetes with capabilities for building, deploying, and managing AI workloads. It ensures governance, observability, and portability, aligning with open standards. This allows Java developers to treat AI workloads with the same discipline as other enterprise services, making AI viable at enterprise scale. IBM and Red Hat are investing in open standards and products to secure the future of enterprise Java and AI.
Learn about using Granite models and watsonx for enterprise AI with Java. Understand the importance of openness, interoperability, and long-term support in AI adoption. Granite models provide an open foundation for AI, while watsonx delivers enterprise governance and lifecycle management. The combination of Quarkus, LangChain4j, Jakarta EE, and Red Hat OpenShift AI provides a complete story for confident AI adoption. This approach aligns AI adoption with the principles that made Java successful, ensuring investments remain valuable over time.