A family of AI models that help drive trust and scalability in AI-driven applications
Granite models are lightweight, low-cost or no-cost LLMs and tools that help developers quickly prototype ideas before scaling them on production systems. The open source family of Granite models lets developers collaborate to modernize their code, improve their productivity, and transform experiences.
In this tutorial, learn how to set up a local AI co-pilot in Visual Studio Code using IBM Granite 4, Ollama, and Continue, overcoming common enterprise challenges such as data privacy, licensing, and cost. The setup includes open-source LLMs, Ollama for model serving, and Continue for in-editor AI assistance.
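As a quick sanity check on the model-serving half of that setup, the minimal sketch below (an illustration, not part of the tutorial) assumes Ollama is running on its default port (11434) and that a Granite model has already been pulled under a tag such as granite4:micro; the exact tag may differ in your environment. It simply confirms the locally served model responds before Continue is pointed at it.

```python
# Minimal sketch: verify a locally pulled Granite model answers through
# Ollama's REST API before wiring it into Continue.
# Assumptions: Ollama on its default port (11434), and a model pulled
# with something like `ollama pull granite4:micro` (tag may differ).
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"
MODEL = "granite4:micro"  # assumption: substitute the tag you actually pulled

payload = {
    "model": MODEL,
    "messages": [
        {"role": "user", "content": "Explain a Python list comprehension in one sentence."}
    ],
    "stream": False,  # ask for a single JSON response instead of a stream
}

resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```

If this prints a coherent answer, the serving side is working and Continue can be configured to use the same local endpoint.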
Learn how to build a Language Advisor agent with Langflow, the Granite 4.0 Micro model served through Ollama, and IBM watsonx Orchestrate. Detect the language of a code snippet, fetch best-practice guides via web search, and deliver concise, actionable summaries, all on-premises and governed.
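The Langflow and watsonx Orchestrate wiring is covered in the tutorial itself; purely as an illustration of the language-detection step, a hedged sketch against a locally served Granite model (again assuming an Ollama tag such as granite4:micro) might look like this:

```python
# Illustrative sketch of the "detect the code language" step, assuming the
# Granite 4.0 Micro model is served locally by Ollama on its default port
# under a tag such as "granite4:micro" (an assumption, not the tutorial's code).
import requests

SNIPPET = 'fmt.Println("hello")'

prompt = (
    "Identify the programming language of the following snippet. "
    "Answer with the language name only.\n\n" + SNIPPET
)

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "granite4:micro", "prompt": prompt, "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"].strip())  # e.g. "Go"
```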
Learn about using Granite models and watsonx for enterprise AI with Java. Understand the importance of openness, interoperability, and long-term support in AI adoption. Granite models provide an open foundation for AI, while watsonx delivers enterprise governance and lifecycle management. The combination of Quarkus, LangChain4j, Jakarta EE, and Red Hat OpenShift AI provides a complete story for confident AI adoption. This approach aligns AI adoption with the principles that made Java successful, ensuring investments remain valuable over time.
In this article, we demonstrate using different prompts with different contexts to analyze how LLMs behave when asked questions about generic statements. We show that the Granite 3.3-2b model is good at identifying the quantifiers associated with generic phrases and that its responses are also context-driven.
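As an illustrative sketch only (not the article's actual prompts), that contrast can be reproduced locally by asking the same generic question with and without added context, assuming a Granite 3.3 2B model served by Ollama under a tag such as granite3.3:2b:

```python
# Hedged sketch: compare the model's answer to a generic statement with and
# without added context. Assumes a local Ollama server on its default port
# and a Granite 3.3 2B tag such as "granite3.3:2b" (both assumptions).
import requests

def ask(prompt: str) -> str:
    resp = requests.post(
        "http://localhost:11434/api/chat",
        json={
            "model": "granite3.3:2b",
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]

generic = "Birds fly. Does this statement apply to all birds?"
with_context = "Context: penguins and ostriches are birds that cannot fly.\n" + generic

print("No context:  ", ask(generic))
print("With context:", ask(with_context))
```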
InstructLab empowers developers to unleash the full potential of LLMs, offering a streamlined training process, cost-efficiency, community collaboration, and stability in model performance.
Boost mainframe DevOps with AI tools such as watsonx Code Assistant for Z and IBM Test Accelerator for Z for COBOL refactoring, Java transformation, automated testing, and z/OS modernization.