Create your first AI Java application with Quarkus and LangChain4j - IBM Developer

Tutorial

Create your first AI Java application with Quarkus and LangChain4j

Build a simple RESTful Java app that asks an LLM to write a poem

By Laura Cowen

In this tutorial, you learn how to create a simple RESTful Java AI application that asks a large language model (LLM) to write a short poem on a topic provided by the application user. The service responds to GET requests made to the http://localhost:8080/poems/{topic}/{lines} URL. The user enters the topic and length of the desired poem; for example, http://localhost:8080/poems/purple/5 generates a poem such as this one:

In twilight blooms a regal hue,
The whispers of the evening sky,
Lavender dreams in softest dew,
A canvas where the shadows lie,
In purple’s embrace, we learn to fly.

This Java AI app requires a Java class and a Java interface. The class represents a resource that defines the application's endpoint and calls the AI model through the interface, which implements an AI service. The AI service uses the parameter values passed in through the endpoint to build a prompt (a natural-language text request) and sends it to the LLM.

The AI (the LLM) parses the prompt and returns a poem according to the topic and number of lines requested in the prompt. LLMs are AI models that are trained to generate output based on the natural language requests they receive.

Figure: Architecture of the sample app

The application is built and run on Quarkus. Much of the work needed to build the prompt and to connect to the LLM in order to send the request and get a response is handled for you by the open source LangChain4j extension to Quarkus.

At the end of this tutorial, you will have built this simple Java AI application. The complete code is available in my GitHub repository.

Prerequisites

On Linux and Mac, the easiest way to install the prerequisites is to first install SDKMAN!, and then use SDKMAN! to install both Java 17 and the Quarkus CLI.

Step 1. Creating the AI service

The AI service provides an abstraction layer to make it easier to write a Java application that interacts with an LLM. The AI service composes the prompts to send to the LLM and receives the responses from the LLM.

The application needs only minimal configuration to connect to the LLM because the AI service handles the connection details.

The AI service composes the prompt from two pieces of information in the class that implements the AI service:

  • The system message, which provides context to the request that the application sends to the LLM. For example, you can set the role or persona of the LLM and guide the LLM's behavior when responding.
  • The user message, which represents the user's latest input to send to the LLM. The user message is usually received, and processed, by the LLM after the system message.
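Conceptually, the AI service fills placeholders in these message templates with the values passed to the service's method, then sends the combined text to the LLM. The following standalone sketch illustrates that substitution; it is a simplified illustration of template expansion, not LangChain4j's actual implementation, and the class name is made up for this example:

```java
// PromptTemplateDemo.java
// Simplified illustration of how an AI service might expand a message
// template. LangChain4j does this for you; this class is illustrative
// only and is not part of the tutorial's application.
public class PromptTemplateDemo {

    // Replace the {poemTopic} and {poemLines} placeholders with the
    // values passed in, as the AI service does for @UserMessage templates.
    static String fill(String template, String topic, int lines) {
        return template
                .replace("{poemTopic}", topic)
                .replace("{poemLines}", Integer.toString(lines));
    }

    public static void main(String[] args) {
        String userTemplate =
                "Write a poem about {poemTopic}. The poem should be {poemLines} lines long.";
        // Prints: Write a poem about purple. The poem should be 5 lines long.
        System.out.println(fill(userTemplate, "purple", 5));
    }
}
```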

Create the AiPoemService interface in a file at src/main/java/org/acme/AiPoemService.java by copying the following code:

package org.acme;

import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

@RegisterAiService // <1>
public interface AiPoemService {

    @SystemMessage("You are a professional poet. Display the poem in well-formed HTML with line breaks (no markdown).") // <2>
    @UserMessage("Write a poem about {poemTopic}. The poem should be {poemLines} lines long.") // <3>
    String writeAPoem(String poemTopic, int poemLines); // <4>
}

A quick explanation of the significant parts of this code:

  • The @RegisterAiService annotation (marked <1> in the code) registers the interface as an AI service; Quarkus generates the implementation, which connects to the LLM that is configured in the resources/application.properties file.
  • The @SystemMessage annotation (<2>) instructs the LLM to take the role of a professional poet and to display the generated poem in well-formed HTML with line breaks so that it renders neatly when viewed in a web browser.
  • The @UserMessage annotation (<3>) asks the LLM to generate a poem on the topic and of the length that the user has chosen. The user's choices are passed as parameters from the endpoint, {topic} and {lines}, to complete the templated user message placeholders, {poemTopic} and {poemLines}.
  • The writeAPoem() method (<4>) starts an exchange between the application and the AI service. The AI service composes a prompt including the system message and the user message and sends it to the LLM. The writeAPoem() method is called by the showMeAPoem() method in the Poems class, passing in the user's chosen topic and length from the endpoint parameters.

Step 2. Creating the RESTful resource

The RESTful resource defines the endpoint of your RESTful service. When a GET request is made to the endpoint, the showMeAPoem() method runs and calls the writeAPoem() method in the AI service to send a request to the LLM.

In this application, the resource class defines the endpoint that receives the user's input (choice of topic and number of lines in the poem) and then passes it to the AI service to include in its request to the LLM.

Create the Poems class in a file at src/main/java/org/acme/Poems.java by copying the following code:

package org.acme;

import jakarta.inject.Inject;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.PathParam;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

@Path("/poems")
public class Poems {

    @Inject
    AiPoemService aiPoemService;  // <1>

    @GET
    @Produces(MediaType.TEXT_HTML)
    @Path("/{topic}/{lines}")  // <2>
    public String showMeAPoem(@PathParam("topic") String userTopic, @PathParam("lines") int userLines) {  // <3>
        return aiPoemService.writeAPoem(userTopic, userLines);  // <4>
    }

    @GET
    @Produces(MediaType.TEXT_PLAIN)
    public String hello() {
        return "hello";
    }
}

A quick explanation of the significant parts of this code:

  • The @Inject annotation (<1>) injects an instance of the AiPoemService interface (whose implementation Quarkus generates) as aiPoemService.
  • The @Path annotation (<2>) defines the RESTful endpoint that takes the user's input as endpoint parameters, {topic} and {lines}.
  • The showMeAPoem() method (<3>) takes two arguments, userTopic and userLines. Each parameter is annotated with the @PathParam annotation to explicitly map the endpoint parameters to the method arguments. When a GET request is made to the /poems/{topic}/{lines} endpoint, the values of the {topic} and {lines} parameters are passed as the showMeAPoem() method's userTopic and userLines arguments.
  • The writeAPoem() method (<4>) is called with the values received from the endpoint. Calling the writeAPoem() method causes these values to be added to the user message as part of the prompt that the AI service sends to the LLM. The response from the LLM is then displayed as HTML, assuming the LLM followed the formatting instructions in the system message.

Step 3. Configuring the application

Connecting to an LLM is greatly simplified by using the Quarkus LangChain4j extension. For this application, Quarkus uses the Quarkus LangChain4j OpenAI extension, which is configured in the pom.xml file. You then need only set the API key and the base URL properties for the LLM in the resources/application.properties file. In this case, you can use the value demo to get limited demo access to the LLM, which is sufficient for this application.
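If your project was not generated with the extension already present, the pom.xml dependency looks something like the following sketch (the version number is an assumption; check the Quarkiverse LangChain4j releases for the current one):

```xml
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-openai</artifactId>
    <!-- Hypothetical version; use the latest release -->
    <version>0.15.1</version>
</dependency>
```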

Create the application properties in a file at src/main/resources/application.properties by copying the following code:

quarkus.langchain4j.openai.api-key=demo
quarkus.langchain4j.openai.base-url=http://langchain4j.dev/demo/openai/v1

Step 4. Running the application

If you installed the Quarkus CLI, run the following command to start Quarkus in dev mode:

quarkus dev

Otherwise, run the following Maven command, which starts the application in dev mode:

./mvnw quarkus:dev

While running in dev mode, the application is automatically rebuilt and redeployed whenever you change the code.

To test the application, request the endpoint with the values you choose replacing the template placeholders. For example, request a poem of 5 lines about purple with the URI http://localhost:8080/poems/purple/5. Because the system message asks for well-formed HTML, the poem should display neatly in a web browser.

Alternatively, run the following curl command:

curl -w "\n" http://localhost:8080/poems/purple/5

Notice that there is a slight pause while the LLM responds, but then the application returns a short poem on the chosen topic and of the requested length.

Try alternative prompts without modifying your code

The Quarkus Dev UI (when running in dev mode) provides a chat interface where you can test alternative user messages without modifying your application code.

To use the Dev UI chat interface:

  1. From the running terminal, press d to open the Quarkus Dev UI Extensions page (http://localhost:8080/q/dev-ui/extensions) in a browser. The Extensions page lists all the extensions installed in your running instance of Quarkus.
  2. In the LangChain4j Core tile, click Chat to open the Chat interface.
  3. The System message field contains the system message from your application. You can modify the system message if you want to.
  4. In the Message field, type a user message then press Send.

The application runs and returns a response based on the system and user messages entered in the chat window.

Summary and next steps

Congratulations! You have created your first AI Java application and run it on Quarkus with LangChain4j.

Next, to dive a little deeper into RAG-based AI applications, learn how to build an AI-powered document assistant with Quarkus and LangChain4j.