---
layout: post
title: 'Using the Model Context Protocol with Quarkus+LangChain4j'
date: 2025-01-08
tags: langchain4j llm ai
synopsis: 'Executing tools via the Model Context Protocol with Quarkus+LangChain4j'
author: jmartisk
---
:imagesdir: /assets/images/posts/mcp

We are thrilled to announce that, starting with version 0.23.0, the Quarkus
LangChain4j project supports calling tools using the
https://modelcontextprotocol.io[Model Context Protocol (MCP)].

== What is the Model Context Protocol?

MCP is an open protocol that standardizes how applications provide context
to LLMs. An MCP server is an application that can provide tools, resources
(be it a set of static documents or dynamically accessed data, for example
from a database), or pre-defined prompts that your AI-infused application
can use when talking to LLMs. When you package such functionality into an
MCP server, it can be plugged into and used by any LLM client toolkit that
supports MCP, including Quarkus and LangChain4j. There is already a
growing ecosystem of reusable MCP servers that you can use out of the box.

In version 0.23.x, Quarkus LangChain4j supports using MCP to
execute tools. Support for resources and prompts is planned for future
releases.

In this article, we will show you how to use Quarkus and LangChain4j to
easily create an application that connects to an MCP server providing
filesystem-related tools and exposes a chatbot that users can use to
interact with the local filesystem, that is, to read and write files as
instructed by the user.

There is no need to set up an MCP server separately; we will configure
Quarkus to run one as a subprocess. As you will see, setting up MCP with
Quarkus is extremely easy.

NOTE: To download the final project, visit the
https://github.com/quarkiverse/quarkus-langchain4j/tree/0.23.0/samples/mcp-tools[quarkus-langchain4j samples].
That sample contains the final functionality developed in this article,
plus a few extras, such as a JavaScript-based UI. In this article, for
simplicity, we will skip the creation of that UI and only use the Dev UI
chat page that comes bundled with Quarkus out of the box.

== Prerequisites

* Apache Maven 3.9+
* The `npm` package manager installed on your machine
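
To quickly confirm that both are available on your machine, you can check their versions:

[source, shell]
----
mvn --version
npm --version
----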

== Creating a Filesystem assistant project

We will assume that you are using OpenAI as the LLM provider. If you are
using a different provider, you will need to swap out the
`quarkus-langchain4j-openai` extension and use something else.

Start by generating a Quarkus project. If you are using the Quarkus CLI, you can do it like this:

[source, shell]
----
quarkus create app org.acme:filesystem-assistant:1.0-SNAPSHOT \
    --extensions="langchain4j-openai,langchain4j-mcp,vertx-http" -S 3.17
----

If you prefer to use the web-based project generator, go to
https://code.quarkus.io/[code.quarkus.io] and select the same extensions.
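
If you would rather add the extensions to an existing project, the Quarkus CLI should also let you do that from the project root (a minimal sketch; the short extension names are resolved through the extension registry, just like in the `create app` command above):

[source, shell]
----
quarkus extension add langchain4j-openai langchain4j-mcp vertx-http
----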

Whenever you run the application, make sure the
`QUARKUS_LANGCHAIN4J_OPENAI_API_KEY` environment variable is set to your
OpenAI API key.
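
For example, on Linux or macOS you could export the key in the shell from which you start dev mode (the key value below is just a placeholder):

[source, shell]
----
export QUARKUS_LANGCHAIN4J_OPENAI_API_KEY=sk-your-key-here
quarkus dev
----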

=== Create the directory to be used by the agent

Under the root directory of the Maven project, create a directory named `playground`.
This will be the only directory that the agent will be allowed to interact with.

Inside that directory, create any files that you want for testing. For
example, create a file named `playground/hello.txt` with the following
contents:

----
Hello, world!
----
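
If you prefer the command line, creating the directory and the test file from the project root could look like this:

[source, shell]
----
mkdir playground
echo "Hello, world!" > playground/hello.txt
----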

=== Create the AI service

Next, we need to define an AI service that determines how the bot should
behave. The interface will look like this:

[source, java]
----
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;
import jakarta.enterprise.context.SessionScoped;

@RegisterAiService
@SessionScoped
public interface Bot {

    @SystemMessage("""
            You have tools to interact with the local filesystem and the users
            will ask you to perform operations like reading and writing files.

            The only directory you are allowed to interact with is the 'playground' directory relative
            to the current working directory. If a user specifies a relative path to a file and
            it does not start with 'playground', prepend the 'playground'
            directory to the path.

            If the user asks, tell them you have access to a tool server
            via the Model Context Protocol (MCP) and that they can find more
            information about it on https://modelcontextprotocol.io/.
            """)
    String chat(@UserMessage String question);
}
----

Feel free to adjust the system message to your liking, but this one should
be suitable to get the application working as expected.

=== Configure the MCP server and the connection to it

We will use the
https://www.npmjs.com/package/@modelcontextprotocol/server-filesystem[server-filesystem]
MCP server, which comes as an NPM package; this is why you need to have `npm`
installed on your machine. It is assumed that you have the `npm` binary
available on your `PATH` (the `PATH` variable that the Quarkus process
sees).

Starting the server and configuring the connection to it is extremely easy.
We will simply tell Quarkus to start up a `server-filesystem` MCP server as
a subprocess and then communicate with it over the `stdio` transport. All
you need to do is add two lines to your `application.properties`:

[source, properties]
----
quarkus.langchain4j.mcp.filesystem.transport-type=stdio
quarkus.langchain4j.mcp.filesystem.command=npm,exec,@modelcontextprotocol/server-filesystem,playground
----

With this configuration, Quarkus will know that it should create an MCP
client backed by a server that it starts by executing
`npm exec @modelcontextprotocol/server-filesystem playground` as a
subprocess. The `playground` argument denotes the path to the directory that
the agent will be allowed to interact with. The `stdio` transport means that
the client will communicate with the server over standard input and output.

When you configure one or more MCP connections this way, Quarkus also
automatically generates a `ToolProvider`. Any AI service that does not
explicitly specify a tool provider will be automatically wired up to this
generated one, so you don't need to do anything else to make the MCP
functionality available to the AI service.
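
The `filesystem` segment in the property names is simply the name we chose for this MCP client, so several clients can be configured side by side. As a purely hypothetical sketch (the `weather` server and its package name are made up for illustration):

[source, properties]
----
# a second, hypothetical MCP client named 'weather', also started over stdio
quarkus.langchain4j.mcp.weather.transport-type=stdio
quarkus.langchain4j.mcp.weather.command=npm,exec,some-weather-mcp-server
----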

Optionally, if you want to see the actual traffic between the application
and the MCP server, add these three additional lines to your
`application.properties`:

[source, properties]
----
quarkus.langchain4j.mcp.filesystem.log-requests=true
quarkus.langchain4j.mcp.filesystem.log-responses=true
quarkus.log.category.\"dev.langchain4j\".level=DEBUG
----

And that's all! Now, let's test it.

=== Try it out

Since we didn't create a custom UI for our application, let's use the Dev UI
that comes with Quarkus out of the box. With the application running in
development mode, access
http://localhost:8080/q/dev-ui in your browser and click the `Chat` link in
the `LangChain4j` card (or go to
http://localhost:8080/q/dev-ui/io.quarkiverse.langchain4j.quarkus-langchain4j-core/chat
directly).

Try a prompt asking the agent to read the file that you created previously, such as:

----
Read the contents of the file hello.txt.
----

If all is set up correctly, the agent will respond with the contents of the
file, like in this screenshot:

image::devui.png[Dev UI chat page after asking about a file,400,float="right",align="center"]

The bot can also write files, so try a prompt such as:

----
Write a Python script that prints "Hello, world!" and save it as hello.py.
----

Then have a look into your `playground` directory, and you should see the new Python file there!
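
To double-check from the command line, something like this should show the generated script (the exact content will vary depending on what the model produces):

[source, shell]
----
cat playground/hello.py
# you should see something along the lines of: print("Hello, world!")
----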

== Conclusion

The Model Context Protocol allows you to easily integrate reusable sets of
tools and resources into AI-infused applications in a portable way. With the
Quarkus LangChain4j extension, you can instruct Quarkus to run an MCP server
locally as a subprocess, and configuring the application to use it is just a
matter of adding a few configuration properties.

And that's not all. Stay tuned, because Quarkus also has an extension that
allows you to create MCP servers! More about that soon.
