
Commit 5070247

committed
day 2 - langchain
1 parent 4dc25c9 commit 5070247

File tree

3 files changed (+678 −1 lines)


Langchain/README.md

Lines changed: 9 additions & 1 deletion
@@ -5,5 +5,13 @@
55
66
These are my notes from exploring the Langchain codebase. If you're looking to contribute, or are interested in understanding how Langchain works under the hood, I think they'll be useful for you too 🙂.
77

8-
### Content
8+
### Basics
99
1. [Prompts](./prompts.ipynb) - Prompts are langchain's abstraction over the strings you pass into models, but they have some tricks up their sleeve that help you when building apps.
10+
2. [LLMs](./llms-completion.ipynb) - Using the LLM abstraction and the fancy
11+
stuff it allows us to do. It also gives access to a wide range of LLM endpoints.
12+
3. [LLMs: chat models](./llms-chat.ipynb) - Chat Models are becoming more common
13+
and are cheaper too. In this notebook we'll see what langchain offers in this
14+
regard.
15+
16+
### Langsmith
17+
(soon...)

Langchain/llms-chat.ipynb

Lines changed: 305 additions & 0 deletions
@@ -0,0 +1,305 @@
1+
{
2+
"cells": [
3+
{
4+
"cell_type": "markdown",
5+
"id": "10c38b7c",
6+
"metadata": {},
7+
"source": [
8+
"# LLMs: Chat Models\n",
9+
"\n",
10+
"Chat models are a variation on language models. While chat models use language models under the hood, the interface they expose is a bit different. Rather than expose a \"text in, text out\" API, they expose an interface where \"chat messages\" are the inputs and outputs."
11+
]
12+
},
13+
{
14+
"cell_type": "code",
15+
"execution_count": 1,
16+
"id": "47445a1b",
17+
"metadata": {},
18+
"outputs": [],
19+
"source": [
20+
"from langchain.chat_models import ChatOpenAI\n",
21+
"\n",
22+
"chat = ChatOpenAI()"
23+
]
24+
},
25+
{
26+
"cell_type": "markdown",
27+
"id": "11f1eefb",
28+
"metadata": {},
29+
"source": [
30+
"The chat model interface is based around messages rather than raw text. LangChain's message types are `AIMessage`, `HumanMessage`, `SystemMessage`, and `ChatMessage` -- `ChatMessage` takes in an arbitrary role parameter. Most of the time, you'll just be dealing with `HumanMessage`, `AIMessage`, and `SystemMessage`."
31+
]
32+
},
33+
{
34+
"cell_type": "code",
35+
"execution_count": 2,
36+
"id": "07842ccb",
37+
"metadata": {},
38+
"outputs": [
39+
{
40+
"data": {
41+
"text/plain": [
42+
"AIMessage(content=\"J'adore programmer.\", additional_kwargs={}, example=False)"
43+
]
44+
},
45+
"execution_count": 2,
46+
"metadata": {},
47+
"output_type": "execute_result"
48+
}
49+
],
50+
"source": [
51+
"from langchain.schema import AIMessage, HumanMessage, SystemMessage\n",
52+
"\n",
53+
"chat([HumanMessage(\n",
54+
" content=\"Translate this sentence from English to French: I love programming.\"\n",
55+
")])"
56+
]
57+
},
58+
{
59+
"cell_type": "markdown",
60+
"id": "b8661027",
61+
"metadata": {},
62+
"source": [
63+
"You can also include a system message to guide the model"
64+
]
65+
},
66+
{
67+
"cell_type": "code",
68+
"execution_count": 3,
69+
"id": "05b472ce",
70+
"metadata": {},
71+
"outputs": [
72+
{
73+
"data": {
74+
"text/plain": [
75+
"AIMessage(content=\"J'adore la programmation.\", additional_kwargs={}, example=False)"
76+
]
77+
},
78+
"execution_count": 3,
79+
"metadata": {},
80+
"output_type": "execute_result"
81+
}
82+
],
83+
"source": [
84+
"messages = [\n",
85+
" SystemMessage(content=\"You are a helpful assistant that translates English to French.\"),\n",
86+
" HumanMessage(content=\"I love programming.\")\n",
87+
"]\n",
88+
"chat(messages)"
89+
]
90+
},
91+
{
92+
"cell_type": "markdown",
93+
"id": "ab09ff3f",
94+
"metadata": {},
95+
"source": [
96+
"As with completion models, you can use `generate()` for batch calls and richer outputs. It returns a `ChatResult`, which contains `ChatGeneration`s."
97+
]
98+
},
99+
{
100+
"cell_type": "code",
101+
"execution_count": 6,
102+
"id": "1410a858",
103+
"metadata": {},
104+
"outputs": [],
105+
"source": [
106+
"batch_messages = [\n",
107+
" [\n",
108+
" SystemMessage(content=\"You are a helpful assistant that translates English to French.\"),\n",
109+
" HumanMessage(content=\"I love programming.\")\n",
110+
" ],\n",
111+
" [\n",
112+
" SystemMessage(content=\"You are a helpful assistant that translates English to French.\"),\n",
113+
" HumanMessage(content=\"I love artificial intelligence.\")\n",
114+
" ],\n",
115+
"]\n",
116+
"result = chat.generate(batch_messages)"
117+
]
118+
},
119+
{
120+
"cell_type": "code",
121+
"execution_count": 8,
122+
"id": "edf91be9",
123+
"metadata": {},
124+
"outputs": [
125+
{
126+
"data": {
127+
"text/plain": [
128+
"\"J'adore la programmation.\""
129+
]
130+
},
131+
"execution_count": 8,
132+
"metadata": {},
133+
"output_type": "execute_result"
134+
}
135+
],
136+
"source": [
137+
"chat_g = result.generations[0][0]\n",
138+
"\n",
139+
"chat_g.text"
140+
]
141+
},
142+
{
143+
"cell_type": "code",
144+
"execution_count": 11,
145+
"id": "8f805cdf",
146+
"metadata": {},
147+
"outputs": [
148+
{
149+
"data": {
150+
"text/plain": [
151+
"AIMessage(content=\"J'adore la programmation.\", additional_kwargs={}, example=False)"
152+
]
153+
},
154+
"execution_count": 11,
155+
"metadata": {},
156+
"output_type": "execute_result"
157+
}
158+
],
159+
"source": [
160+
"chat_g.message"
161+
]
162+
},
163+
{
164+
"cell_type": "code",
165+
"execution_count": 12,
166+
"id": "6b854b89",
167+
"metadata": {},
168+
"outputs": [],
169+
"source": [
170+
"chat_g.generation_info"
171+
]
172+
},
173+
{
174+
"cell_type": "code",
175+
"execution_count": 22,
176+
"id": "ffacf15f",
177+
"metadata": {},
178+
"outputs": [
179+
{
180+
"data": {
181+
"text/plain": [
182+
"{'token_usage': {'prompt_tokens': 53,\n",
183+
" 'completion_tokens': 20,\n",
184+
" 'total_tokens': 73},\n",
185+
" 'model_name': 'gpt-3.5-turbo'}"
186+
]
187+
},
188+
"execution_count": 22,
189+
"metadata": {},
190+
"output_type": "execute_result"
191+
}
192+
],
193+
"source": [
194+
"result.llm_output"
195+
]
196+
},
197+
{
198+
"cell_type": "markdown",
199+
"id": "ceab930c",
200+
"metadata": {},
201+
"source": [
202+
"### *can you use chat_prompts in non-chat models?*\n",
203+
"\n"
204+
]
205+
},
206+
{
207+
"cell_type": "code",
208+
"execution_count": 21,
209+
"id": "68e8c5e5",
210+
"metadata": {},
211+
"outputs": [
212+
{
213+
"name": "stdout",
214+
"output_type": "stream",
215+
"text": [
216+
"System: You are a helpful assistant that translates English to French.\n",
217+
"Human: I love programming.\n"
218+
]
219+
},
220+
{
221+
"data": {
222+
"text/plain": [
223+
"\"\\n\\nSystem: J'adore le programmation.\""
224+
]
225+
},
226+
"execution_count": 21,
227+
"metadata": {},
228+
"output_type": "execute_result"
229+
}
230+
],
231+
"source": [
232+
"from langchain.llms import OpenAI\n",
233+
"from langchain.prompts import ChatPromptTemplate\n",
234+
"\n",
235+
"prompt = ChatPromptTemplate.from_messages(messages)\n",
236+
"print(prompt.format())\n",
237+
"llm = OpenAI()\n",
238+
"llm(prompt.format())"
239+
]
240+
},
241+
{
242+
"cell_type": "code",
243+
"execution_count": 25,
244+
"id": "f3e47d3b",
245+
"metadata": {},
246+
"outputs": [
247+
{
248+
"name": "stdout",
249+
"output_type": "stream",
250+
"text": [
251+
"Human: Translate this sentence from English to French: I love programming.\n"
252+
]
253+
},
254+
{
255+
"data": {
256+
"text/plain": [
257+
"'\\n\\nJe adore la programmation.'"
258+
]
259+
},
260+
"execution_count": 25,
261+
"metadata": {},
262+
"output_type": "execute_result"
263+
}
264+
],
265+
"source": [
266+
"without_system_message = [HumanMessage(\n",
267+
" content=\"Translate this sentence from English to French: I love programming.\"\n",
268+
")]\n",
269+
"prompt = ChatPromptTemplate.from_messages(without_system_message)\n",
270+
"print(prompt.format())\n",
271+
"\n",
272+
"llm(prompt.format())"
273+
]
274+
},
275+
{
276+
"cell_type": "code",
277+
"execution_count": null,
278+
"id": "0e8063bd",
279+
"metadata": {},
280+
"outputs": [],
281+
"source": []
282+
}
283+
],
284+
"metadata": {
285+
"kernelspec": {
286+
"display_name": "Python 3 (ipykernel)",
287+
"language": "python",
288+
"name": "python3"
289+
},
290+
"language_info": {
291+
"codemirror_mode": {
292+
"name": "ipython",
293+
"version": 3
294+
},
295+
"file_extension": ".py",
296+
"mimetype": "text/x-python",
297+
"name": "python",
298+
"nbconvert_exporter": "python",
299+
"pygments_lexer": "ipython3",
300+
"version": "3.10.12"
301+
}
302+
},
303+
"nbformat": 4,
304+
"nbformat_minor": 5
305+
}
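
The last two notebook cells show that `ChatPromptTemplate.format()` flattens role-tagged messages into a single "Role: content" string that a plain "text in, text out" completion model can consume. A minimal sketch of that flattening (a simplified illustration, not LangChain's actual implementation; `Message` and `format_as_text` are made-up names):

```python
from dataclasses import dataclass


@dataclass
class Message:
    role: str      # e.g. "System", "Human", "AI"
    content: str


def format_as_text(messages: list[Message]) -> str:
    # Each message becomes one "Role: content" line, matching the
    # "System: ...\nHuman: ..." string printed in the notebook.
    return "\n".join(f"{m.role}: {m.content}" for m in messages)


messages = [
    Message("System", "You are a helpful assistant that translates English to French."),
    Message("Human", "I love programming."),
]
print(format_as_text(messages))
```

Because the roles end up as plain text, the completion model has no special handling for them, which is why its reply above came back prefixed with `System:`: it simply continued the transcript.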
