
Commit ab57830

eolivelli and mendonk authored
Add docs for new Python utils in 0.4.x and for LangServe integration (#127)
Co-authored-by: Mendon Kissling <59585235+mendonk@users.noreply.github.com>
1 parent 4787816 commit ab57830

File tree: 12 files changed (+413, −27 lines)


SUMMARY.md

Lines changed: 5 additions & 0 deletions
@@ -52,6 +52,7 @@

* [CLI Commands](langstream-cli/langstream-cli-commands.md)
* [CLI Configuration](langstream-cli/langstream-cli-configuration.md)
+* [Web interface](langstream-cli/langstream-ui.md)

## Integrations

@@ -68,6 +69,7 @@
* [Solr](configuration-resources/data-storage/solr.md)
* [JDBC](configuration-resources/data-storage/jdbc.md)
* [OpenSearch](configuration-resources/data-storage/opensearch.md)
+* [LangServe](configuration-resources/langserve/README.md)

## Pipeline Agents

@@ -110,9 +112,11 @@
* [Agent Types](pipeline-agents/agent-developer-guide/agent-types.md)
* [Agent Creation](pipeline-agents/agent-developer-guide/agent-creation.md)
* [Configuration and Testing](pipeline-agents/agent-developer-guide/configuration-and-testing.md)
+* [Environment variables](pipeline-agents/agent-developer-guide/enviroment.md)
* [Python sink](pipeline-agents/custom-agents/python-sink.md)
* [Python source](pipeline-agents/custom-agents/python-source.md)
* [Python processor](pipeline-agents/custom-agents/python-function.md)
+* [Python service](pipeline-agents/custom-agents/python-service.md)

## Messaging

@@ -127,4 +131,5 @@

## Examples

+* [LangServe chatbot](configuration-resources/langserve/README.md)
* [LlamaIndex Cassandra sink](examples/llamaindex-cassandra-sink.md)
configuration-resources/langserve/README.md

Lines changed: 116 additions & 0 deletions (new file)
@@ -0,0 +1,116 @@
---
layout:
  title:
    visible: true
  description:
    visible: true
  tableOfContents:
    visible: true
  outline:
    visible: false
  pagination:
    visible: false
---

# Integration with LangServe applications

[LangServe](https://github.com/langchain-ai/langserve) is a popular runtime to execute LangChain applications.

LangStream natively integrates with LangServe and allows you to invoke services exposed by LangServe applications.

Use the built-in `langserve-invoke` agent to implement this integration.

This example invokes a LangServe application that exposes a service at `http://localhost:8000/chain/stream`.

```yaml
topics:
  - name: "input-topic"
    creation-mode: create-if-not-exists
  - name: "output-topic"
    creation-mode: create-if-not-exists
  - name: "streaming-answers-topic"
    creation-mode: create-if-not-exists
pipeline:
  - type: "langserve-invoke"
    input: input-topic
    output: output-topic
    id: step1
    configuration:
      output-field: value.answer
      stream-to-topic: streaming-answers-topic
      stream-response-field: value
      min-chunks-per-message: 10
      debug: false
      method: POST
      allow-redirects: true
      handle-cookies: false
      url: "http://host.docker.internal:8000/chain/stream"
      fields:
        - name: topic
          expression: "value"
```
When you run the LangStream application in Docker, the URL is `http://host.docker.internal:8000/chain/stream`; Docker Desktop uses that special hostname to let containers reach services running on the host machine.

To make your LangStream application accessible from a UI, configure a gateway:

```yaml
gateways:
  - id: chat
    type: chat
    chat-options:
      answers-topic: streaming-answers-topic
      questions-topic: input-topic
      headers:
        - value-from-parameters: session-id
```
## Starting the LangServe application locally

This is the sample code for the LangServe application:

```python
from fastapi import FastAPI
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models import ChatOpenAI
from langserve import add_routes


app = FastAPI(
    title="LangChain Server",
    version="1.0",
    description="A simple api server using Langchain's Runnable interfaces",
)

model = ChatOpenAI()
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")
add_routes(
    app,
    prompt | model,
    path="/chain",
)

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="localhost", port=8000)
```
Start the LangServe application with the following command:

```bash
export OPENAI_API_KEY=...
pip install fastapi langserve langchain openai sse_starlette uvicorn
python example.py
```
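Before wiring the pipeline to the service, you can check that the endpoint responds. The snippet below is a quick sanity check, not part of the original page; it assumes the standard LangServe request schema (the chain input goes under an `input` key) and that the `requests` package is installed. This is presumably the same shape of request that the `langserve-invoke` agent builds from the `fields` mapping shown above.

```python
# Minimal sanity check for the LangServe service started above (illustrative only).
# The /chain/invoke route is the non-streaming counterpart of /chain/stream.
import requests

response = requests.post(
    "http://localhost:8000/chain/invoke",
    json={"input": {"topic": "parrots"}},
    timeout=30,
)
response.raise_for_status()
print(response.json()["output"])  # the serialized chat model answer
```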
## Starting the LangStream application locally

To run the LangStream application locally on Docker:

```bash
langstream docker run test -app /path/to/application
```

The LangStream UI will be running at [http://localhost:8092/](http://localhost:8092/).

examples/llamaindex-cassandra-sink.md

Lines changed: 60 additions & 0 deletions
@@ -28,6 +28,66 @@ pipeline:
    table: vs_ll_openai
```

+This is the code for the python sink:
+
+```python
+import base64
+import io
+from typing import Dict, Any
+
+import openai
+from cassandra.auth import PlainTextAuthProvider
+from cassandra.cluster import Cluster
+from langstream import Sink, Record
+from llama_index import VectorStoreIndex, Document
+from llama_index.vector_stores import CassandraVectorStore
+
+
+class LlamaIndexCassandraSink(Sink):
+    def __init__(self):
+        self.config = None
+        self.session = None
+        self.index = None
+
+    def init(self, config: Dict[str, Any]):
+        self.config = config
+        openai.api_key = config["openaiKey"]
+
+    def start(self):
+        secure_bundle = self.config["cassandra"]["secureBundle"]
+        secure_bundle = secure_bundle.removeprefix("base64:")
+        secure_bundle = base64.b64decode(secure_bundle)
+        cluster = Cluster(
+            cloud={
+                "secure_connect_bundle": io.BytesIO(secure_bundle),
+                "use_default_tempdir": True,
+            },
+            auth_provider=PlainTextAuthProvider(
+                self.config["cassandra"]["username"],
+                self.config["cassandra"]["password"],
+            ),
+        )
+        self.session = cluster.connect()
+
+        vector_store = CassandraVectorStore(
+            session=self.session,
+            keyspace=self.config["cassandra"]["keyspace"],
+            table=self.config["cassandra"]["table"],
+            embedding_dimension=1536,
+            insertion_batch_size=15,
+        )
+
+        self.index = VectorStoreIndex.from_vector_store(vector_store)
+
+    def write(self, record: Record):
+        self.index.insert(Document(text=record.value()))
+
+    def close(self):
+        if self.session:
+            self.session.shutdown()
+```
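Not part of the commit, but one detail of the sink above is worth calling out: `start()` expects the Astra secure connect bundle to arrive in the configuration as a base64 string carrying a `base64:` prefix. A minimal sketch of producing such a value (the file name is hypothetical):

```python
# Encode a secure connect bundle the way the sink's start() method expects it:
# a "base64:" prefix followed by the base64-encoded zip contents.
import base64

with open("secure-connect-bundle.zip", "rb") as bundle:
    encoded = "base64:" + base64.b64encode(bundle.read()).decode("ascii")

print(encoded[:60] + "...")  # paste the full value into the agent configuration / secrets
```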

### Topics

**Input**

installation/docker.md

Lines changed: 10 additions & 1 deletion
@@ -104,7 +104,7 @@ If you have built [the CLI locally from the sources](../installation/build-and-i
You can also override the command used to launch the container; the default value is `docker`, but you can pass the `--docker-command` flag to use a different binary path.


-### Connect to the docker application
+### Connect to the docker application using the CLI
The docker container exposes the API gateway on port `8091` and the control plane on port `8090` of your local machine.
To connect to the docker container, it's highly suggested to use a special profile named `local-docker-run`.
This profile ensures you will always connect to the right endpoints.
@@ -119,3 +119,12 @@ or for getting the application description:
langstream -p local-docker-run apps get test -o yaml
```

+### Connect to the docker application using a Web interface
+
+By default, the CLI starts a web interface that you can use to test your application. The web interface is available at [http://localhost:8092/](http://localhost:8092/).
+This interface displays:
+
+* the application logs
+* a chatbot-like interface to interact with the gateways
+* a diagram of the application pipelines
+* the JSON description of the application, both the logical and the physical description (execution plan)

langstream-cli/langstream-ui.md

Lines changed: 23 additions & 0 deletions
@@ -0,0 +1,23 @@
# LangStream Web Interface

The LangStream CLI provides a web interface that you can use to test your application.
The web interface is available at [http://localhost:8092/](http://localhost:8092/).

This interface displays:

* the application logs
* a chatbot-like interface to interact with the gateways
* a diagram of the application pipelines
* the JSON description of the application, both the logical and the physical description (execution plan)

The UI is started automatically when you run the application in docker mode (with `langstream docker run ...`), but you can also start it manually using the `langstream apps ui` command:

```bash
langstream apps ui application-id
```

With this command the CLI starts a local web service bound to localhost and proxies requests to the LangStream services, both the Control Plane and the API gateway.

The connection to these services is defined in the [CLI configuration](./langstream-cli-configuration.md).

pipeline-agents/agent-developer-guide/README.md

Lines changed: 4 additions & 3 deletions
@@ -10,9 +10,10 @@ Developing your own agent for LangStream is quite simple, using Python best prac

The Agent Developer Guide is broken into four milestones:

-1\. [Agent Types](agent-types.md) - Understand the three main Python agents you'll use in your applications.\
-2\. [Deploying Agents](broken-reference) - Create agents to process records. Handle exceptions for each agent type.\
-3\. [Configuration and Testing](broken-reference) - Configure, test, and package your agents for production.
+1\. [Agent Types](./agent-types.md) - Understand the three main Python agents you'll use in your applications.\
+2\. [Deploying Agents](./agent-creation.md) - Create agents to process records. Handle exceptions for each agent type.\
+3\. [Configuration and Testing](./configuration-and-testing.md) - Configure, test, and package your agents for production.\
+4\. [Environment variables](./enviroment.md) - Use environment variables in your agents' configuration.

Get started with [Part 1: Agent Types.](agent-types.md)

pipeline-agents/agent-developer-guide/agent-creation.md

Lines changed: 24 additions & 1 deletion
@@ -8,7 +8,7 @@ description: Part 2 of the Agent Developer Guide
This is Part 2 of the Agent Developer Guide. Start at the beginning [here.](./)
{% endhint %}

-Once you have built, tested, and packaged the agent you will need to include it as a part of the LangStream application deployment. Within the “application” directory create a directory named “python”. Within that directory place all the files included in packaging.
+Within the “application” directory, create a directory named “python”. Within that directory, place all the files included in packaging.

```
|- application
@@ -26,6 +26,29 @@ To include the agent as a step in the pipeline, set the className to match the e
  className: main.MySourceAgent
```
+### Development environment
+
+You can run your application locally in a docker container. This is the recommended way to develop your agent.
+You can use your IDE, such as Visual Studio Code, to develop your agent. You can also use the [VS Code LangStream Extension](https://github.com/LangStream/vscode-extension) to get started even faster.
+
+Start the application locally using this command:
+
+```bash
+langstream docker run test -app /path/to/application
+```
+
+This launches a docker container with a docker image containing the same runtime that you are going to use in production.
+The container will run your application and print the logs to the console.
+If you have a gateways.yaml file, it will also start a local gateway that you can use to test your agent.
+
+The UI runs at http://localhost:8092/
+
+The application runs in a process inside the container.
+
+When you change and save a Python file, the process is automatically reloaded in order to pick up your changes.
+This way you don't need to restart the container or the Python process manually.

### Agent records

When developing a custom agent, your contract with the LangStream runtime will be implementing the correct methods as well as working with the `Record` type.
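Not part of the commit, but for orientation: a minimal sketch of what the `main.MySourceAgent` class referenced by `className` above could look like. It assumes the `Source` interface from the `langstream` Python package (an optional `init()` plus a `read()` method returning a list of records) and uses `SimpleRecord` for illustration, mirroring the `Sink` example shown earlier on this page.

```python
# main.py - a minimal, illustrative source agent (a sketch, not the official template).
from typing import Any, Dict, List

from langstream import Source, SimpleRecord


class MySourceAgent(Source):
    def init(self, config: Dict[str, Any]):
        # The dict comes from the agent's `configuration` block in the pipeline file.
        self.greeting = config.get("greeting", "hello")

    def read(self) -> List[SimpleRecord]:
        # The runtime calls read() in a loop; each returned record is emitted downstream.
        # A real source would typically wait for, or poll, an external system here.
        return [SimpleRecord(value=f"{self.greeting} from MySourceAgent")]
```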

pipeline-agents/agent-developer-guide/agent-types.md

Lines changed: 7 additions & 1 deletion
@@ -26,6 +26,12 @@ Processor agents are typically placed throughout an application’s pipeline.

A processor agent might manipulate data as it flows through the pipeline, or could add in context to help downstream agents make decisions. A processor agent is responsible for accepting a list of `Record`s as input, doing some processing as necessary, and returning a `Record` (or many `Record`s) as a result.

+### Service
+
+Service agents are generic applications that usually do not process streaming data in the scope of a pipeline.
+
+Typically, a Service exposes an API that can be consumed by external applications. For instance, you can build your chatbot UI as a Service agent.
+
### What's next?

-Continue on to Part 2 of the Agent Developer Guide, [Agent Creation.](broken-reference)
+Continue on to Part 2 of the Agent Developer Guide, [Agent Creation.](agent-creation.md)

pipeline-agents/agent-developer-guide/configuration-and-testing.md

Lines changed: 14 additions & 21 deletions
@@ -51,16 +51,14 @@ When you are ready to package the agent for deployment to LangStream, use the fo
The command assumes you are running it from the “application” folder and your dependencies are declared in "python/requirements.txt".

```bash
-docker run --rm \
-  -v $(pwd):/app-code-download \
-  --entrypoint "" \
-  -w /app-code-download/python \
-  ghcr.io/langstream/langstream-runtime:0.1.0 \
-  /bin/bash -c 'pip3 install --target ./lib --upgrade --prefer-binary -r requirements.txt'
+langstream python load-pip-requirements -app /path/to/application
```

{% hint style="info" %}
-Note the version of LangStream was provided as the image’s tag. This should match the version of LangStream you are developing for.
+The command above creates a “python/lib” folder with the dependencies installed. This folder is added to the PYTHONPATH environment variable when the agent runs on LangStream. The command uses the same docker image as the runtime, with the same versions of Python and of the core libraries that will run in production.
+This is very important, especially if you are using Mac or Windows to develop your agent.
{% endhint %}

#### **Unit testing**
@@ -70,25 +68,20 @@ Similar to packaging, the below Docker command is a starting suggestion of how t
Using unittest:

```bash
-docker run --rm \
-  -v $(pwd):/app-code-download \
-  --entrypoint "" \
-  -w /app-code-download/python \
-  ghcr.io/langstream/langstream-runtime:0.1.0 \
-  /bin/bash -c 'PYTHONPATH=$PYTHONPATH:/app-code-download/python/lib python3 -m unittest'
+langstream python run-tests -app /path/to/application
```

Using tox:

```bash
-docker run --rm \
-  -v $(pwd):/app-code-download \
-  --entrypoint "" \
-  -w /app-code-download/python \
-  ghcr.io/langstream/langstream-runtime:0.1.0 \
-  /bin/bash -c 'tox'
+langstream python run-tests -app /path/to/application -c tox
```
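Not part of the commit, but as an illustration of the kind of test these commands would pick up: a minimal unittest module that could live in the “python” folder. The file and class names are hypothetical, and it exercises the `MySourceAgent` sketched earlier directly, without the LangStream runtime.

```python
# python/test_my_source_agent.py - illustrative unit test for the source agent sketch.
import unittest

from main import MySourceAgent  # the agent class referenced by className in the pipeline


class MySourceAgentTest(unittest.TestCase):
    def test_read_returns_configured_greeting(self):
        agent = MySourceAgent()
        agent.init({"greeting": "hi"})

        records = agent.read()

        self.assertEqual(1, len(records))
        self.assertIn("hi", records[0].value())


if __name__ == "__main__":
    unittest.main()
```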

-### Multiple Python apps in one LangStream application
+### Multiple Python apps with dependency conflicts
+
+If your LangStream application consists of more than one custom agent and you have dependency conflicts, it is recommended that you separate them into two different applications. They can share input or output topics, or be connected to one another indirectly by topic. Separating by application gives you two clear “python” folders to house your artifacts. This helps avoid dependency collisions and other effects of two apps trying to share the same folder.
+
+### What's next?

-If your LangStream application consists of more than one custom agent, it is recommended that you separate them into 2 different applications. They can share input or output topics or be put inline with one another indirectly by topic. Separating by application gives you two clear “python” folders to house your artifact. This will aid in dependency collisions and other effects of two apps trying to share the same folder.
+Continue on to Part 4 of the Agent Developer Guide, [Environment variables.](environment.md)
