
LangChain Java - a framework for developing applications with LLMs

Original post by 数据探险家 · Published 2023-06-14

https://github.com/HamaWhiteGG/langchain-java

1. What is this?

This is the Java language implementation of LangChain.

Large language models (LLMs) are emerging as a transformative technology, enabling developers to build applications that they previously could not. But using these LLMs in isolation is often not enough to create a truly powerful app - the real power comes when you can combine them with other sources of computation or knowledge.

This library is aimed at assisting in the development of those types of applications.

Looking for the Python version? Check out LangChain.

2. Quickstart Guide

This tutorial gives you a quick walkthrough of building an end-to-end language model application with LangChain.

View the Quickstart Guide on the LangChain official website.

2.1 Maven Repository

Prerequisites for building:

  • Java 17 or later
  • Unix-like environment (we use Linux, Mac OS X)
  • Maven (we recommend version 3.8.6 and require at least 3.5.4)
Add the following dependency to your Maven project's pom.xml:
<dependency>
    <groupId>io.github.hamawhitegg</groupId>
    <artifactId>langchain-core</artifactId>
    <version>0.1.6</version>
</dependency>

2.2 Environment Setup

Using LangChain will usually require integrations with one or more model providers, data stores, APIs, etc.

For this example, we will be using OpenAI’s APIs.

We then need to set the OPENAI_API_KEY environment variable.

export OPENAI_API_KEY=xxx

# If a proxy is needed, set the OPENAI_PROXY environment variable.
export OPENAI_PROXY=http://host:port

If you want to set the API key and proxy dynamically, you can use the openaiApiKey and openaiProxy parameters when initializing the OpenAI class.

var llm = OpenAI.builder()
        .openaiApiKey("xxx")
        .openaiProxy("http://host:port")
        .build()
        .init();
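The openaiApiKey and openaiProxy parameters can come from any source your application trusts. As a minimal sketch (assuming the fluent builder above simply accumulates settings), the values can be read explicitly with standard System.getenv and passed through the same builder methods; a configuration file or secrets manager would work the same way:

String apiKey = System.getenv("OPENAI_API_KEY");   // sourced from the environment set above
String proxy = System.getenv("OPENAI_PROXY");      // may be null if no proxy is needed

var builder = OpenAI.builder().openaiApiKey(apiKey);
if (proxy != null && !proxy.isBlank()) {
    // only configure a proxy when one was actually provided
    builder.openaiProxy(proxy);
}
var llm = builder.build().init();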

The complete example code can be found in QuickStart.java.

2.3 LLMs: Get predictions from a language model

The most basic building block of LangChain is calling an LLM on some input. Let’s walk through a simple example of how to do this. For this purpose, let’s pretend we are building a service that generates a company name based on what the company makes.

var llm = OpenAI.builder()
        .temperature(0.9f)
        .build()
        .init();

String text = "What would be a good company name for a company that makes colorful socks?";
System.out.println(llm.call(text));
Output:
Feetful of Fun
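The temperature value above controls how much randomness the model adds to its completions. As a rough sketch, setting it to 0 (as the later examples do) makes the output as repeatable as the API allows, which is usually what you want in automated tests; the exact string returned still depends on the model:

// Sketch: temperature 0 for (near-)deterministic completions; 0.9 above gives more varied names.
var deterministicLlm = OpenAI.builder()
        .temperature(0)
        .build()
        .init();

System.out.println(deterministicLlm.call(
        "What would be a good company name for a company that makes colorful socks?"));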

2.4 Prompt Templates: Manage prompts for LLMs

Calling an LLM is a great first step, but it’s just the beginning. Normally when you use an LLM in an application, you are not sending user input directly to the LLM. Instead, you are probably taking user input and constructing a prompt, and then sending that to the LLM.

var prompt = new PromptTemplate(List.of("product"),
        "What is a good name for a company that makes {product}?");

System.out.println(prompt.format(Map.of("product", "colorful socks")));
Output:
What is a good name for a company that makes colorful socks?
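Templates are not limited to one placeholder. A minimal sketch with two input variables, using only the constructor and format method shown above (the audience variable is made up for illustration):

// Sketch: every variable listed in the first argument must appear in the map passed to format().
var multiPrompt = new PromptTemplate(List.of("product", "audience"),
        "What is a good name for a company that makes {product} for {audience}?");

System.out.println(multiPrompt.format(Map.of(
        "product", "colorful socks",
        "audience", "marathon runners")));

This prints the fully substituted prompt text; nothing is sent to the LLM until you call it yourself or combine both pieces in a chain, as in the next section.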

2.5 Chains: Combine LLMs and prompts in multi-step workflows

Up until now, we’ve worked with the PromptTemplate and LLM primitives by themselves. But of course, a real application is not just one primitive, but rather a combination of them.

A chain in LangChain is made up of links, which can be either primitives like LLMs or other chains.

2.5.1 LLM Chain

The most fundamental type of chain is an LLMChain, which consists of a PromptTemplate and an LLM.

var llm = OpenAI.builder()
        .temperature(0.9f)
        .build()
        .init();

var prompt = new PromptTemplate(List.of("product"),
        "What is a good name for a company that makes {product}?");

var chain = new LLMChain(llm, prompt);
System.out.println(chain.run("colorful socks"));
Output:
\n\nSocktastic!
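Conceptually, the chain above does little more than format the template and forward the result to the LLM; a rough hand-rolled equivalent, using only the pieces introduced earlier, looks like this:

// Sketch: roughly what LLMChain does for one input - render the prompt, then call the LLM.
String renderedPrompt = prompt.format(Map.of("product", "colorful socks"));
System.out.println(llm.call(renderedPrompt));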

2.5.2 SQL Chain

This example demonstrates the use of the SQLDatabaseChain for answering questions over a database.

var database = SQLDatabase.fromUri("jdbc:mysql://127.0.0.1:3306/demo", "xxx", "xxx");

var llm = OpenAI.builder()
        .temperature(0)
        .build()
        .init();

var chain = SQLDatabaseChain.fromLLM(llm, database);
System.out.println(chain.run("How many students are there?"));
Output:
There are 6 students.
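The same chain object can be reused for further natural-language questions against that database; for instance (the column referenced below is hypothetical and only illustrates reuse):

// Sketch: reuse the chain for another question; "score" is a hypothetical column in the demo schema.
System.out.println(chain.run("What is the average score of the students?"));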

2.6 Agents: Dynamically Call Chains Based on User Input

So far, the chains we have looked at run in a predetermined order. Agents no longer do: they use an LLM to determine which actions to take and in what order. An action can either be using a tool and observing its output, or returning to the user.

When used correctly, agents can be extremely powerful. In this tutorial, we show you how to easily use agents through the simplest, highest-level API.

Set the appropriate environment variables.

export SERPAPI_API_KEY=xxx

Now we can get started!

var llm = OpenAI.builder()
        .temperature(0)
        .build()
        .init();

// load some tools to use.
var tools = loadTools(List.of("serpapi", "llm-math"), llm);

// initialize an agent with the tools, the language model, and the type of agent
var agent = initializeAgent(tools, llm, AgentType.ZERO_SHOT_REACT_DESCRIPTION);

// let's test it out!
String text =
        "What was the high temperature in SF yesterday in Fahrenheit? What is that number raised to the .023 power?";
System.out.println(agent.run(text));
Output:
I need to find the temperature first, then use the calculator to raise it to the .023 power.

Action: Search
Action Input: "High temperature in SF yesterday"
Observation: San Francisco Weather History for the Previous 24 Hours ; 60 °F · 60 °F · 61 °F ...

Thought: I now have the temperature, so I can use the calculator to raise it to the .023 power.
Action: Calculator
Action Input: 60^.023
Observation: Answer: 1.09874643447

Thought: I now know the final answer
Final Answer: 1.09874643447

1.09874643447
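If you want to try the agent machinery without a SerpAPI key, a minimal sketch is to load only the llm-math tool and ask a pure calculation question, reusing the same loadTools and initializeAgent calls from above:

// Sketch: an agent with only the calculator tool, so no SERPAPI_API_KEY is required.
var mathTools = loadTools(List.of("llm-math"), llm);
var mathAgent = initializeAgent(mathTools, llm, AgentType.ZERO_SHOT_REACT_DESCRIPTION);

System.out.println(mathAgent.run("What is 60 raised to the .023 power?"));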

3. Run Test Cases from Source

git clone https://github.com/HamaWhiteGG/langchain-java.git
cd langchain-java

# export JAVA_HOME=JDK17_INSTALL_HOME && mvn clean test
mvn clean test

4. Apply Spotless

cd langchain-java

# export JAVA_HOME=JDK17_INSTALL_HOME && mvn spotless:apply
mvn spotless:apply

5. Support

Don’t hesitate to ask!

Open an issue if you find a bug in langchain-java.

6. Fork and Contribute

This is an active open-source project. We are always open to people who want to use the system or contribute to it.

Contact me if you are looking for implementation tasks that fit your skills.

