
GitHub Models

JusterZhu

The first step in developing a generative AI application is choosing a model, and how to choose one is the key question. This involves:

  1. Comparing models against business scenarios during application development, for example how the same prompt performs under different models.
  2. Quickly comparing and switching between multiple models.
  3. Adapting different models to new application frameworks and solutions so that projects can be completed more effectively.

The release of GitHub Models makes it much easier for developers and development teams to select models effectively during application development and to build applications on different application frameworks. Let's take a look at how I use GitHub Models to complete development in different scenarios.

Model comparison

In GitHub Models, the provided playground lets us compare how different models respond to the same prompt.

Let's take a look at a comparison between Phi-3-mini and Mistral Nemo.

Judging from the results, the two models are evenly matched.

Quick comparison and switching of multiple models

Above, we switched models in the playground to compare them under the same prompt. For development, a more direct approach may be needed: with the Azure AI Inference SDK you can switch between models quickly, and you can get Python, JavaScript, or REST access samples by selecting Code.

If we choose the Phi-3-mini scenario, for example, we can get its access sample under Code.
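The Python variant of that sample looks roughly like the sketch below (assuming the azure-ai-inference package is installed and a GitHub personal access token is exported as GITHUB_TOKEN; the model IDs are illustrative). Switching models is just a matter of changing the model parameter:

import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

# GitHub Models exposes an Azure AI Inference-compatible endpoint;
# the credential is a regular GitHub personal access token.
client = ChatCompletionsClient(
    endpoint="https://models.inference.ai.azure.com",
    credential=AzureKeyCredential(os.environ["GITHUB_TOKEN"]),
)

# Swap the model name (e.g. "Phi-3-mini-4k-instruct" vs "Mistral-nemo",
# illustrative IDs) to compare responses to the same prompt.
response = client.complete(
    model="Phi-3-mini-4k-instruct",
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Explain what GraphRAG is in one paragraph."),
    ],
)

print(response.choices[0].message.content)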

Of course, you can also jump straight into a ready-to-use programming environment through Codespaces.

Adaptation to different application frameworks

Generative AI applications are built by combining models with different application frameworks, such as GraphRAG. We can use the REST interface provided by GitHub Models to test model options other than GPT-4o, such as the latest Meta Llama 3.1 405B Instruct. Deploying this model locally is limited by computing power, which makes it hard for individuals and small teams to adopt. But with the interface provided by GitHub Models, we can complete the test in a local environment very simply.
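Before wiring the model into an application framework, you can sanity-check access with a plain REST call. The sketch below assumes the chat-completions path /chat/completions on the same endpoint used in settings.yaml later in this post, with a GitHub token exported as GITHUB_TOKEN:

curl https://models.inference.ai.azure.com/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $GITHUB_TOKEN" \
  -d '{"model": "meta-llama-3.1-405b-instruct", "messages": [{"role": "user", "content": "Hello"}]}'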

  1. Configure the environment

Install the GraphRAG Python library


pip install graphrag -U

  2. Create a GraphRAG project

mkdir -p ./ragmd/input

python -m graphrag.index --init --root ./ragmd
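The init command scaffolds the project under ./ragmd. Before indexing, copy the documents you want GraphRAG to build its graph from into ./ragmd/input (the default configuration reads plain-text files; the file name below is only a placeholder):

cp ./my-notes.txt ./ragmd/input/   # my-notes.txt is a placeholder for your own source document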


  3. Modify settings.yaml


encoding_model: cl100k_base
skip_workflows: []
llm:
  api_key: ${GRAPHRAG_API_KEY}
  type: openai_chat # or azure_openai_chat
  model: meta-llama-3.1-405b-instruct
  model_supports_json: true # recommended if this is available for your model.
  max_tokens: 4000
  api_base: https://models.inference.ai.azure.com

parallelization:
  stagger: 0.3

async_mode: threaded # or asyncio

embeddings:
  async_mode: threaded # or asyncio
  llm:
    api_key: ${GRAPHRAG_API_KEY}
    type: openai_embedding # or azure_openai_embedding
    model: jinaai
    api_base: http://localhost:5146/v1

Note: configure your GitHub token in the .env file.
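The .env file generated by the init step only needs the token that settings.yaml references as ${GRAPHRAG_API_KEY}; the value below is a placeholder:

GRAPHRAG_API_KEY=<your-github-personal-access-token>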

  4. Run

python -m graphrag.index --root ./ragmd

Test Results


python -m graphrag.query --root ./ragmd --method global "What's GraphRAG"

Through GitHub Models, we can quickly use the provided models for model comparison and for testing application development environments, so model and application testing can be completed more efficiently, even in environments with limited computing power.

Learning Resources

  1. Sign Up https://gh.io/models
  2. Introducing GitHub Models: A new generation of AI engineers building on GitHub https://github.blog/news-insights/product-news/introducing-github-models/
  3. Understand Phi-3 https://aka.ms/phi-3cookbook
  4. Learn about GraphRAG https://microsoft.github.io/
Originally published on the JusterZhu WeChat official account on 2024-08-02.
