OpenCHAT-mini2 / 支持类型.md
sanbo
update sth. at 2024-11-21 17:04:24
f7bf6aa


Hugging Face Inference features

Text

https://huggingface.co/docs/huggingface_hub/main/en/guides/inference#run-inference-on-servers

```python
from huggingface_hub import InferenceClient

messages = [{"role": "user", "content": "中国的首都是哪里?"}]
client = InferenceClient("meta-llama/Llama-3.2-11B-Vision-Instruct")
client.chat_completion(messages, max_tokens=100)
```

```python
from huggingface_hub import InferenceClient

messages = [{"role": "user", "content": "中国的首都是哪里?"}]
client = InferenceClient("google/gemma-2-2b-it")
client.chat_completion(messages, max_tokens=100)
```
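The snippets above return the full completion at once; `chat_completion` also accepts `stream=True` to yield tokens incrementally. A minimal sketch, with message construction factored into a helper (the `build_messages` name is my own; the streamed call is commented out because it needs network access and a Hugging Face token):

```python
def build_messages(question: str) -> list:
    """Wrap a single user question in the chat-completion message format."""
    return [{"role": "user", "content": question}]

# Streaming usage (requires network access and a Hugging Face token):
# from huggingface_hub import InferenceClient
# client = InferenceClient("google/gemma-2-2b-it")
# for chunk in client.chat_completion(build_messages("中国的首都是哪里?"),
#                                     max_tokens=100, stream=True):
#     print(chunk.choices[0].delta.content or "", end="")
```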

Images

Note: Chinese prompts are handled poorly.

```python
from huggingface_hub import InferenceClient

client = InferenceClient()
# 'image' is a PIL.Image object
image = client.text_to_image("An astronaut riding a horse on the moon.")
image.save("/Users/sanbo/Desktop/astronaut.png")
```

```python
from huggingface_hub import InferenceClient

client = InferenceClient("black-forest-labs/FLUX.1-dev")
# 'image' is a PIL.Image object
image = client.text_to_image("一个天上飞的乌龟.")
image.save("/Users/sanbo/Desktop/astronaut.png")
```

Visual question answering (visual_question_answering)

```python
from huggingface_hub import InferenceClient

client = InferenceClient()
client.visual_question_answering(
    image="https://huggingface.co/datasets/mishig/sample_images/resolve/main/tiger.jpg",
    question="What is the animal doing?",
)
```
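`visual_question_answering` returns a list of candidate answers, each carrying `answer` and `score` fields. A small sketch for picking the most confident one (the `top_answer` helper and the `VQAAnswer` stand-in class are my own; they only assume the elements expose those two attributes):

```python
from dataclasses import dataclass

@dataclass
class VQAAnswer:
    """Stand-in for the element type the VQA endpoint returns."""
    score: float
    answer: str

def top_answer(results) -> str:
    """Return the answer with the highest confidence score."""
    return max(results, key=lambda r: r.score).answer

demo = [VQAAnswer(0.2, "sleeping"), VQAAnswer(0.7, "lying down")]
```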

Translation

```python
from huggingface_hub import InferenceClient

client = InferenceClient()
client.translation("My name is Wolfgang and I live in Berlin")
# 'Mein Name ist Wolfgang und ich lebe in Berlin.'

# Pick a specific translation model:
client.translation("My name is Wolfgang and I live in Berlin",
                   model="Helsinki-NLP/opus-mt-en-zh")
```

For Chinese→English, use Helsinki-NLP/opus-mt-zh-en.
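The two OPUS-MT model names above differ only in direction, so a small lookup keeps call sites tidy (the `opus_model` helper and its table are my own sketch; the actual translation call is commented out since it needs network access):

```python
# Direction -> model name, using the two OPUS-MT checkpoints named above.
OPUS_MODELS = {
    ("en", "zh"): "Helsinki-NLP/opus-mt-en-zh",
    ("zh", "en"): "Helsinki-NLP/opus-mt-zh-en",
}

def opus_model(src: str, tgt: str) -> str:
    """Look up the OPUS-MT model for a (source, target) language pair."""
    return OPUS_MODELS[(src, tgt)]

# Usage (requires network access):
# from huggingface_hub import InferenceClient
# client = InferenceClient()
# client.translation("My name is Wolfgang and I live in Berlin",
#                    model=opus_model("en", "zh"))
```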

Using a specific model

Option 1: set the model when creating the client.

```python
from huggingface_hub import InferenceClient

client = InferenceClient(model="prompthero/openjourney-v4")
client.text_to_image("xxx")
```

Option 2: pass the model per call.

```python
from huggingface_hub import InferenceClient

client = InferenceClient()
client.text_to_image(..., model="prompthero/openjourney-v4")
```

Raw client requests

```python
from huggingface_hub import InferenceClient

client = InferenceClient()
response = client.post(
    json={"inputs": "An astronaut riding a horse on the moon."},
    model="stabilityai/stable-diffusion-2-1",
)
response.content  # raw image bytes
```

Supported models

https://huggingface.co/models?other=conversational&sort=likes

```python
from huggingface_hub import InferenceClient

client = InferenceClient("meta-llama/Meta-Llama-3-8B-Instruct")
client = InferenceClient("Qwen/Qwen2.5-Coder-32B-Instruct")
```

Supported tasks

Task page: https://huggingface.co/tasks