Welcome to the AusAI Overview page.
A Clarifai app is a place for you to organize all of your content, including models, workflows, inputs, and more.
For app owners, API keys and Collaborators have been moved under App Settings.
Chatbot Template
This Chatbot App Template serves as an extensive guide for building an AI chatbot swiftly and effectively, utilizing the capabilities of Clarifai's Large Language Models (LLMs).
What is an AI Chatbot?
An AI chatbot is a virtual assistant designed to interact with users via text or speech in a conversational manner. It leverages artificial intelligence, particularly natural language processing and LLMs, to understand user queries and provide appropriate responses, assistance, or information, enhancing the user experience across various platforms and services. Chatbots aim to mimic human conversation to provide efficient and helpful interactions.
As language models became more proficient at understanding and maintaining context within a conversation, chatbots got better at handling conversations and providing more coherent responses.
Use Cases
AI chatbot assistants are commonly used in various applications including customer service, virtual assistants, and information retrieval, among others.
Customer Service and Support: Chatbots handle inquiries, complaints, and FAQs, providing quick responses outside of business hours or as first-level support, reducing the workload on human agents.
E-commerce and Retail: They assist customers in product searches, recommendations, shopping assistance, and post-purchase support, including order tracking and returns.
Personal Assistants: Personalized AI chatbots help individuals manage schedules, set reminders, find information, and perform tasks like controlling smart home devices.
Build a Chatbot Using Clarifai Models
A chatbot assistant can be built with an LLM in two ways: without using external data, or with external data with the help of RAG.
Without External Data
Any Clarifai language model can work as a chatbot assistant: by carrying the conversation history in the prompt, the LLM maintains context within a conversation and provides more coherent responses.
Using Module
Clarifai's Chatbot Module lets you chat with several Large Language Models through a UI.
Chat with Mistral-7b-instruct and Mixtral
First, export your PAT as an environment variable. Then, import and initialize the API Client.
Find your PAT in your security settings.
export CLARIFAI_PAT={your personal access token}
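If you prefer to set the variable from inside Python rather than the shell, the standard `os.environ` approach works as well (the placeholder value below stands in for your actual token):

```python
import os

# Equivalent to the shell export above; replace the placeholder with your PAT.
os.environ["CLARIFAI_PAT"] = "your personal access token"
```
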
Instruction format
Both the Mistral-7b and Mixtral models follow the same instruction format, and the format must be strictly respected; otherwise, the model will generate sub-optimal outputs.
The template used to build a prompt for the Instruct model is defined as follows:
<s> [INST] Instruction [/INST] Model answer</s> [INST] Follow-up instruction [/INST]
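To make the format concrete, here is a small sketch (plain Python string handling, no Clarifai calls) that renders a two-turn conversation plus a new instruction into this template. The conversation contents are illustrative:

```python
# Render a two-turn conversation into the Mistral/Mixtral instruct template.
turns = [
    ("Hi there!", "Hello! How can I help you today?"),
    ("Can I ask a question?", "Of course, go ahead."),
]

prompt = "<s>"
for user_msg, model_msg in turns:
    # Each completed exchange is closed with </s>.
    prompt += f" [INST] {user_msg} [/INST] {model_msg}</s>"

# Append the next user instruction the model should answer.
prompt += " [INST] Which is bigger, a virus or a bacterium? [/INST]"

print(prompt)
```
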
from clarifai.client.model import Model

prompt_template = "[INST] {input} [/INST]"
prompt = ""

def chatbot(input_text):
    global prompt
    if input_text:
        if prompt == "":
            prompt = "<s>"
        prompt += prompt_template.format(input=input_text)
        # Model Predict
        model_prediction = Model("https://clarifai.com/mistralai/completion/models/mixtral-8x7B-Instruct-v0_1").predict_by_bytes(prompt.encode(), "text")
        reply = model_prediction.outputs[0].data.text.raw
        prompt += reply
        prompt += "</s>"
        return reply

assistant_reply1 = chatbot("Hi there!")
print(assistant_reply1)

assistant_reply2 = chatbot("Can I ask a question?")
print(assistant_reply2)

assistant_reply3 = chatbot("Which is bigger, a virus or a bacterium?")
print(assistant_reply3)
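The script above keeps the conversation in a module-level global, so only one conversation can run at a time. A minimal sketch of the same bookkeeping wrapped in a class; the `predict_fn` parameter is an assumption introduced here so the Clarifai call can be swapped out (for example, stubbed in tests):

```python
class MistralChat:
    """Maintains one conversation in the Mistral/Mixtral instruct format."""

    def __init__(self, predict_fn):
        # predict_fn: takes the full prompt string, returns the model's reply string.
        self.predict_fn = predict_fn
        self.prompt = "<s>"

    def send(self, input_text):
        # Append the new instruction, get a reply, and close the exchange with </s>.
        self.prompt += f" [INST] {input_text} [/INST]"
        reply = self.predict_fn(self.prompt)
        self.prompt += f"{reply}</s>"
        return reply


def clarifai_predict(prompt):
    # Same call as in the script above; requires CLARIFAI_PAT to be set.
    from clarifai.client.model import Model
    model = Model("https://clarifai.com/mistralai/completion/models/mixtral-8x7B-Instruct-v0_1")
    return model.predict_by_bytes(prompt.encode(), "text").outputs[0].data.text.raw


# Usage: chat = MistralChat(clarifai_predict); print(chat.send("Hi there!"))
```

Because the model call is injected, several independent `MistralChat` conversations can coexist in one process.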
Chat with Gemma-7b-it
Similar to Mistral, below is a script for using the Gemma model as a chatbot.
from clarifai.client.model import Model

prompt_template = '''<start_of_turn>user
{input} <end_of_turn>
<start_of_turn>model
'''
prompt = ""

def chatbot(input_text):
    global prompt
    if input_text:
        if prompt == "":
            prompt = "<bos>"
        prompt += prompt_template.format(input=input_text)
        # Model Predict
        model_prediction = Model("https://clarifai.com/gcp/generate/models/gemma-7b-it").predict_by_bytes(prompt.encode(), "text")
        reply = model_prediction.outputs[0].data.text.raw
        prompt += reply
        # Close the model turn so the next user turn follows Gemma's chat format.
        prompt += "<end_of_turn>\n"
        return reply

assistant_reply1 = chatbot("Hi there!")
print(assistant_reply1)

assistant_reply2 = chatbot("Can I ask a question?")
print(assistant_reply2)

assistant_reply3 = chatbot("Which is bigger, a virus or a bacterium?")
print(assistant_reply3)
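Gemma's turn structure can be checked in isolation. A small sketch (pure string handling, no API calls, hypothetical helper name) that renders a conversation history into the Gemma chat template:

```python
def build_gemma_prompt(history, next_input):
    """history: list of (user_msg, model_msg) pairs; next_input: the new user message."""
    prompt = "<bos>"
    for user_msg, model_msg in history:
        # Each completed turn is wrapped in <start_of_turn>...<end_of_turn>.
        prompt += (f"<start_of_turn>user\n{user_msg}<end_of_turn>\n"
                   f"<start_of_turn>model\n{model_msg}<end_of_turn>\n")
    # Leave the final model turn open for the model to complete.
    prompt += f"<start_of_turn>user\n{next_input}<end_of_turn>\n<start_of_turn>model\n"
    return prompt

print(build_gemma_prompt([("Hi there!", "Hello!")], "Can I ask a question?"))
```
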
With External Data
We can use LLMs with an external knowledge base with the help of RAG.
RAG, or Retrieval-Augmented Generation, is an AI framework designed to enhance large language models (LLMs) by retrieving facts from an external knowledge base. This process not only ensures the provision of accurate, current information but also enriches the understanding of the generative mechanisms of LLMs.
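The retrieve-then-generate loop behind RAG can be sketched in a few lines of plain Python. The keyword-overlap retriever and the documents below are illustrative assumptions, not Clarifai APIs; a real pipeline would use vector embeddings for retrieval and send the assembled prompt to an LLM:

```python
import re

documents = [
    "Clarifai is an AI platform for computer vision and language models.",
    "RAG retrieves relevant documents and feeds them to an LLM as context.",
    "Mixtral is a sparse mixture-of-experts language model.",
]

def tokens(text):
    # Lowercase alphanumeric tokens, ignoring punctuation.
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(query, docs, k=1):
    # Toy retriever: rank documents by word overlap with the query.
    q = tokens(query)
    scored = sorted(docs, key=lambda d: len(q & tokens(d)), reverse=True)
    return scored[:k]

def answer(query):
    context = " ".join(retrieve(query, documents))
    # A real system would send this augmented prompt to an LLM instead of returning it.
    return f"Context: {context}\nQuestion: {query}"

print(answer("What is Clarifai?"))
```
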
The RAG template is a complete guide for using any LLM with external data and then using it as a chatbot.
RAG Chatbot with 4 lines of Code
A RAG chatbot can be built in just 4 lines of code using Clarifai's Python SDK!
Export your PAT as an environment variable. Then, import and initialize the API Client.
Find your PAT in your security settings.
export CLARIFAI_PAT={your personal access token}
from clarifai.rag import RAG
rag_agent = RAG.setup(user_id=YOUR_USER_ID)
rag_agent.upload(folder_path="~/docs")
rag_agent.chat(messages=[{"role":"human", "content":"What is Clarifai"}])
For a detailed walkthrough, refer to this video.
- Description: Aussie Chatbot, lets chat over some brekky or some arvo tea
- Base Workflow
- Last Updated: Aug 02, 2024
- Default Language: en