Need help to access a customised LLM
My firm has restricted access to the LLM and has built a wrapper around it, accessed with a customised API key. I can successfully connect to the model using the JSON request and payload format below. I need help using this with standard packages like LangChain, phidata, and crewai, since these packages support standard LLM model classes such as OpenAIChat. Any help/leads would be really appreciated.
A sample that works well:
import json
import requests

def call_llm_with_context(context, query):
    url = "https://API"
    payload = json.dumps({
        "model": "internal model",
        "max_tokens": 4096,
        "temperature": 1,
        "stream": False,
        "messages": [
            {"role": "system", "content": f"{context}"},
            {"role": "user", "content": f"Answer this based on above context: {query}"},
        ],
    })
    headers = {
        "Authorization": "Bearer Token",
        "Content-Type": "application/json",
    }
    response = requests.request("POST", url, headers=headers, data=payload)
    return response.json()["choices"][0]["message"]["content"]
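Worth noting: the payload above already follows the standard OpenAI chat-completions schema (model / messages / choices), which is exactly why OpenAI-compatible client classes in these packages can target such an endpoint. A minimal sketch of that payload as a helper, with the model id and prompt wording taken as placeholders from the snippet above:

```python
import json

def build_chat_payload(context, query, model="internal model",
                       max_tokens=4096, temperature=1):
    # Builds the same OpenAI-style chat-completions body the wrapper
    # accepts. "internal model" is a placeholder for the firm's model id.
    return json.dumps({
        "model": model,
        "max_tokens": max_tokens,
        "temperature": temperature,
        "stream": False,
        "messages": [
            {"role": "system", "content": context},
            {"role": "user",
             "content": f"Answer this based on above context: {query}"},
        ],
    })
```

Because the shape matches, any client that lets you override the base URL and API key can reuse it unchanged.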
If I want to use something like the sample snippet below from phidata (OpenAILike), how would I use my API with these existing packages?
from os import getenv
from phi.agent import Agent, RunResponse
from phi.model.openai.like import OpenAILike
agent = Agent(
    model=OpenAILike(
        id="mistralai/Mixtral-8x7B-Instruct-v0.1",
        api_key=getenv("TOGETHER_API_KEY"),
        base_url="https://api.together.xyz/v1",
    )
)
# Get the response in a variable
# run: RunResponse = agent.run(“Share a 2 sentence horror story.”)
# print(run.content)
# Print the response in the terminal
agent.print_response(“Share a 2 sentence horror story.”)
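Since the internal wrapper speaks the OpenAI chat-completions format, the same OpenAILike class should be able to point at it directly. A sketch, where the base URL, model id, and token environment variable are assumptions to swap for your firm's actual values:

```python
from os import getenv

from phi.agent import Agent
from phi.model.openai.like import OpenAILike

# Assumptions: "https://API" stands in for the wrapper's OpenAI-compatible
# base URL, "internal model" for the model id it expects, and the bearer
# token is exported as INTERNAL_API_KEY. Adjust all three to your setup.
agent = Agent(
    model=OpenAILike(
        id="internal model",
        api_key=getenv("INTERNAL_API_KEY"),  # sent as "Authorization: Bearer <token>"
        base_url="https://API",              # wrapper endpoint from the snippet above
    )
)

agent.print_response("Share a 2 sentence horror story.")
```

LangChain's ChatOpenAI and crewai's LLM classes expose similar base_url/api_key parameters, so the same values should carry over there too.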