• Need help accessing a customised LLM

    Posted by Newgen V on March 18, 2025 at 4:57 pm

    My firm has restricted access to the LLM, and they built a wrapper to access it with a customised API key. I can successfully connect to the model with the JSON request format below. I'd like help using this with standard packages like LangChain, phidata, and crewAI, since these packages support standard LLM model classes like OpenAIChat. Any help or leads would be really appreciated.

    A sample that works well is below:

    import json
    import requests

    def call_llm_with_context(context, query):
        url = "https://API"
        payload = json.dumps({
            "model": "internal model",
            "max_tokens": 4096,
            "temperature": 1,
            "stream": False,
            "messages": [
                {"role": "system", "content": context},
                {"role": "user", "content": f"Answer this based on above context: {query}"},
            ],
        })
        headers = {
            "Authorization": "Bearer Token",
            "content-type": "application/json",
        }
        response = requests.request("POST", url, headers=headers, data=payload)
        return response.json()["choices"][0]["message"]["content"]
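    The request body above follows the OpenAI chat-completions shape, which is why OpenAI-compatible clients are a candidate for talking to this wrapper. A stdlib-only sketch of the same payload as a reusable builder (the model name and prompt wording are placeholders taken from the snippet, not real values):

    ```python
    import json

    def build_chat_payload(context, query, model="internal model", max_tokens=4096):
        """Build an OpenAI-style chat-completions body for the internal wrapper.

        The model name defaults to the placeholder used in the post above.
        """
        return json.dumps({
            "model": model,
            "max_tokens": max_tokens,
            "temperature": 1,
            "stream": False,
            "messages": [
                {"role": "system", "content": context},
                {"role": "user", "content": f"Answer this based on above context: {query}"},
            ],
        })
    ```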

    If I want to use something like the sample code snippet below from phidata for OpenAILike, I'm wondering how to use my APIs with these existing packages.

    from os import getenv

    from phi.agent import Agent, RunResponse
    from phi.model.openai.like import OpenAILike

    agent = Agent(
        model=OpenAILike(
            id="mistralai/Mixtral-8x7B-Instruct-v0.1",
            api_key=getenv("TOGETHER_API_KEY"),
            base_url="https://api.together.xyz/v1",
        )
    )

    # Get the response in a variable
    # run: RunResponse = agent.run("Share a 2 sentence horror story.")
    # print(run.content)

    # Print the response in the terminal
    agent.print_response("Share a 2 sentence horror story.")
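    One way to bridge the two, assuming the internal wrapper is OpenAI-compatible (it accepts a chat-completions body and returns choices[0].message.content, as the working snippet suggests): point OpenAILike at the internal base URL instead of Together's. The environment variable names, URL, and model id below are placeholders, not real values:

    ```python
    from os import getenv

    def openai_like_kwargs():
        """Keyword arguments for phidata's OpenAILike, aimed at the internal
        wrapper instead of Together. All values are placeholders (assumptions)."""
        return {
            "id": getenv("INTERNAL_LLM_MODEL", "internal model"),
            "api_key": getenv("INTERNAL_LLM_API_KEY", "Token"),
            "base_url": getenv("INTERNAL_LLM_BASE_URL", "https://API"),
        }

    # With phidata installed, the agent would then be built like the sample above:
    # from phi.agent import Agent
    # from phi.model.openai.like import OpenAILike
    # agent = Agent(model=OpenAILike(**openai_like_kwargs()))
    # agent.print_response("Share a 2 sentence horror story.")
    ```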

  • 3 Replies
  • Husein

    Administrator
    March 19, 2025 at 8:41 am

    Hello Newgen, I didn't quite understand the problem you're facing. Could you please rephrase it and describe it again?

  • Newgen V

    Member
    March 19, 2025 at 4:11 pm

    Thanks for the reply, and sorry if I was not clear. I can access the LLM with the Python request below.

    import json
    import requests

    def call_llm_with_context(context, query):
        url = "https://API"
        payload = json.dumps({
            "model": "internal model",
            "max_tokens": 4096,
            "temperature": 1,
            "stream": False,
            "messages": [
                {"role": "system", "content": context},
                {"role": "user", "content": f"Answer this based on above context: {query}"},
            ],
        })
        headers = {
            "Authorization": "Bearer Token",
            "content-type": "application/json",
        }
        response = requests.request("POST", url, headers=headers, data=payload)
        return response.json()["choices"][0]["message"]["content"]

    If I want to use phidata or crewAI, how do I do that? The default model classes (OpenAIChat, etc.) are geared towards the official OpenAI API.

    • Husein

      Administrator
      March 20, 2025 at 9:38 am

      Oh okay, I get it now, but what error are you getting? Anyway, here is my code if you want to try it directly:

      from os import getenv

      from phi.agent import Agent
      from phi.model.openai.like import OpenAILike

      # Ensure the Together API key is set as an environment variable
      api_key = getenv("TOGETHER_API_KEY")
      if not api_key:
          raise ValueError("TOGETHER_API_KEY environment variable is not set.")

      # Initialize the agent with the Together AI model
      agent = Agent(
          model=OpenAILike(
              id="mistralai/Mixtral-8x7B-Instruct-v0.1",
              api_key=api_key,
              base_url="https://api.together.xyz/v1",
          )
      )

      # Define the prompt
      prompt = "Share a 2-sentence horror story."

      # Get the response from the agent
      response = agent.run(prompt)

      # Print the response content
      print(response.content)
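      Since LangChain was also mentioned in the original question: the same idea applies there, as ChatOpenAI accepts api_key and base_url overrides. A sketch with placeholder values throughout (the internal wrapper must be OpenAI-compatible for this to work):

      ```python
      from os import getenv

      def chat_openai_kwargs():
          """Keyword arguments for langchain_openai.ChatOpenAI, pointed at the
          internal wrapper. All values are placeholders (assumptions)."""
          return {
              "model": getenv("INTERNAL_LLM_MODEL", "internal model"),
              "api_key": getenv("INTERNAL_LLM_API_KEY", "Token"),
              "base_url": getenv("INTERNAL_LLM_BASE_URL", "https://API"),
          }

      # With langchain-openai installed:
      # from langchain_openai import ChatOpenAI
      # llm = ChatOpenAI(**chat_openai_kwargs())
      # print(llm.invoke("Share a 2 sentence horror story.").content)
      ```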

      • This reply was modified 1 week, 1 day ago by  Husein.
