Determining token cost
Python Scripting
3 Posts · 2 Users · 1 Reaction · 470 Views
Topic starter
Calculating token cost is giving me a result of None. Can you please point me to what I am doing wrong?
Main file:
import helpers

token_count = 3000
costs = helpers.estimate_input_cost_optimized("gpt-3.5-turbo-0613", token_count)
print(f"Costs: {costs}")
helpers.py is written as:
import tiktoken
import openai

# Estimate cost
def estimate_input_cost_optimized(model_name, token_count):
    model_cost_dict = {
        "gpt-3.5-turbo-0613": 0.0015,
        "gpt-3.5-turbo-16k-0613": 0.003,
        "gpt-4-0613": 0.03,
        "gpt-4-32k-0613": 0.06,
    }
    try:
        cost_per_1000_tokens = model_cost_dict[model_name]
    except KeyError:
        raise ValueError(f"The model '{model_name}' is not recognized.")
    estimated_cost = (token_count / 1000) * cost_per_1000_tokens
Result after running the code:
Costs: None
Posted: 09/11/2023 1:57 pm
You forgot to add:

return estimated_cost

at the end of the function. In Python, a function that ends without an explicit return returns None, which is exactly why your print shows Costs: None.
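A minimal illustration of that behavior (the function name here is just for the example):

def compute():
    result = 1 + 1  # a value is computed...
    # ...but never returned, so the call evaluates to None

print(compute())  # prints: None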
Here is the corrected version:
def estimate_input_cost_optimized(model_name, token_count):
    model_cost_dict = {
        "gpt-3.5-turbo-0613": 0.0015,
        "gpt-3.5-turbo-16k-0613": 0.003,
        "gpt-4-0613": 0.03,
        "gpt-4-32k-0613": 0.06,
    }
    try:
        cost_per_1000_tokens = model_cost_dict[model_name]
    except KeyError:
        raise ValueError(f"The model '{model_name}' is not recognized.")
    estimated_cost = (token_count / 1000) * cost_per_1000_tokens
    return estimated_cost
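With the return statement in place, rerunning the main file from the question should print the computed cost instead of None. For token_count = 3000 on gpt-3.5-turbo-0613 that works out to (3000 / 1000) * 0.0015 = 0.0045, so the expected output is:

Costs: 0.0045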
Posted: 09/11/2023 3:56 pm