Forum Discussion

baiyang666
Copper Contributor
Oct 31, 2024

openai token usage log exports

Hi,

I am trying to find a way to transfer the model token usage data to BigQuery. Here are my attempts:

1. Azure OpenAI model deployment metrics. They provide token usage directly, but I failed to find a way to export them on a scheduled basis (a rough sketch of a possible workaround is at the end of this post).

2. The API Management service. It does not work with streaming responses from Azure OpenAI models, because capturing model completions from a stream requires extra configuration. As I am developing a chatbot, streaming is necessary.

3. Exporting native logs via diagnostic settings. Unfortunately, the exported logs include no token-usage information.

 

So I would like to kindly ask for a potential solution for exporting token usage to BigQuery while keeping streaming enabled with the Azure OpenAI models. Thanks!
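For option 1, one workaround I am considering is pulling the deployment metrics programmatically on a schedule instead of relying on a built-in export. Below is a rough, untested sketch using the azure-monitor-query SDK. The metric names ProcessedPromptTokens and GeneratedTokens are my assumption from the portal's metrics list, and the resource URI and BigQuery table are placeholders:

    from datetime import timedelta

    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import MetricsQueryClient, MetricAggregationType
    from google.cloud import bigquery

    # Placeholder: full resource ID of the Azure OpenAI (Cognitive Services) account
    RESOURCE_URI = (
        "/subscriptions/YOUR_SUB/resourceGroups/YOUR_RG"
        "/providers/Microsoft.CognitiveServices/accounts/YOUR_ACCOUNT"
    )

    metrics_client = MetricsQueryClient(DefaultAzureCredential())
    bq_client = bigquery.Client()

    def export_token_metrics():
        # Pull the last 24 hours of token metrics, aggregated per hour
        response = metrics_client.query_resource(
            RESOURCE_URI,
            metric_names=["ProcessedPromptTokens", "GeneratedTokens"],
            timespan=timedelta(days=1),
            granularity=timedelta(hours=1),
            aggregations=[MetricAggregationType.TOTAL],
        )
        rows = []
        for metric in response.metrics:
            for series in metric.timeseries:
                for point in series.data:
                    if point.total is not None:
                        rows.append({
                            "metric": metric.name,
                            "timestamp": point.timestamp.isoformat(),
                            "total_tokens": point.total,
                        })
        if rows:
            # Placeholder table; run this function on a schedule
            # (cron, Cloud Scheduler, Azure Functions timer, etc.)
            bq_client.insert_rows_json("your_project.your_dataset.your_table", rows)

    export_token_metrics()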

1 Reply

  • baiyang666 

     

    Below is an example using Python and the tiktoken library:

     

    import openai
    import tiktoken
    from google.cloud import bigquery

    # Initialize OpenAI and BigQuery clients
    openai.api_key = 'YOUR_OPENAI_API_KEY'
    client = bigquery.Client()

    # Tokenizer matching the model, used to count streamed tokens
    encoding = tiktoken.encoding_for_model("gpt-4")

    def log_token_usage_to_bigquery(token_count):
        table_id = 'your_project.your_dataset.your_table'
        rows_to_insert = [
            {"token_count": token_count}
        ]
        errors = client.insert_rows_json(table_id, rows_to_insert)
        if errors:
            print(f"Encountered errors while inserting rows: {errors}")

    def stream_chat_completion():
        response = openai.ChatCompletion.create(
            model="gpt-4",
            messages=[{"role": "user", "content": "Hello!"}],
            stream=True
        )
        token_count = 0
        for chunk in response:
            for choice in chunk.get('choices', []):
                # Streamed chunks carry incremental text in 'delta';
                # some chunks (role markers, finish signals) have no 'content'
                content = choice.get('delta', {}).get('content')
                if content:
                    token_count += len(encoding.encode(content))
        # Write the total once the stream has finished
        log_token_usage_to_bigquery(token_count)

    stream_chat_completion()
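    A note on the tiktoken approach: re-tokenizing the streamed deltas is an approximation. If upgrading to the openai>=1.x SDK is possible, recent Azure OpenAI API versions can return exact usage on streamed responses via stream_options, so no client-side counting is needed. A sketch reusing log_token_usage_to_bigquery from above (the endpoint, key, API version, and deployment name are placeholders, and stream_options support depends on the API version):

    from openai import AzureOpenAI

    # Placeholders: substitute your real endpoint, key, and deployment name
    aoai = AzureOpenAI(
        azure_endpoint="https://YOUR_RESOURCE.openai.azure.com",
        api_key="YOUR_API_KEY",
        api_version="2024-10-21",
    )

    def stream_with_exact_usage():
        stream = aoai.chat.completions.create(
            model="YOUR_GPT4_DEPLOYMENT",
            messages=[{"role": "user", "content": "Hello!"}],
            stream=True,
            # Ask the service to append a final chunk carrying exact token usage
            stream_options={"include_usage": True},
        )
        for chunk in stream:
            if chunk.choices and chunk.choices[0].delta.content:
                print(chunk.choices[0].delta.content, end="")
            # The final chunk has an empty choices list and a populated usage field
            if chunk.usage:
                log_token_usage_to_bigquery(chunk.usage.total_tokens)

    stream_with_exact_usage()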
