The Financial Agent is powered by FinBloom 7B, fine-tuned on the Financial Context Dataset to act as a bridge between natural language and structured data. It converts complex user queries into optimized retrieval parameters that downstream Data Modules can consume directly, making financial data sourcing both precise and computationally efficient.

How to Get Started with the Model

Use the code below to get started with the model.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel, PeftConfig

peft_model_id = "Chaitanya14/Financial_Agent"

config = PeftConfig.from_pretrained(peft_model_id)

model = AutoModelForCausalLM.from_pretrained(
    config.base_model_name_or_path,
    return_dict=True,
    device_map="auto",
    torch_dtype=torch.float16,
)

model = PeftModel.from_pretrained(model, peft_model_id)

tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)
if tokenizer.pad_token_id is None:
    tokenizer.pad_token_id = tokenizer.eos_token_id

test_query = "What was the net income and total revenue for Google and Microsoft in September 2024?"

prompt = f"Query : {test_query} Label : "

# Place the inputs on the same device the model was assigned by device_map="auto"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output_tokens = model.generate(
        input_ids=inputs["input_ids"],
        attention_mask=inputs["attention_mask"],
        max_new_tokens=128,
        eos_token_id=tokenizer.eos_token_id,
        do_sample=False,  # greedy decoding for deterministic output
    )

full_text = tokenizer.decode(output_tokens[0], skip_special_tokens=True)

# The model echoes the prompt; keep only the text generated after "Label :"
if "Label :" in full_text:
    full_text = full_text.split("Label :")[-1].strip()

print(f"\n--- Model Output ---\n{full_text}")
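The prompt construction and label extraction above can be factored into small reusable helpers. The sketch below is purely string-level (no model call), so it runs anywhere; the `Query : … Label : ` format follows the snippet above, while the example model output string is an illustrative placeholder, not the model's actual output schema:

```python
def build_prompt(query: str) -> str:
    # Prompt format used by the fine-tuned adapter: "Query : <text> Label : "
    return f"Query : {query} Label : "

def extract_label(generated_text: str) -> str:
    # Causal LMs echo the prompt; keep only what follows the final "Label :".
    if "Label :" in generated_text:
        return generated_text.split("Label :")[-1].strip()
    return generated_text.strip()

# String-level usage example (the decoded text here is a made-up placeholder):
prompt = build_prompt("What was Apple's total revenue in Q3 2024?")
decoded = prompt + "company: Apple | metric: total revenue | period: Q3 2024"
print(extract_label(decoded))
```

Keeping these helpers separate makes it easy to batch many queries through the same `model.generate` loop without duplicating the parsing logic.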

Citation

If you use the FinBloom 7B LLM, please cite it with the following BibTeX entry:

@misc{sinha2025finbloomknowledgegroundinglarge,
      title={FinBloom: Knowledge Grounding Large Language Model with Real-time Financial Data}, 
      author={Ankur Sinha and Chaitanya Agarwal and Pekka Malo},
      year={2025},
      eprint={2502.18471},
      archivePrefix={arXiv},
      primaryClass={cs.IR},
      url={https://arxiv.org/abs/2502.18471}, 
}
Framework versions

  • PEFT 0.10.1.dev0