Financial Agent Support Bot

NOTE: Originally forked from this repo.

This chatbot is part of a proof of concept (POC) for a financial services industry (FSI) use case.

Getting Started

Prerequisites

  1. Clone the repository and navigate to the project directory:

    git clone <repository-url>
    cd <repository-name>
  2. Rename .env.example to .env

  3. Specify the environment parameters in the .env file (a minimal example is shown below).
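
At minimum, the Chroma settings referenced later in this README must be set. A minimal sketch of a .env file, with placeholder values (only the two Chroma variables are taken from this README; see .env.example for the full list of parameters):

    CHROMA_COLLECTION_NAME=finance_docs
    CHROMA_PERSIST_PATH=./chroma_db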

Executing the Program

Creating a Chroma Database and Embedding Documents

To convert PDFs to Markdown, specify the input path, output path, and (optionally) the mode when running convert_pdf.py.

Non-OCR (default):

python convert_pdf.py ./assets/library/documents ./assets/library/docling_out

To run with OCR:

python convert_pdf.py ./assets/library/documents ./assets/library/docling_out --mode ocr

To run with OCR on macOS:

python convert_pdf.py ./assets/library/documents ./assets/library/docling_out --mode mac_ocr
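
For reference, here is a minimal sketch of what the non-OCR conversion step typically looks like, assuming the Docling library (suggested by the docling_out output path); the actual convert_pdf.py may differ, and the --mode flag would additionally toggle OCR pipeline options:

    from pathlib import Path

    from docling.document_converter import DocumentConverter

    # Convert every PDF in the input directory to Markdown.
    in_dir = Path("./assets/library/documents")
    out_dir = Path("./assets/library/docling_out")
    out_dir.mkdir(parents=True, exist_ok=True)

    converter = DocumentConverter()
    for pdf in in_dir.glob("*.pdf"):
        result = converter.convert(pdf)
        markdown = result.document.export_to_markdown()
        (out_dir / f"{pdf.stem}.md").write_text(markdown, encoding="utf-8")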

You can create a Chroma database and embed documents using util/chroma.py. It requires one argument: the path to the directory containing the documents you wish to embed and store.

Run the following command:

python util/chroma.py ./assets/library/docling_out

The embeddings will be stored in a Chroma database at the location and collection defined by the CHROMA_PERSIST_PATH and CHROMA_COLLECTION_NAME environment variables.
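
For reference, a minimal sketch of what an ingestion script like util/chroma.py typically does, assuming the chromadb client library with its default embedding function (the real script may chunk documents and use a different embedding model):

    import glob
    import os
    import sys

    import chromadb

    # Collection name and on-disk location come from the environment variables above.
    client = chromadb.PersistentClient(path=os.environ["CHROMA_PERSIST_PATH"])
    collection = client.get_or_create_collection(name=os.environ["CHROMA_COLLECTION_NAME"])

    # Embed every Markdown file produced by convert_pdf.py.
    doc_dir = sys.argv[1]  # e.g. ./assets/library/docling_out
    for i, path in enumerate(sorted(glob.glob(os.path.join(doc_dir, "*.md")))):
        with open(path, encoding="utf-8") as f:
            text = f.read()
        # Chroma embeds the text with its default embedding function
        # unless an explicit one is supplied.
        collection.add(documents=[text], ids=[f"doc-{i}"], metadatas=[{"source": path}])

    # Quick sanity check: retrieve the closest documents for a sample question.
    print(collection.query(query_texts=["example question about the documents"], n_results=3))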

Running the Application Locally

podman-compose up

You should be able to view the app in your browser at the following URL:

http://0.0.0.0:8501

Running the Application on OpenShift

  1. Generate streamlit-secret (skip this step if environment variables were not changed). See Updating Environment Variables below.

  2. Generate builds (requires write access; skip this step if the images do not need to be rebuilt):

docker build -t quay.io/oawofolurh/finance_rag_assets -f Containerfile.chroma --platform linux/amd64 --push .
docker build -t quay.io/oawofolurh/finance-agent-ollama-container -f Containerfile.ollama --platform linux/amd64 --push .
docker build -t quay.io/oawofolurh/llm-agent-finance-streamlit-app -f Containerfile.streamlit --platform linux/amd64 --push .

  3. Deploy the app:

oc delete -f k8s/
oc apply -f k8s/

  4. Create a route for the app if it does not already exist:

oc expose svc streamlit-app --port 8501

  5. View the deployment to validate that there are no issues:

watch oc get all

  6. The app should be accessible at the FQDN returned by:

oc get route streamlit-app -ojson | jq -r '.spec.host'

Updating Environment Variables

  1. Update the .env file as appropriate.

  2. Regenerate the streamlit-secret:

oc delete secret streamlit-secret --ignore-not-found
oc create secret generic streamlit-secret --from-env-file=.env

Serving LLMs on OpenShift AI

  1. Install Minio (see link):
oc new-project minio --display-name="Minio S3 for LLMs"
oc apply -f k8s/minio/minio-all.yaml
