Note: This template only works for self-hosted n8n.

This n8n template demonstrates how to use the Langchain Code node to track token usage and cost for every LLM call. This is useful if your templates handle multiple clients or customers and you need a cheap and easy way to capture how many of your AI credits each of them is using.

How it works

In our mock AI service, we’re offering a data conversion API that converts Resume PDFs into JSON documents. A form trigger is used to allow for PDF upload, and the file is parsed using the Extract from File node.

An Edit Fields node is used to capture additional variables to send to our log. Next, we use the Information Extractor node to organise the Resume data into the given JSON schema.
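For illustration, the kind of JSON schema you might hand the Information Extractor could look like the following. This is an example only, not the exact schema from the template; adjust the fields to whatever you need from the resume.

```json
{
  "type": "object",
  "properties": {
    "name": { "type": "string" },
    "email": { "type": "string" },
    "skills": { "type": "array", "items": { "type": "string" } },
    "work_experience": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "company": { "type": "string" },
          "role": { "type": "string" },
          "duration": { "type": "string" }
        }
      }
    }
  }
}
```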

The LLM subnode attached to the Information Extractor is a custom one we’ve built using the Langchain Code node. This custom subnode lets us capture the usage metadata for each call via Langchain’s lifecycle hooks.
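As a rough sketch of the idea (not the exact code in the template), the Langchain Code node can return a ChatOpenAI model with a callback handler attached. The handleLLMEnd lifecycle hook receives the token usage reported by OpenAI, which you can then forward to your log. This assumes your self-hosted instance allows the node to require @langchain/openai, and the model name is a placeholder.

```js
// Minimal sketch of a custom LLM subnode built in the Langchain Code node.
// Names and structure are illustrative, not the template's exact code.
const { ChatOpenAI } = require('@langchain/openai');

const model = new ChatOpenAI({
  modelName: 'gpt-4o-mini', // placeholder: use whichever OpenAI model you prefer
  callbacks: [
    {
      // Lifecycle hook that fires after each LLM call completes
      handleLLMEnd(output) {
        const usage = output.llmOutput?.tokenUsage ?? {};
        console.log('prompt tokens:', usage.promptTokens);
        console.log('completion tokens:', usage.completionTokens);
        console.log('total tokens:', usage.totalTokens);
        // In the real workflow, this is where the values are sent on to
        // the attached logging tool (e.g. Google Sheets).
      },
    },
  ],
});

// Return the model so the subnode can supply it to the Information Extractor
// (the exact return shape depends on how your Langchain Code node is configured).
return model;
```

Because the hook fires every time the model is invoked, usage gets captured even if the Information Extractor makes more than one call under the hood.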

We’ve also attached a Google Sheets tool to our LLM subnode, allowing us to send the usage metadata to a Google Sheet. Finally, we demonstrate how you can aggregate the rows in the Google Sheet to understand how many AI tokens, and how much cost, each client is liable for.
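As a rough illustration of that last step, a Code node could roll the logged rows up per client like this. The column names (client, promptTokens, completionTokens) are assumptions about how your sheet is laid out, and the per-1K prices are placeholders for your provider’s current rates.

```js
// Illustrative roll-up of the logged usage rows, e.g. after reading the
// sheet back with a Google Sheets node. Column names and prices are
// assumptions; adjust them to match your own sheet and model pricing.
const PROMPT_PRICE_PER_1K = 0.0;     // your model's input rate per 1K tokens
const COMPLETION_PRICE_PER_1K = 0.0; // your model's output rate per 1K tokens

const totals = {};
for (const { json: row } of $input.all()) {
  const client = row.client ?? 'unknown';
  totals[client] ??= { promptTokens: 0, completionTokens: 0 };
  totals[client].promptTokens += Number(row.promptTokens) || 0;
  totals[client].completionTokens += Number(row.completionTokens) || 0;
}

// One output item per client, with token totals and an estimated cost
return Object.entries(totals).map(([client, t]) => ({
  json: {
    client,
    ...t,
    cost:
      (t.promptTokens / 1000) * PROMPT_PRICE_PER_1K +
      (t.completionTokens / 1000) * COMPLETION_PRICE_PER_1K,
  },
}));
```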

Requirements

– Self-hosted version of n8n
– OpenAI for LLM
– Google Sheets to store usage metadata

Customising this template

Bring the custom LLM subnode into your own templates! In many cases, it can be a drop-in replacement for the regular OpenAI subnode. Not using Google Sheets? Try another database, or an HTTP call to pipe the usage data straight into your CRM.