Use LLM Tracing with AI Model Experiments
Implementing LLM tracing in STACKIT AI Model Experiments lets you see “under the hood” of your AI’s decision-making process. This tutorial shows you how to use MLflow’s one-line auto-logging for popular frameworks such as LangChain, or manual tracing for custom code.
1. Prepare your Environment
Ensure you have the environment variables from the Getting Started guide active in your terminal. In addition, you will need to install the LangChain library for this example.
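The exact variable names come from the Getting Started guide; assuming your instance uses MLflow’s standard client environment variables, the setup looks roughly like this (the URI and token values below are placeholders, not real endpoints):

```shell
# Placeholder values - substitute the ones from your Getting Started guide
export MLFLOW_TRACKING_URI="https://<your-instance-url>"
export MLFLOW_TRACKING_TOKEN="<your-api-token>"
```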
```shell
pip install mlflow langchain langchain-community
```

2. Implement Minimal Tracing
You have two ways to implement tracing: automatic (best for supported frameworks) or manual (best for custom logic).
Option A: Automatic Tracing (LangChain Example)
If you use a supported library, a single line of code captures every step of the running chain.
```python
import mlflow
from langchain_core.prompts import ChatPromptTemplate
from langchain_community.chat_models.fake import FakeListChatModel

# Enable MLflow Tracing for LangChain
mlflow.langchain.autolog()

# Set experiment
mlflow.set_experiment("Tracing example")

# Set up the chain using a fake LLM.
# This allows the tutorial to run without an LLM API token.
# Because it uses the standard LangChain interface, MLflow will still
# generate a complete trace in the STACKIT UI.
# To use a real model, swap FakeListChatModel for e.g. ChatOpenAI(model="gpt-4o-mini")
llm = FakeListChatModel(
    responses=[
        "STACKIT AI Model Experiments is a managed MLflow server. "
        "(This is a simulated response! No API key was used.)"
    ]
)
prompt = ChatPromptTemplate.from_template("What is {topic}")
chain = prompt | llm

# Invoke the chain - this automatically sends a trace to the AI Model Experiments server
response = chain.invoke({"topic": "STACKIT AI Model Experiments"})
print(response.content)
```

Option B: Manual Tracing (Custom Function)
Use the @mlflow.trace decorator to capture custom logic as a trace in the UI.
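For intuition on what such a decorator captures, here is a minimal hand-rolled sketch (illustrative only, not MLflow’s implementation): it wraps the function and records its name, inputs, output, and duration, which is roughly the span data MLflow ships to the tracking server.

```python
import functools
import time

def trace(func):
    """Record name, inputs, output, and duration of each call (illustrative only)."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        span = {
            "name": func.__name__,
            "inputs": {"args": args, "kwargs": kwargs},
            "output": result,
            "duration_ms": (time.perf_counter() - start) * 1000,
        }
        wrapper.spans.append(span)  # MLflow would send this to the server instead
        return result
    wrapper.spans = []
    return wrapper

@trace
def greet(topic):
    return f"What is {topic}?"

greet("STACKIT AI Model Experiments")
print(greet.spans[0]["name"])    # greet
print(greet.spans[0]["output"])  # What is STACKIT AI Model Experiments?
```

MLflow’s @mlflow.trace works along these lines, but additionally nests spans and sends them to your AI Model Experiments instance.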
```python
import mlflow

# Set experiment
mlflow.set_experiment("Tracing example")

# Use the @mlflow.trace decorator to trace a custom function
@mlflow.trace
def my_function():
    return "This is the STACKIT AI Model Experiments Docu."

# Calling the function generates a trace
my_function()
```

3. Inspect the Results
1. Open your AI Model Experiments instance URL in your browser.
2. Select the newly generated experiment.
3. Navigate to the Traces tab on the left sidebar.
4. Click on your recent request to see the Trace View.