
Tracing SDK

LastMileTracer

LastMileTracer is the main class that encapsulates all tracing APIs.

It has 4 main functionalities:

  1. OpenTelemetry Trace APIs: Create OpenTelemetry traces and spans.
  2. ParamSet APIs: Register a dictionary of parameters to be logged and associated with the trace data.
  3. RAG Event APIs: Add RAG events and store their states alongside the trace data.
  4. User feedback APIs: Log user feedback.

It is available as a singleton for easy access.

def get_lastmile_tracer(
    tracer_name: str,
    pipeline_type: Optional[RagFlowType] = None,
    output_filepath: Optional[str] = None,
    project_id: Optional[str] = None,
    lastmile_api_token: Optional[str] = None,
) -> LastMileTracer:
    """
    Return a tracer object to instrument your code, log RAG-specific events
    (e.g. document embedding), and record application configuration parameters
    (e.g. chunk size, LLM temperature).

    Args:
        tracer_name: The name of the tracer to be used. Used to retrieve the tracer singleton.
        output_filepath: If specified, all trace data gets saved to a file on local disk.
            By default, trace data is exported to an OpenTelemetry collector
            and saved into a hosted backend storage such as ElasticSearch.

            NOTE: This is useful for debugging and demo purposes, but not recommended for production use.
        pipeline_type: RAG consists of 2 pipelines - ingestion and retrieval.
            LastMileTracer allows tracing both pipelines. Individual traces can be logged
            either as Ingestion or Query traces, and show up separately in the Trace UI.
            If left unspecified, the same tracer can be used for both pipelines.
        project_id: The id of the project to log the traces to.
            If not provided, and the LASTMILE_PROJECT_ID environment variable is also unspecified,
            then a project will automatically be created with the name `tracer_name`.
        lastmile_api_token: The API token for the LastMile API. If not provided,
            will try to get the token from the LASTMILE_API_TOKEN environment variable.
            You can create a token from the "API Tokens" section of this website:
            {WEBSITE_BASE_URL}/settings?page=tokens
    """

Current API

def get_lastmile_tracer(
    tracer_name: str,
    lastmile_api_token: Optional[str] = None,
    project_name: Optional[str] = None,
    initial_params: Optional[dict[str, Any]] = None,
    output_filepath: Optional[str] = None,
    rag_flow_type: Optional[RagFlowType] = None,
) -> LastMileTracer:
    """
    Return a tracer object that uses the OpenTelemetry SDK to instrument
    tracing in your code as well as other functionality such as logging
    the rag event data and registered parameters.

    See `lastmile_eval.rag.debugger.api.tracing.LastMileTracer` for available
    APIs and more details.

    @param tracer_name str: The name of the tracer to be used.
    @param lastmile_api_token (str): Used for authentication.
        Create one from the "API Tokens" section of this website:
        https://lastmileai.dev/settings?page=tokens
    @param project_name Optional(str): The project name that will be
        associated with the trace data. This can help group traces in the UI.
    @param initial_params Optional(dict[str, Any]): The K-V pairs to be
        registered and saved with ALL traces created using the returned tracer
        object. Defaults to None (empty dict).
    @param output_filepath Optional(str): By default, trace data is exported to
        an OpenTelemetry collector and saved into a hosted backend storage such
        as ElasticSearch. However, if an output_filepath is defined,
        the trace data is saved to that file instead. This is useful for
        debugging and demo purposes, but not recommended for production use.
    @param rag_flow_type Optional[RagFlowType]: The type of RAG flow that the
        tracer is being used in. If it is None, then the returned tracer can
        be used for both ingestion and query tracing.

    @return LastMileTracer: The tracer interface object to log OpenTelemetry data.
    """

OpenTelemetry Trace APIs

APIs that log OpenTelemetry traces and spans.

Spans

The current API for recording spans is exactly the same as the OpenTelemetry one.

def start_as_current_span(  # pylint: disable=too-many-locals
    self,
    # OpenTelemetry params
    name: str,
    context: Optional[Union[context_api.Context, SpanContext, str]] = None,
    kind: trace_api.SpanKind = trace_api.SpanKind.INTERNAL,
    attributes: types.Attributes = None,
    links: Optional[Sequence[trace_api.Link]] = None,
    start_time: Optional[int] = None,
    record_exception: bool = True,
    set_status_on_exception: bool = True,
    end_on_exit: bool = True,
) -> Iterator[Span]:
    """
    Same API as opentelemetry.trace.Tracer.start_as_current_span,
    but also allows a SpanContext to be passed in as the context parameter.
    If context is a string, it is assumed to be a serialized SpanContext.

    Just like the OpenTelemetry API, this method can be used as both a
    context manager and a decorator.

    Example:
        from opentelemetry import trace as trace_api
        from lastmile_eval.rag.debugger.tracing import get_lastmile_tracer

        tracer: LastMileTracer = get_lastmile_tracer(
            tracer_name="<my-tracer-name>",
            lastmile_api_token="<my-api-token>",
        )

        # Context manager
        with tracer.start_as_current_span("my-span") as span:
            span.set_attribute("<my-key>", "<my-value>")

        # Decorator
        @tracer.start_as_current_span("my-span")
        def my_function():
            span = trace_api.get_current_span()
            span.set_attribute("<my-key>", "<my-value>")

    If you are using this as a decorator instead of a context manager, it's
    recommended to use `@traced(tracer)` instead since that also logs the
    wrapped method's inputs and outputs as span attributes:

        from lastmile_eval.rag.debugger.tracing.decorators import traced

        # Recommended way of decorating a function
        @traced(tracer)
        def my_function(my_arg: str):
            # my_arg is automatically added to the span attributes
            span = trace_api.get_current_span()
            span.set_attribute("<my-key>", "<my-value>")
            ...

            # output_value is automatically added to the span attributes too
            return output_value
    """

def start_span(
    self,
    # OpenTelemetry params
    name: str,
    context: Optional[Union[context_api.Context, SpanContext, str]] = None,
    kind: trace_api.SpanKind = trace_api.SpanKind.INTERNAL,
    attributes: types.Attributes = None,
    links: Sequence[trace_api.Link] = (),
    start_time: Optional[int] = None,
    record_exception: bool = True,
    set_status_on_exception: bool = True,
) -> Span:
    """
    Same API as opentelemetry.trace.Tracer.start_span,
    but also allows a SpanContext to be passed in as the context parameter.
    If context is a string, it is assumed to be a serialized SpanContext.

    A span must be manually ended when it's completed:

        manual_span_example = tracer.start_span("new-child-span")
        # Some logic
        manual_span_example.end()
    """

Decorators

def trace_function(
    self,
    name: Optional[str] = None,
    context: Optional[Union[context_api.Context, SpanContext, str]] = None,
    kind: trace_api.SpanKind = trace_api.SpanKind.INTERNAL,
    attributes: types.Attributes = None,
    links: Optional[Sequence[trace_api.Link]] = None,
    start_time: Optional[int] = None,
    record_exception: bool = True,
    set_status_on_exception: bool = True,
    end_on_exit: bool = True,
) -> Callable[[Callable[..., Any]], Callable[..., Any]]:
    """
    Function decorator that logs the wrapped function's input and output as attributes on a span.

    To specify custom metadata, use the `attributes` property. The function complies with
    the OpenTelemetry specification.
    """

def atrace_function(
    self,
    name: Optional[str] = None,
    context: Optional[Union[context_api.Context, SpanContext, str]] = None,
    kind: trace_api.SpanKind = trace_api.SpanKind.INTERNAL,
    attributes: types.Attributes = None,
    links: Optional[Sequence[trace_api.Link]] = None,
    start_time: Optional[int] = None,
    record_exception: bool = True,
    set_status_on_exception: bool = True,
    end_on_exit: bool = True,
) -> Callable[[Callable[..., Any]], Callable[..., Awaitable[Any]]]:
    """
    Async version of `trace_function`.
    """

Current API

def trace_function(
    self,
    name: Optional[str] = None,
    context: Optional[Union[context_api.Context, SpanContext, str]] = None,
    kind: trace_api.SpanKind = trace_api.SpanKind.INTERNAL,
    attributes: types.Attributes = None,
    links: Optional[Sequence[trace_api.Link]] = None,
    start_time: Optional[int] = None,
    record_exception: bool = True,
    set_status_on_exception: bool = True,
    end_on_exit: bool = True,
) -> Callable[[Callable[..., Any]], Callable[..., Any]]:
    """
    Decorator that provides the same functionality as
    LastMileTracer.start_as_current_span except that it also logs the wrapped
    function's input and output values as attributes on the span.
    """

Auto-instrumentation

See auto-instrumentation docs

ParamSet APIs

Parameter sets (ParamSet) capture the hyperparameters of the RAG system. They are simply a collection of key-value pairs representing things like generation model, chunk size, query decomposition, etc. Together, they give a snapshot of the configuration of the RAG pipeline.

def register_param(
    self,
    key: str,
    value: Any,
    span: Optional[Span] = None,
) -> None:
    """
    Define a parameter to save in the ParamSet for the current trace.

    key: The name of the parameter (e.g. 'model')
    value: JSON-serializable value of the parameter to be saved (e.g. 'gpt-3.5-turbo')
    span: Optional span to save the parameter in.
        This can help associate different parameters with specific spans where those
        parameters are most relevant. By default, the parameter gets logged to the current span.
    """

def register_params(
    self,
    params: dict[str, Any],
    overwrite: bool = False,
    span: Optional[Span] = None,
) -> None:
    """
    Upsert the ParamSet for the currently active trace.

    params: The dictionary of parameters (e.g. {"model": "gpt-3.5-turbo"})
    overwrite: Controls whether to overwrite the existing ParamSet or merge into it.
        If this function is called multiple times, by default it performs an upsert.
    span: Optional span to save the parameters in.
        This can help associate different parameters with specific spans where those
        parameters are most relevant. By default, the parameters get logged to the current span.
    """

RAG-specific ParamSet APIs

The Trace SDK has APIs to easily record a selection of parameter configurations that are common in RAG applications.

def register_document_preprocess_params(
    self,
    chunk_size: Optional[int],
    chunk_strategy: Optional[str],
    extras: Optional[dict[str, Any]],
) -> None:
    """
    Register the configuration of the document preprocessing step, like chunking.

    Args:
        chunk_size: Chunk size.
        chunk_strategy: Chunking algorithm (e.g. sliding window).
        extras: Any other params to also register.
    """

def register_embedding_params(
    self,
    embedding_model: Optional[str],
    embedding_dimensions: Optional[int],
    extras: Optional[dict[str, Any]],
) -> None:
    """
    Register the configuration of the embedding step.

    Args:
        embedding_model: Embedding model.
        embedding_dimensions: Cardinality of the embedding.
        extras: Any other params to also register.
    """

def register_query_processing_params(
    self,
    embedding_model: Optional[str],
    embedding_dimensions: Optional[int],
    decomposition_strategy: Optional[str],
    extras: Optional[dict[str, Any]],
) -> None:
    """
    Register the configuration of the query processing step, such as query decomposition,
    embedding the query, etc.

    Args:
        embedding_model: Embedding model used for the query.
        embedding_dimensions: Cardinality of the embedding used for the query.
        decomposition_strategy: Algorithm for how a query is processed
            (e.g. generate subquestions, query expansion, etc.)
        extras: Any other params to also register.
    """

def register_retrieval_params(
    self,
    top_k: Optional[int],
    reranking_model: Optional[str],
    extras: Optional[dict[str, Any]],
) -> None:
    """
    Register the configuration of the retrieval step, such as top_k, etc.

    Args:
        top_k: Number of embeddings to retrieve.
        reranking_model: The model used for reranking results.
        extras: Any other params to also register.
    """

def register_generation_params(
    self,
    model_params: dict[str, Any],
    extras: Optional[dict[str, Any]],
) -> None:
    """
    Register the LLM settings, like model, temperature, and context_length, used for the
    LLM generation phase of the RAG flow.

    Args:
        model_params: The dictionary of model settings.
            NOTE: It is strongly recommended for this dictionary to comply with the OpenAI
            API spec. The params will automatically exclude messages from the ParamSet.
        extras: Any other params to also register.
    """

Current APIs

def register_query_model(
    self,
    value: str,
    should_also_save_in_span: bool = True,
    span: Optional[Span] = None,
) -> None:
    """
    Register the model used by the query for the current trace instance.

    Args:
        value (str): The value of the query model parameter.
        should_also_save_in_span (bool): Flag indicating if the parameter should also be
            saved in the span. Defaults to True.
        span (Optional[Span]): The span to associate with the parameter. Defaults to None.

    Example:
        >>> tracer.register_query_model("gpt-4")
    """

def register_query_temperature(
    self,
    value: float,
    should_also_save_in_span: bool = True,
    span: Optional[Span] = None,
) -> None:
    """
    Register the query temperature for the current trace instance.

    Args:
        value (float): The value of the query temperature.
        should_also_save_in_span (bool): Flag indicating if the parameter should also be
            saved in the span. Defaults to True.
        span (Optional[Span]): The span to associate with the parameter. Defaults to None.

    Example:
        >>> tracer.register_query_temperature(0.7)
    """

def register_query_top_p(
    self,
    value: float,
    should_also_save_in_span: bool = True,
    span: Optional[Span] = None,
) -> None:
    """
    Register the query top_p value for the current trace instance.

    Args:
        value (float): The value of the query top_p parameter.
        should_also_save_in_span (bool): Flag indicating if the parameter should also be
            saved in the span. Defaults to True.
        span (Optional[Span]): The span to associate with the parameter. Defaults to None.

    Example:
        >>> tracer.register_query_top_p(0.9)
    """

def register_ingestion_chunk_size(
    self,
    value: int,
    should_also_save_in_span: bool = True,
    span: Optional[Span] = None,
) -> None:
    """
    Register the ingestion chunk size for the current trace instance.

    Args:
        value (int): The value of the ingestion chunk size.
        should_also_save_in_span (bool): Flag indicating if the parameter should also be
            saved in the span. Defaults to True.
        span (Optional[Span]): The span to associate with the parameter. Defaults to None.

    Example:
        >>> tracer.register_ingestion_chunk_size(100)
    """

def register_retrieval_top_k(
    self,
    value: int,
    should_also_save_in_span: bool = True,
    span: Optional[Span] = None,
) -> None:
    """
    Register the retrieval top_k value for the current trace instance.

    Args:
        value (int): The value of the retrieval top_k parameter.
        should_also_save_in_span (bool): Flag indicating if the parameter should also be
            saved in the span. Defaults to True.
        span (Optional[Span]): The span to associate with the parameter. Defaults to None.

    Example:
        >>> tracer.register_retrieval_top_k(10)
    """

RAG Event APIs

APIs for logging opinionated events that map to specific parts of the RAG ingestion or query flow.

class RAGEventType(Enum):
    """
    Enum to define the type of RAG event that is being added.
    """

    CHUNKING = "chunking"
    EMBEDDING = "embedding"
    MULTI_EMBEDDING = "multi_embedding"
    QUERY = "query"
    RETRIEVAL = "retrieval"
    SYNTHESIZE = "synthesize"
    SUB_QUESTION = "sub_question"
    TEMPLATING = "templating"
    TOOL_CALL = "tool_call"
    RERANKING = "reranking"
    CUSTOM = "custom"

def log_trace_event(
    self,
    input: core.JSON = None,
    output: core.JSON = None,
    metadata: Optional[core.JSON] = None,
):
    """
    Log an event tracking the input, output and JSON-serializable event data for the trace.
    There can only be one RAG event at the trace level, meant to capture the input and
    output of the entire flow.

    You can use the data recorded in the event to generate test cases and run evaluations.

    input: The input to the RAG application.
    output: The output produced by the RAG application.
    metadata: JSON-serializable event data capturing any other metadata to save as part of the event.
    """

def log_span_event(
    self,
    input: core.JSON = None,
    output: core.JSON = None,
    event_data: Optional[core.JSON] = None,
    event_kind: RAGEventType = RAGEventType.CUSTOM,
    name: Optional[str] = None,
):
    """
    Log an event tracking the input, output and JSON-serializable event data for an
    individual span. There can only be one RAG event per span, meant to capture the
    input and output of that span.

    You can use the data recorded in the event to generate test cases and run evaluations.

    input: The input to record.
    output: The output to record.
    event_data: JSON-serializable event data capturing any other metadata to save as part of the event.
    event_kind: The kind of event (e.g. "reranking", "tool_call", etc.).
        If this is a well-defined event kind, it will be rendered in an event-specific way in the UI.
    name: A name to give the event, if needed.
        Useful to disambiguate multiple events of the same kind.
    """

RAG-specific Event APIs

Each API corresponds to an enum in RAGEventType.


from dataclasses import dataclass
from typing import Any, List, Optional


@dataclass
class Document:
    """Document metadata"""

    id: str
    name: Optional[str] = None
    content: Optional[Any] = None
    extras: Optional[dict[str, Any]] = None


@dataclass
class Chunk:
    """Chunk metadata"""

    id: str
    content: Optional[str] = None
    embedding: Optional[List[float]] = None
    extras: Optional[dict[str, Any]] = None


@dataclass
class RetrievedChunk(Chunk):
    """Retrieved chunk"""

    # Similarity score (e.g. cosine) for the retrieved chunk.
    # A default is required here because the inherited Chunk fields have defaults.
    retrieval_score: float = 0.0


def log_chunking_event(
    self,
    input_document: Optional[Document],
    output_chunks: List[Chunk],
    metadata: Optional[dict[str, Any]] = None,
    name: Optional[str] = None,
):
    """
    Log an event for the chunking step of document ingestion.

    input_document: Metadata (and optionally content) representing the document being processed.
    output_chunks: The list of chunks produced.
    metadata: Use metadata to store other information such as chunk size, mime type, file metadata, etc.
    name: Optional name for the event.
    """

def log_embedding_event(
    self,
    embeddings: List[List[float]],
    data: Optional[Any],
    metadata: Optional[dict[str, Any]] = None,
    name: Optional[str] = None,
):
    """
    Log an event for embedding generation, either during document ingestion or query retrieval.

    embeddings: The list of embeddings created (each embedding is a List[float] vector).
    data: The data that was embedded (e.g. the chunks of text).
    metadata: Use metadata to store other information such as embedding model, etc.
    name: Optional name for the event.
    """

def log_query_rewrite_event(
    self,
    query: str,
    rewritten_query: List[str],
    metadata: Optional[dict[str, Any]] = None,
    name: Optional[str] = None,
):
    """
    Log an event for query rewriting, such as decomposing an incoming query into sub-queries.

    query: The input query.
    rewritten_query: The rewritten query (or queries in some instances).
    metadata: Use metadata to store other information such as embedding model, etc.
    name: Optional name for the event.
    """

def log_retrieval_event(
    self,
    query: str,
    retrieved_data: List[RetrievedChunk],
    metadata: Optional[dict[str, Any]] = None,
    name: Optional[str] = None,
):
    """
    Log an event for the vector DB lookup for a query.

    query: The string used to retrieve data from the vector store.
    retrieved_data: The data retrieved from the DB.
    metadata: Use metadata to store other information such as reranker model, etc.
    name: Optional name for the event.
    """

def log_rerank_event(
    self,
    retrieved: List[RetrievedChunk],
    reranked: List[RetrievedChunk],
    metadata: Optional[dict[str, Any]] = None,
    name: Optional[str] = None,
):
    """
    Log an event for the reranking of retrieved data.

    retrieved: The list of chunks that were retrieved.
    reranked: The reranked list of chunks (in order).
    metadata: Use metadata to store other information such as the reranking model, etc.
    name: Optional name for the event.
    """

def log_tool_call_event(
    self,
    tool: str,
    tool_args: dict[str, Any],
    tool_result: dict[str, Any],
    metadata: Optional[dict[str, Any]] = None,
    name: Optional[str] = None,
):
    """
    Log an event representing a tool call invocation.

    tool: The name of the tool.
    tool_args: The arguments for invoking the tool.
    tool_result: The result of invoking the tool.
    metadata: Use metadata to store any other information to save with the event.
    name: Optional name for the event.
    """

def log_query_event(
    self,
    query: str,
    response: str | list[str],
    system_prompt: Optional[str] = None,
    name: Optional[str] = None,
    metadata: Optional[core.JSON] = None,
):
    """
    Log an event for the query step.

    query: The query string.
    response: The response (or responses) produced.
    system_prompt: The system prompt used, if any.
    metadata: Any other metadata to save with the event.
    """

Current API

Please see https://github.com/lastmile-ai/eval/blob/main/src/lastmile_eval/rag/debugger/api/add_rag_event_interface.py

User Feedback APIs

APIs for logging user feedback in various forms.

def log_feedback(
    self,
    feedback: str | dict[str, Any],
) -> None:
    """
    Log user feedback.

    If there is a currently active trace/span, the feedback gets logged to that trace and span.
    Otherwise, it is logged to the project.

    feedback: A string or JSON-serializable object representing user feedback.
    """

def log_binary_feedback(
    self,
    value: bool,
    trace_id: Optional[str],
) -> None:
    """
    Log binary feedback (positive or negative) for the current trace.

    Examples of binary feedback include thumbs up/down, helpful/unhelpful, or any other
    binary sentiment indicator.

    value: The binary feedback value. True for positive feedback, False for negative feedback.
    trace_id: Optional trace ID to associate the feedback with.
    """

def log_numeric_feedback(
    self,
    value: float,
    trace_id: Optional[str],
) -> None:
    """
    Log numeric feedback (between 0 and 1) for a specific trace.

    Examples of numeric feedback include a rating scale normalized between 0 and 1 (inclusive).

    Args:
        value: The numeric feedback value within the range (0.0 - 1.0).
        trace_id: Optional trace ID to associate the feedback with.
    """

def log_categorical_feedback(
    self,
    value: str,
    categories: list[str],
    trace_id: Optional[str],
) -> None:
    """
    Log categorical feedback for a specific trace.

    Categorical feedback allows users to select a category or label that best
    represents their assessment.

    Args:
        value: The selected category or label for the feedback.
        categories: The list of available categories or labels to choose from.
        trace_id: Optional trace ID to associate the feedback with.
    """

def log_text_feedback(
    self,
    value: str,
    trace_id: Optional[str],
) -> None:
    """
    Log text feedback for a specific trace.

    Text feedback allows users to provide open-ended and unstructured feedback
    about the assistant's response. It can capture more detailed and specific
    comments or suggestions from the user.

    Args:
        value: The text feedback provided by the user.
        trace_id: Optional trace ID to associate the feedback with.
    """

Current API

Please see https://github.com/lastmile-ai/eval/blob/main/src/lastmile_eval/rag/debugger/api/tracing.py#L213

Logger API

The current API is sufficient as a base logger API

Current API

def log(
    self,
    data: Any,
    logger: Optional[Logger] = None,
) -> None:
    """
    Pass in data to be logged by a separate logger. This is useful if you
    don't want to explicitly save data into span attributes, RAG events, or
    param sets, but still want to log it anyway to look at later, such as
    warnings, errors, or debugging info. These logs are saved to the
    LastMile database and can be accessed later when looking at trace data
    in the UI debugging tool.

    @param data (Any): The data to be logged. It gets converted to string
        format using the repr() function.
    @param logger Optional(Union[Logger, str]): The logger to use for
        saving the data. If it's a str, we assume it's the name of a
        filepath. You can have multiple loggers across different calls to
        save data to different places.
        If not defined, we use the default logger associated with
        the tracer and all the data from log calls gets saved together.
    """

Deprecated / WIP

🚨 PLEASE READ - This API reference represents the Tracing SDK from 4/24/24. This requires an update as we are making significant changes to the SDK.

LastMileTracer

A class responsible for tracing events and parameters of your RAG application. It is a tracer proxy around the OpenTelemetry tracer and provides the following core functionality:

  1. Create span data and attach it to the tracer. This is the same API as OpenTelemetry's tracer.
  2. Mark RAG events and store their states alongside the trace data.
  3. Register a dictionary of parameters to be logged and associated with the trace data.

Implemented methods:

  • start_as_current_span
  • start_span
  • add_rag_event_for_span
  • add_rag_event_for_trace
  • register_param
  • get_params

Example:

# Instantiate a tracer object
tracer = get_lastmile_tracer("my_rag_tracer")

start_as_current_span

Start a new span and set it as the current span in this tracer's context.

@abc.abstractmethod
@contextmanager
def start_as_current_span(
    self,
    name: str,
    context: Optional[context_api.Context] = None,
    kind: trace_api.SpanKind = trace_api.SpanKind.INTERNAL,
    attributes: types.Attributes = None,
    links: Optional[Sequence[trace_api.Link]] = None,
    start_time: Optional[int] = None,
    record_exception: bool = True,
    set_status_on_exception: bool = True,
    end_on_exit: bool = True
) -> Iterator[Span]

Arguments:

  • name - The name of the span to start.
  • context - (optional) The context to start the span in.
  • kind - (optional) The kind of span to start, defaults to INTERNAL.
  • attributes - (optional) Any attributes to associate with the span.
  • links - (optional) Any links to associate with the span.
  • start_time - (optional) The start time of the span.
  • record_exception - (optional) Whether to record exceptions that occur while the span is active.
  • set_status_on_exception - (optional) Whether to set the status on exception.
  • end_on_exit - (optional) Whether to end the span on exit.

Returns:

  • An iterator over the current Span.

Example:

# You can use the tracer either as a decorator around a function (like below)
# or with the "with ... as span_variable_name:" syntax
@tracer.start_as_current_span("ingestion-root-span")
def run_ingestion_flow():
    documents = SimpleDirectoryReader("./data/paul_graham/").load_data()

    # Register the doc file paths as a parameter
    doc_file_paths = [
        doc.metadata.get("file_path")
        for doc in documents
        if doc.metadata.get("file_path") is not None
    ]
    ...

start_span

Start a new span without setting it as the current span in this tracer's context.

def start_span(
    self,
    name: str,
    context: Optional[context_api.Context] = None,
    kind: trace_api.SpanKind = trace_api.SpanKind.INTERNAL,
    attributes: types.Attributes = None,
    links: Sequence[trace_api.Link] = (),
    start_time: Optional[int] = None,
    record_exception: bool = True,
    set_status_on_exception: bool = True
) -> Span

Arguments:

  • name - The name of the span to start.
  • context - (optional) The context to start the span in.
  • kind - (optional) The kind of span to start, defaults to INTERNAL.
  • attributes - (optional) Any attributes to associate with the span.
  • links - (optional) Any links to associate with the span.
  • start_time - (optional) The start time of the span.
  • record_exception - (optional) Whether to record exceptions that occur while the span is active.
  • set_status_on_exception - (optional) Whether to set the status on exception.

Returns:

  • The started Span.

register_param

Define a parameter key-value pair to save for the current trace instance.

def register_param(
    self,
    key: str,
    value: Any,
    should_also_save_in_span: bool = True,
    span: Optional[Span] = None
) -> None

Arguments:

  • key - The name of the parameter to be saved.
  • value - The value of the parameter to be saved.
  • should_also_save_in_span - (optional) Whether to also save the key-value pair in the current span attributes data. Defaults to true.
  • span - (optional) The span to save the key-value pair in addition to regular paramSet.

Example:

# Register chunk_size as a parameter in this trace's parameter set
chunk_size = 512
tracer.register_param("chunk_size", chunk_size)

get_params

Returns the dictionary that contains all the parameters that have been registered with a trace so far.

def get_params(self) -> dict[str, Any]

Returns:

  • A dictionary containing all the registered parameters.
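
Example (continuing the register_param example above):

tracer.register_param("chunk_size", 512)
params = tracer.get_params()
# e.g. {"chunk_size": 512, ...}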