Field that can be configured by the user with a default value.
runnables.utils.ConfigurableFieldSpec(id, ...)
Field that can be configured by the user.
runnables.utils.FunctionNonLocals()
Get the nonlocal variables accessed by a function.
runnables.utils.GetLambdaSource()
Get the source code of a lambda function.
runnables.utils.IsFunctionArgDict()
Check if the first argument of a function is a dict.
runnables.utils.IsLocalDict(name, keys)
Check if a name is a local dict.
runnables.utils.NonLocals()
Get nonlocal variables accessed.
runnables.utils.SupportsAdd(*args, **kwargs)
Protocol for objects that support addition.
Functions¶
runnables.base.chain()
Decorate a function to make it a Runnable.
runnables.base.coerce_to_runnable(thing)
Coerce a runnable-like object into a Runnable.
runnables.config.acall_func_with_variable_args(...)
Call function that may optionally accept a run_manager and/or config.
runnables.config.call_func_with_variable_args(...)
Call function that may optionally accept a run_manager and/or config.
runnables.config.ensure_config([config])
Ensure that a config is a dict with all keys present.
runnables.config.get_async_callback_manager_for_config(config)
Get an async callback manager for a config.
runnables.config.get_callback_manager_for_config(config)
Get a callback manager for a config.
runnables.config.get_config_list(config, length)
Get a list of configs from a single config or a list of configs.
runnables.config.get_executor_for_config(config)
Get an executor for a config.
runnables.config.merge_configs(*configs)
Merge multiple configs into one.
runnables.config.patch_config(config, *[, ...])
Patch a config with new values.
runnables.config.run_in_executor(...)
Run a function in an executor.
runnables.configurable.make_options_spec(...)
Make a ConfigurableFieldSpec for a ConfigurableFieldSingleOption or ConfigurableFieldMultiOption.
runnables.configurable.prefix_config_spec(...)
Prefix the id of a ConfigurableFieldSpec.
runnables.graph.is_uuid(value)
Check if a string is a valid UUID.
runnables.graph.node_data_json(node, *[, ...])
Convert the data of a node to a JSON-serializable format.
runnables.graph.node_data_str(node)
Convert the data of a node to a string.
runnables.graph_ascii.draw_ascii(vertices, edges)
Build a DAG and draw it in ASCII.
runnables.graph_mermaid.draw_mermaid(nodes, ...)
Draws a Mermaid graph using the provided graph data
runnables.graph_mermaid.draw_mermaid_png(...)
Draws a Mermaid graph as PNG using provided syntax.
runnables.passthrough.aidentity(x)
Async identity function
runnables.passthrough.identity(x)
Identity function
runnables.utils.aadd(addables)
Asynchronously add a sequence of addable objects together.
runnables.utils.accepts_config(callable)
Check if a callable accepts a config argument.
runnables.utils.accepts_context(callable)
Check if a callable accepts a context argument.
runnables.utils.accepts_run_manager(callable)
Check if a callable accepts a run_manager argument.
runnables.utils.add(addables)
Add a sequence of addable objects together.
runnables.utils.create_model(__model_name, ...)
Create a pydantic model with the given field definitions.
runnables.utils.gated_coro(semaphore, coro)
Run a coroutine with a semaphore.
runnables.utils.gather_with_concurrency(n, ...)
Gather coroutines with a limit on the number of concurrent coroutines.
runnables.utils.get_function_first_arg_dict_keys(func)
Get the keys of the first argument of a function if it is a dict.
runnables.utils.get_function_nonlocals(func)
Get the nonlocal variables accessed by a function.
runnables.utils.get_lambda_source(func)
Get the source code of a lambda function.
runnables.utils.get_unique_config_specs(specs)
Get the unique config specs from a sequence of config specs.
runnables.utils.indent_lines_after_first(...)
Indent all lines of text after the first line.
runnables.utils.is_async_callable(func)
Check if a function is async.
runnables.utils.is_async_generator(func)
Check if a function is an async generator.
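Several of the concurrency helpers above (`gated_coro`, `gather_with_concurrency`) wrap standard asyncio primitives. A minimal stdlib sketch of concurrency-limited gathering, written from the names and one-line summaries alone (the real implementations may differ):

```python
import asyncio

async def gated_coro(semaphore: asyncio.Semaphore, coro):
    # Run a single coroutine while holding the semaphore.
    async with semaphore:
        return await coro

async def gather_with_concurrency(n, *coros):
    # Gather coroutines, allowing at most n to run at once;
    # results come back in submission order, as with asyncio.gather.
    semaphore = asyncio.Semaphore(n)
    return await asyncio.gather(*(gated_coro(semaphore, c) for c in coros))

async def _work(i):
    await asyncio.sleep(0.01)
    return i * 2

results = asyncio.run(gather_with_concurrency(2, *(_work(i) for i in range(5))))
print(results)  # [0, 2, 4, 6, 8]
```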
langchain_core.stores¶
Store implements the key-value stores and storage helpers.
Module provides implementations of various key-value stores that conform
to a simple key-value interface.
The primary goal of these storages is to support implementation of caching.
Classes¶
stores.BaseStore()
Abstract interface for a key-value store.
stores.InMemoryBaseStore()
In-memory implementation of the BaseStore using a dictionary.
stores.InMemoryByteStore()
In-memory store for bytes.
stores.InMemoryStore()
In-memory store for any type of data.
stores.InvalidKeyException
Raised when a key is invalid; e.g., uses incorrect characters.
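The BaseStore interface listed above centers on batched operations (`mget`, `mset`, `mdelete`, `yield_keys`). A hypothetical stdlib re-implementation of an in-memory store with that shape, illustrating the contract rather than the library's actual class:

```python
from typing import Dict, Generic, Iterator, List, Optional, Sequence, Tuple, TypeVar

V = TypeVar("V")

class InMemoryStoreSketch(Generic[V]):
    """Illustrative in-memory key-value store mirroring the BaseStore interface."""

    def __init__(self) -> None:
        self._store: Dict[str, V] = {}

    def mset(self, key_value_pairs: Sequence[Tuple[str, V]]) -> None:
        # Set values for multiple keys in one call.
        for key, value in key_value_pairs:
            self._store[key] = value

    def mget(self, keys: Sequence[str]) -> List[Optional[V]]:
        # Get values for multiple keys; missing keys yield None.
        return [self._store.get(key) for key in keys]

    def mdelete(self, keys: Sequence[str]) -> None:
        # Delete multiple keys, ignoring ones that are absent.
        for key in keys:
            self._store.pop(key, None)

    def yield_keys(self, prefix: Optional[str] = None) -> Iterator[str]:
        # Iterate over stored keys, optionally filtered by prefix.
        for key in self._store:
            if prefix is None or key.startswith(prefix):
                yield key

store = InMemoryStoreSketch[bytes]()
store.mset([("doc:1", b"hello"), ("doc:2", b"world")])
print(store.mget(["doc:1", "missing"]))  # [b'hello', None]
```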
langchain_core.structured_query¶
Internal representation of a structured query language.
Classes¶
structured_query.Comparator(value)
Enumerator of the comparison operators.
structured_query.Comparison
Comparison to a value.
structured_query.Expr
Base class for all expressions.
structured_query.FilterDirective
Filtering expression.
structured_query.Operation
Logical operation over other directives.
structured_query.Operator(value)
Enumerator of the operations.
structured_query.StructuredQuery
Structured query.
structured_query.Visitor()
Defines interface for IR translation using visitor pattern.
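To illustrate the visitor-pattern translation that `Visitor` defines, here is a hypothetical, self-contained sketch: simplified `Comparison`/`Operation` nodes and a visitor that renders them as a SQL-like filter string. The node fields and output syntax are assumptions for illustration, not the library's definitions:

```python
from dataclasses import dataclass
from typing import Any, List, Union

@dataclass
class Comparison:
    comparator: str   # e.g. "eq", "gt"
    attribute: str
    value: Any

@dataclass
class Operation:
    operator: str     # e.g. "and", "or"
    arguments: List[Union["Operation", "Comparison"]]

class SQLishVisitor:
    """Translate the IR tree into a SQL-like filter string (hypothetical backend)."""

    OPS = {"eq": "=", "gt": ">", "lt": "<"}

    def visit(self, node) -> str:
        # Dispatch on node type, recursing through nested operations.
        if isinstance(node, Comparison):
            return self.visit_comparison(node)
        return self.visit_operation(node)

    def visit_comparison(self, node: Comparison) -> str:
        return f"{node.attribute} {self.OPS[node.comparator]} {node.value!r}"

    def visit_operation(self, node: Operation) -> str:
        joiner = f" {node.operator.upper()} "
        return "(" + joiner.join(self.visit(a) for a in node.arguments) + ")"

query = Operation("and", [Comparison("eq", "genre", "scifi"),
                          Comparison("gt", "year", 1990)])
print(SQLishVisitor().visit(query))  # (genre = 'scifi' AND year > 1990)
```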
langchain_core.sys_info¶
sys_info prints information about the system and langchain packages
for debugging purposes.
Functions¶
sys_info.print_sys_info(*[, additional_pkgs])
Print information about the environment for debugging purposes.
langchain_core.tools¶
Tools are classes that an Agent uses to interact with the world.
Each tool has a description. Agent uses the description to choose the right
tool for the job.
Class hierarchy:
RunnableSerializable --> BaseTool --> <name>Tool # Examples: AIPluginTool, BaseGraphQLTool
<name> # Examples: BraveSearch, HumanInputRun
Main helpers:
CallbackManagerForToolRun, AsyncCallbackManagerForToolRun
Classes¶
tools.BaseTool
Interface LangChain tools must implement.
tools.BaseToolkit
Base Toolkit representing a collection of related tools.
tools.RetrieverInput
Input to the retriever.
tools.SchemaAnnotationError
Raised when 'args_schema' is missing or has an incorrect type annotation.
tools.StructuredTool
Tool that can operate on any number of inputs.
tools.Tool
Tool that takes in function or coroutine directly.
tools.ToolException
Optional exception that tool throws when execution error occurs.
Functions¶
tools.create_retriever_tool(retriever, name, ...)
Create a tool to do retrieval of documents.
tools.create_schema_from_function(...)
Create a pydantic schema from a function's signature.
tools.render_text_description(tools)
Render the tool name and description in plain text.
tools.render_text_description_and_args(tools)
Render the tool name, description, and args in plain text.
tools.tool(*args[, return_direct, ...])
Make tools out of functions, can be used with or without arguments.
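As a conceptual sketch of what the `tool` decorator does, deriving a tool's name and description from a plain function and its docstring; `SimpleTool` and `tool_sketch` are hypothetical stand-ins, not the LangChain API:

```python
import inspect
from dataclasses import dataclass
from typing import Any, Callable, Dict

@dataclass
class SimpleTool:
    # Minimal stand-in for a tool: a name, a description, and a callable.
    name: str
    description: str
    func: Callable[..., Any]

    def invoke(self, tool_input: Dict[str, Any]) -> Any:
        return self.func(**tool_input)

def tool_sketch(func: Callable[..., Any]) -> SimpleTool:
    # Derive the tool's name from the function and its description
    # from the docstring, so an agent can choose it by description.
    return SimpleTool(
        name=func.__name__,
        description=inspect.getdoc(func) or "",
        func=func,
    )

@tool_sketch
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

print(multiply.name, "->", multiply.invoke({"a": 6, "b": 7}))  # multiply -> 42
```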
langchain_core.tracers¶
Tracers are classes for tracing runs.
Class hierarchy:
BaseCallbackHandler --> BaseTracer --> <name>Tracer # Examples: LangChainTracer, RootListenersTracer
--> <name> # Examples: LogStreamCallbackHandler
Classes¶
tracers.base.AsyncBaseTracer(*[, _schema_format])
Async Base interface for tracers.
tracers.base.BaseTracer(*[, _schema_format])
Base interface for tracers.
tracers.evaluation.EvaluatorCallbackHandler(...)
Tracer that runs a run evaluator whenever a run is persisted.
tracers.event_stream.RunInfo
Information about a run.
tracers.langchain.LangChainTracer([...])
Implementation of the SharedTracer that POSTS to the LangChain endpoint.
tracers.log_stream.LogEntry
A single entry in the run log.
tracers.log_stream.LogStreamCallbackHandler(*)
Tracer that streams run logs to a stream.
tracers.log_stream.RunLog(*ops, state)
Run log.
tracers.log_stream.RunLogPatch(*ops)
Patch to the run log.
tracers.log_stream.RunState
State of the run.
tracers.root_listeners.AsyncRootListenersTracer(*, ...)
Async Tracer that calls listeners on run start, end, and error.
tracers.root_listeners.RootListenersTracer(*, ...)
Tracer that calls listeners on run start, end, and error.
tracers.run_collector.RunCollectorCallbackHandler([...])
Tracer that collects all nested runs in a list.
tracers.schemas.BaseRun
[Deprecated] Base class for Run.
tracers.schemas.ChainRun
[Deprecated] Class for ChainRun.
tracers.schemas.LLMRun
[Deprecated] Class for LLMRun.
tracers.schemas.Run
Run schema for the V2 API in the Tracer.
tracers.schemas.ToolRun
[Deprecated] Class for ToolRun.
tracers.schemas.TracerSession
[Deprecated] TracerSessionV1 schema for the V2 API.
tracers.schemas.TracerSessionBase
[Deprecated] Base class for TracerSession.
tracers.schemas.TracerSessionV1
[Deprecated] TracerSessionV1 schema.
tracers.schemas.TracerSessionV1Base
[Deprecated] Base class for TracerSessionV1.
tracers.schemas.TracerSessionV1Create
[Deprecated] Create class for TracerSessionV1.
tracers.stdout.ConsoleCallbackHandler(**kwargs)
Tracer that prints to the console.
tracers.stdout.FunctionCallbackHandler(...)
Tracer that calls a function with a single str parameter.
Functions¶
tracers.context.collect_runs()
Collect all run traces in context.
tracers.context.register_configure_hook(...)
Register a configure hook.
tracers.context.tracing_enabled([session_name])
Throw an error because this has been replaced by tracing_v2_enabled.
tracers.context.tracing_v2_enabled([...])
Instruct LangChain to log all runs in context to LangSmith.
tracers.evaluation.wait_for_all_evaluators()
Wait for all tracers to finish.
tracers.langchain.get_client()
Get the client.
tracers.langchain.log_error_once(method, ...)
Log an error once.
tracers.langchain.wait_for_all_tracers()
Wait for all tracers to finish.
tracers.langchain_v1.LangChainTracerV1(...)
Throw an error because this has been replaced by LangChainTracer.
tracers.langchain_v1.get_headers(*args, **kwargs)
Throw an error because this has been replaced by get_headers.
tracers.schemas.RunTypeEnum()
[Deprecated] RunTypeEnum.
tracers.stdout.elapsed(run)
Get the elapsed time of a run.
tracers.stdout.try_json_stringify(obj, fallback)
Try to stringify an object to JSON.
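`try_json_stringify` suggests a simple contract: serialize the object if possible, otherwise return the fallback string. A stdlib sketch consistent with that summary (the exact formatting options are assumptions):

```python
import json
from typing import Any

def try_json_stringify(obj: Any, fallback: str) -> str:
    # Attempt to serialize obj as JSON; return the fallback string
    # when obj is not JSON-serializable.
    try:
        return json.dumps(obj, indent=2, ensure_ascii=False)
    except Exception:
        return fallback

print(try_json_stringify({"run": "ok"}, "<unserializable>"))
print(try_json_stringify({1, 2, 3}, "<unserializable>"))  # sets are not JSON-serializable
```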
langchain_core.utils¶
Utility functions for LangChain.
These functions do not depend on any other LangChain module.
Classes¶
utils.aiter.NoLock()
Dummy lock that provides the proper interface but no protection
utils.aiter.Tee(iterable[, n, lock])
Create n separate asynchronous iterators over iterable.
utils.aiter.atee
alias of Tee
utils.formatting.StrictFormatter()
Formatter that checks for extra keys.
utils.function_calling.FunctionDescription
Representation of a callable function to send to an LLM.
utils.function_calling.ToolDescription
Representation of a callable function to the OpenAI API.
utils.iter.NoLock()
Dummy lock that provides the proper interface but no protection
utils.iter.Tee(iterable[, n, lock])
Create n separate asynchronous iterators over iterable
utils.iter.safetee
alias of Tee
utils.mustache.ChevronError
Custom exception for Chevron errors.
Functions¶
utils.aiter.py_anext(iterator[, default])
Pure-Python implementation of anext() for testing purposes.
utils.aiter.tee_peer(iterator, buffer, ...)
An individual iterator of a tee()
utils.env.env_var_is_set(env_var)
Check if an environment variable is set.
utils.env.get_from_dict_or_env(data, key, ...)
Get a value from a dictionary or an environment variable.
utils.env.get_from_env(key, env_key[, default])
Get a value from a dictionary or an environment variable.
utils.function_calling.convert_pydantic_to_openai_function(...)
[Deprecated] Converts a Pydantic model to a function description for the OpenAI API.
utils.function_calling.convert_pydantic_to_openai_tool(...)
[Deprecated] Converts a Pydantic model to a function description for the OpenAI API.
utils.function_calling.convert_python_function_to_openai_function(...)
[Deprecated] Convert a Python function to an OpenAI function-calling API compatible dict.
utils.function_calling.convert_to_openai_function(...)
Convert a raw function/class to an OpenAI function.
utils.function_calling.convert_to_openai_tool(tool)
Convert a raw function/class to an OpenAI tool.
utils.function_calling.format_tool_to_openai_function(tool)
[Deprecated] Format tool into the OpenAI function API.
utils.function_calling.format_tool_to_openai_tool(tool)
[Deprecated] Format tool into the OpenAI function API.
utils.function_calling.tool_example_to_messages(...)
Convert an example into a list of messages that can be fed into an LLM.
utils.html.extract_sub_links(raw_html, url, *)
Extract all links from a raw html string and convert into absolute paths.
utils.html.find_all_links(raw_html, *[, pattern])
Extract all links from a raw html string.
utils.image.encode_image(image_path)
Get base64 string from image URI.
utils.image.image_to_data_url(image_path)
utils.input.get_bolded_text(text)
Get bolded text.
utils.input.get_color_mapping(items[, ...])
Get mapping for items to a support color.
utils.input.get_colored_text(text, color)
Get colored text.
utils.input.print_text(text[, color, end, file])
Print text with highlighting and no end characters.
utils.interactive_env.is_interactive_env()
Determine if running within IPython or Jupyter.
utils.iter.batch_iterate(size, iterable)
Utility batching function.
utils.iter.tee_peer(iterator, buffer, peers, ...)
An individual iterator of a tee()
utils.json.parse_and_check_json_markdown(...)
Parse a JSON string from a Markdown string and check that it contains the expected keys.
utils.json.parse_json_markdown(json_string, *)
Parse a JSON string from a Markdown string.
utils.json.parse_partial_json(s, *[, strict])
Parse a JSON string that may be missing closing braces.
utils.json_schema.dereference_refs(schema_obj, *)
Try to substitute $refs in JSON Schema.
utils.loading.try_load_from_hub(*args, **kwargs)
[Deprecated]
utils.mustache.grab_literal(template, l_del)
Parse a literal from the template.
utils.mustache.l_sa_check(template, literal, ...)
Do a preliminary check to see if a tag could be a standalone.
utils.mustache.parse_tag(template, l_del, r_del)
Parse a tag from a template.
utils.mustache.r_sa_check(template, ...)
Do a final check to see if a tag could be a standalone.
utils.mustache.render([template, data, ...])
Render a mustache template.
utils.mustache.tokenize(template[, ...])
Tokenize a mustache template.
utils.pydantic.get_pydantic_major_version()
Get the major version of Pydantic.
utils.strings.comma_list(items)
Convert a list to a comma-separated string.
utils.strings.stringify_dict(data)
Stringify a dictionary.
utils.strings.stringify_value(val)
Stringify a value.
utils.utils.build_extra_kwargs(extra_kwargs, ...)
Build extra kwargs from values and extra_kwargs.
utils.utils.check_package_version(package[, ...])
Check the version of a package.
utils.utils.convert_to_secret_str(value)
Convert a string to a SecretStr if needed.
utils.utils.get_pydantic_field_names(...)
Get field names, including aliases, for a pydantic class.
utils.utils.guard_import(module_name, *[, ...])
Dynamically import a module and raise an exception if the module is not installed.
utils.utils.mock_now(dt_value)
Context manager for mocking out datetime.now() in unit tests.
utils.utils.raise_for_status_with_text(response)
Raise an error with the response text.
utils.utils.xor_args(*arg_groups)
Validate specified keyword args are mutually exclusive.
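`batch_iterate(size, iterable)` can be sketched in a few lines of stdlib Python; this is an illustrative re-implementation inferred from the signature and summary, not the library source:

```python
from itertools import islice
from typing import Iterable, Iterator, List, Optional, TypeVar

T = TypeVar("T")

def batch_iterate(size: Optional[int], iterable: Iterable[T]) -> Iterator[List[T]]:
    # Yield successive lists of at most `size` items from the iterable;
    # the final batch may be smaller.
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

print(list(batch_iterate(3, range(8))))  # [[0, 1, 2], [3, 4, 5], [6, 7]]
```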
langchain_core.vectorstores¶
Vector store stores embedded data and performs vector search.
One of the most common ways to store and search over unstructured data is to
embed it and store the resulting embedding vectors, and then query the store
and retrieve the data that are ‘most similar’ to the embedded query.
Class hierarchy:
VectorStore --> <name> # Examples: Annoy, FAISS, Milvus
BaseRetriever --> VectorStoreRetriever --> <name>Retriever # Example: VespaRetriever
Main helpers:
Embeddings, Document
Classes¶
vectorstores.VectorStore()
Interface for vector store.
vectorstores.VectorStoreRetriever
Base Retriever class for VectorStore.
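The embed-store-query loop described at the top of this section can be sketched end to end; everything here (`TinyVectorStore`, the toy character-count embedding) is hypothetical and stands in for a real `VectorStore` plus `Embeddings` model:

```python
import math
from typing import Callable, List, Tuple

def cosine_similarity(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

class TinyVectorStore:
    """Embed texts on add, and return the texts 'most similar' to a query."""

    def __init__(self, embed: Callable[[str], List[float]]):
        self.embed = embed
        self.rows: List[Tuple[str, List[float]]] = []

    def add_texts(self, texts: List[str]) -> None:
        for text in texts:
            self.rows.append((text, self.embed(text)))

    def similarity_search(self, query: str, k: int = 1) -> List[str]:
        # Rank stored rows by cosine similarity to the embedded query.
        q = self.embed(query)
        ranked = sorted(self.rows, key=lambda row: cosine_similarity(q, row[1]),
                        reverse=True)
        return [text for text, _ in ranked[:k]]

# A toy "embedding": letter counts over a fixed vocabulary. A real store
# would use an Embeddings model instead.
def toy_embed(text: str) -> List[float]:
    vocab = "abcdefghijklmnopqrstuvwxyz"
    return [float(text.lower().count(ch)) for ch in vocab]

store = TinyVectorStore(toy_embed)
store.add_texts(["cat", "dog", "catalog"])
print(store.similarity_search("cats", k=1))  # ['cat']
```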
langchain_anthropic 0.1.15¶
langchain_anthropic.chat_models¶
Classes¶
chat_models.AnthropicTool
chat_models.ChatAnthropic
Anthropic chat model integration.
chat_models.ChatAnthropicMessages
[Deprecated]
Functions¶
chat_models.convert_to_anthropic_tool(tool)
langchain_anthropic.experimental¶
Classes¶
experimental.ChatAnthropicTools
[Deprecated] Chat model for interacting with Anthropic functions.
Functions¶
experimental.get_system_message(tools)
langchain_anthropic.llms¶
Classes¶
llms.Anthropic
[Deprecated]
llms.AnthropicLLM
Anthropic large language model.
langchain_anthropic.output_parsers¶
Classes¶
output_parsers.ToolsOutputParser
Create a new model by parsing and validating input data from keyword arguments.
Functions¶
output_parsers.extract_tool_calls(content)
langchain_astradb 0.3.3¶
langchain_astradb.cache¶
Classes¶
cache.AstraDBCache(*[, collection_name, ...])
Cache that uses Astra DB as a backend.
cache.AstraDBSemanticCache(*[, ...])
Cache that uses Astra DB as a vector-store backend for semantic (i.e. similarity-based) lookup.
langchain_astradb.chat_message_histories¶
Astra DB - based chat message history, based on astrapy.
Classes¶
chat_message_histories.AstraDBChatMessageHistory(*, ...)
Chat message history that stores history in Astra DB.
langchain_astradb.document_loaders¶
Classes¶
document_loaders.AstraDBLoader(...)
Load DataStax Astra DB documents.
langchain_astradb.storage¶
Classes¶
storage.AstraDBBaseStore(*args, **kwargs)
Base class for the DataStax AstraDB data store.
storage.AstraDBByteStore(*, collection_name)
ByteStore implementation using DataStax AstraDB as the underlying store.
storage.AstraDBStore(collection_name, *[, ...])
BaseStore implementation using DataStax AstraDB as the underlying store.
langchain_astradb.utils¶
Classes¶
utils.astradb.SetupMode(value)
An enumeration.
Functions¶
utils.mmr.cosine_similarity(X, Y)
Row-wise cosine similarity between two equal-width matrices.
utils.mmr.maximal_marginal_relevance(...[, ...])
Calculate maximal marginal relevance.
langchain_astradb.vectorstores¶
Classes¶
vectorstores.AstraDBVectorStore(*, ...[, ...])
Wrapper around DataStax Astra DB for vector-store workloads.
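`maximal_marginal_relevance` implements a standard greedy re-ranking algorithm: each pick trades off relevance to the query against redundancy with what has already been selected. A stdlib sketch of that algorithm (parameter names mirror the listing; defaults and the return type are assumptions):

```python
import math
from typing import List

def cosine(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def maximal_marginal_relevance(query: List[float], embeddings: List[List[float]],
                               lambda_mult: float = 0.5, k: int = 2) -> List[int]:
    # Greedily pick indices maximizing
    #   lambda_mult * sim(query, e) - (1 - lambda_mult) * max sim(e, selected)
    selected: List[int] = []
    while len(selected) < min(k, len(embeddings)):
        best_idx, best_score = -1, -float("inf")
        for i, emb in enumerate(embeddings):
            if i in selected:
                continue
            relevance = cosine(query, emb)
            redundancy = max((cosine(emb, embeddings[j]) for j in selected),
                             default=0.0)
            score = lambda_mult * relevance - (1 - lambda_mult) * redundancy
            if score > best_score:
                best_idx, best_score = i, score
        selected.append(best_idx)
    return selected

query = [1.0, 0.0]
docs = [[1.0, 0.0], [0.99, 0.1], [0.0, 1.0]]
# With a low lambda_mult, diversity wins: the near-duplicate doc 1 is skipped.
print(maximal_marginal_relevance(query, docs, lambda_mult=0.3, k=2))  # [0, 2]
```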
langchain_azure_dynamic_sessions 0.1.0¶
langchain_azure_dynamic_sessions.tools¶
Classes¶
tools.sessions.RemoteFileMetadata(filename, ...)
Metadata for a file in the session.
tools.sessions.SessionsPythonREPLTool
A tool for running Python code in an Azure Container Apps dynamic sessions code interpreter.
langchain_core.callbacks.manager.CallbackManagerForChainRun¶
class langchain_core.callbacks.manager.CallbackManagerForChainRun(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None)[source]¶
Callback manager for chain run.
Initialize the run manager.
Parameters
run_id (UUID) – The ID of the run.
handlers (List[BaseCallbackHandler]) – The list of handlers.
inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers.
parent_run_id (UUID, optional) – The ID of the parent run.
Defaults to None.
tags (Optional[List[str]]) – The list of tags.
inheritable_tags (Optional[List[str]]) – The list of inheritable tags.
metadata (Optional[Dict[str, Any]]) – The metadata.
inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata.
Methods
__init__(*, run_id, handlers, ...[, ...])
Initialize the run manager.
get_child([tag])
Get a child callback manager.
get_noop_manager()
Return a manager that doesn't perform any operations.
on_agent_action(action, **kwargs)
Run when agent action is received.
on_agent_finish(finish, **kwargs)
Run when agent finish is received.
on_chain_end(outputs, **kwargs)
Run when chain ends running.
on_chain_error(error, **kwargs)
Run when chain errors.
on_retry(retry_state, **kwargs)
Run on a retry event.
on_text(text, **kwargs)
Run when text is received.
__init__(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None) → None¶
Initialize the run manager.
Parameters
run_id (UUID) – The ID of the run.
handlers (List[BaseCallbackHandler]) – The list of handlers.
inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers.
parent_run_id (UUID, optional) – The ID of the parent run.
Defaults to None.
tags (Optional[List[str]]) – The list of tags.
inheritable_tags (Optional[List[str]]) – The list of inheritable tags.
metadata (Optional[Dict[str, Any]]) – The metadata.
inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata.
Return type
None
get_child(tag: Optional[str] = None) → CallbackManager¶
Get a child callback manager.
Parameters
tag (str, optional) – The tag for the child callback manager.
Defaults to None.
Returns
The child callback manager.
Return type
CallbackManager
classmethod get_noop_manager() → BRM¶
Return a manager that doesn’t perform any operations.
Returns
The noop manager.
Return type
BaseRunManager
on_agent_action(action: AgentAction, **kwargs: Any) → Any[source]¶
Run when agent action is received.
Parameters
action (AgentAction) – The agent action.
kwargs (Any) –
Returns
The result of the callback.
Return type
Any
on_agent_finish(finish: AgentFinish, **kwargs: Any) → Any[source]¶
Run when agent finish is received.
Parameters
finish (AgentFinish) – The agent finish.
kwargs (Any) –
Returns
The result of the callback.
Return type
Any
on_chain_end(outputs: Union[Dict[str, Any], Any], **kwargs: Any) → None[source]¶
Run when chain ends running.
Parameters
outputs (Union[Dict[str, Any], Any]) – The outputs of the chain.
kwargs (Any) –
Return type
None
on_chain_error(error: BaseException, **kwargs: Any) → None[source]¶
Run when chain errors.
Parameters
error (Exception or KeyboardInterrupt) – The error.
kwargs (Any) –
Return type
None
on_retry(retry_state: RetryCallState, **kwargs: Any) → None¶
Run on a retry event.
Parameters
retry_state (RetryCallState) –
kwargs (Any) –
Return type
None
on_text(text: str, **kwargs: Any) → Any¶
Run when text is received.
Parameters
text (str) – The received text.
kwargs (Any) –
Returns
The result of the callback.
Return type
Any
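The parent/child relationship that `get_child` establishes, with only the "inheritable" tags and metadata flowing down to the child run, can be modelled in a few lines. `RunManagerSketch` is a hypothetical simplification, not the real class:

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional
from uuid import UUID, uuid4

@dataclass
class RunManagerSketch:
    # A child manager records the parent's run_id as its parent_run_id
    # and inherits only the "inheritable" tags and metadata.
    run_id: UUID
    tags: List[str] = field(default_factory=list)
    inheritable_tags: List[str] = field(default_factory=list)
    metadata: Dict[str, Any] = field(default_factory=dict)
    inheritable_metadata: Dict[str, Any] = field(default_factory=dict)
    parent_run_id: Optional[UUID] = None

    def get_child(self, tag: Optional[str] = None) -> "RunManagerSketch":
        # The extra tag applies to the child only; it is not inheritable.
        child_tags = list(self.inheritable_tags)
        if tag is not None:
            child_tags.append(tag)
        return RunManagerSketch(
            run_id=uuid4(),
            tags=child_tags,
            inheritable_tags=list(self.inheritable_tags),
            metadata=dict(self.inheritable_metadata),
            inheritable_metadata=dict(self.inheritable_metadata),
            parent_run_id=self.run_id,
        )

parent = RunManagerSketch(run_id=uuid4(), inheritable_tags=["pipeline"],
                          inheritable_metadata={"env": "dev"})
child = parent.get_child(tag="step-1")
print(child.parent_run_id == parent.run_id, child.tags)  # True ['pipeline', 'step-1']
```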
langchain_core.callbacks.base.RetrieverManagerMixin¶
class langchain_core.callbacks.base.RetrieverManagerMixin[source]¶
Mixin for Retriever callbacks.
Methods
__init__()
on_retriever_end(documents, *, run_id[, ...])
Run when Retriever ends running.
on_retriever_error(error, *, run_id[, ...])
Run when Retriever errors.
__init__()¶
on_retriever_end(documents: Sequence[Document], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any[source]¶
Run when Retriever ends running.
Parameters
documents (Sequence[Document]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_retriever_error(error: BaseException, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any[source]¶
Run when Retriever errors.
Parameters
error (BaseException) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
langchain_community.callbacks.human.AsyncHumanApprovalCallbackHandler¶
class langchain_community.callbacks.human.AsyncHumanApprovalCallbackHandler(approve: ~typing.Callable[[~typing.Any], ~typing.Awaitable[bool]] = <function _adefault_approve>, should_check: ~typing.Callable[[~typing.Dict[str, ~typing.Any]], bool] = <function _default_true>)[source]¶
Asynchronous callback for manually validating values.
Attributes
ignore_agent
Whether to ignore agent callbacks.
ignore_chain
Whether to ignore chain callbacks.
ignore_chat_model
Whether to ignore chat model callbacks.
ignore_llm
Whether to ignore LLM callbacks.
ignore_retriever
Whether to ignore retriever callbacks.
ignore_retry
Whether to ignore retry callbacks.
raise_error
run_inline
Methods
__init__([approve, should_check])
on_agent_action(action, *, run_id[, ...])
Run on agent action.
on_agent_finish(finish, *, run_id[, ...])
Run on agent end.
on_chain_end(outputs, *, run_id[, ...])
Run when chain ends running.
on_chain_error(error, *, run_id[, ...])
Run when chain errors.
on_chain_start(serialized, inputs, *, run_id)
Run when chain starts running.
on_chat_model_start(serialized, messages, *, ...)
Run when a chat model starts running.
on_llm_end(response, *, run_id[, ...])
Run when LLM ends running.
on_llm_error(error, *, run_id[, ...])
Run when LLM errors.
on_llm_new_token(token, *[, chunk, ...])
Run on new LLM token.
on_llm_start(serialized, prompts, *, run_id)
Run when LLM starts running.
on_retriever_end(documents, *, run_id[, ...])
Run on retriever end.
on_retriever_error(error, *, run_id[, ...])
Run on retriever error.
on_retriever_start(serialized, query, *, run_id)
Run on retriever start.
on_retry(retry_state, *, run_id[, parent_run_id])
Run on a retry event.
on_text(text, *, run_id[, parent_run_id, tags])
Run on arbitrary text.
on_tool_end(output, *, run_id[, ...])
Run when tool ends running.
on_tool_error(error, *, run_id[, ...])
Run when tool errors.
on_tool_start(serialized, input_str, *, run_id)
Run when tool starts running.
Parameters
approve (Callable[[Any], Awaitable[bool]]) –
should_check (Callable[[Dict[str, Any]], bool]) –
__init__(approve: ~typing.Callable[[~typing.Any], ~typing.Awaitable[bool]] = <function _adefault_approve>, should_check: ~typing.Callable[[~typing.Dict[str, ~typing.Any]], bool] = <function _default_true>)[source]¶
Parameters
approve (Callable[[Any], Awaitable[bool]]) –
should_check (Callable[[Dict[str, Any]], bool]) –
async on_agent_action(action: AgentAction, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶
Run on agent action.
Parameters
action (AgentAction) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
None
async on_agent_finish(finish: AgentFinish, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶
Run on agent end.
Parameters
finish (AgentFinish) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
None
async on_chain_end(outputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶
Run when chain ends running.
Parameters
outputs (Dict[str, Any]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
None
async on_chain_error(error: BaseException, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶
Run when chain errors.
Parameters
error (BaseException) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
None
async on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶
Run when chain starts running.
Parameters
serialized (Dict[str, Any]) –
inputs (Dict[str, Any]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
None
async on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when a chat model starts running.
ATTENTION: This method is called for chat models. If you’re implementing a handler for a non-chat model, you should use on_llm_start instead.
Parameters
serialized (Dict[str, Any]) –
messages (List[List[BaseMessage]]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
async on_llm_end(response: LLMResult, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶
Run when LLM ends running.
Parameters
response (LLMResult) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
None
async on_llm_error(error: BaseException, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶
Run when LLM errors.
Parameters
error (BaseException) – The error that occurred.
kwargs (Any) – Additional keyword arguments.
response (LLMResult): The response which was generated before the error occurred.
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
Return type
None
async on_llm_new_token(token: str, *, chunk: Optional[Union[GenerationChunk, ChatGenerationChunk]] = None, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶
Run on new LLM token. Only available when streaming is enabled.
Parameters
token (str) –
chunk (Optional[Union[GenerationChunk, ChatGenerationChunk]]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
None
async on_llm_start(serialized: Dict[str, Any], prompts: List[str], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.human.AsyncHumanApprovalCallbackHandler.html |
Run when LLM starts running.
ATTENTION: This method is called for non-chat models (regular LLMs). If you’re implementing a handler for a chat model, you should use on_chat_model_start instead.
Parameters
serialized (Dict[str, Any]) –
prompts (List[str]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
None
async on_retriever_end(documents: Sequence[Document], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶
Run on retriever end.
Parameters
documents (Sequence[Document]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
None
async on_retriever_error(error: BaseException, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶
Run on retriever error.
Parameters
error (BaseException) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
None
async on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → None¶ | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.human.AsyncHumanApprovalCallbackHandler.html |
Run on retriever start.
Parameters
serialized (Dict[str, Any]) –
query (str) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
None
async on_retry(retry_state: RetryCallState, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run on a retry event.
Parameters
retry_state (RetryCallState) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
async on_text(text: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶
Run on arbitrary text.
Parameters
text (str) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
None
async on_tool_end(output: Any, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶
Run when tool ends running.
Parameters
output (Any) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
None | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.human.AsyncHumanApprovalCallbackHandler.html |
async on_tool_error(error: BaseException, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, **kwargs: Any) → None¶
Run when tool errors.
Parameters
error (BaseException) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
None
async on_tool_start(serialized: Dict[str, Any], input_str: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any[source]¶
Run when tool starts running.
Parameters
serialized (Dict[str, Any]) –
input_str (str) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.human.AsyncHumanApprovalCallbackHandler.html |
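The on_tool_start hook above is where the human-approval gate fires: if the reviewer rejects the proposed tool input, the handler raises and the run aborts. The sketch below mirrors that contract without requiring LangChain: a predicate stands in for the human reviewer, and a local HumanRejectedError stands in for the library's exception. All names in this sketch are illustrative assumptions, not the library's actual implementation.

```python
import asyncio


class HumanRejectedError(Exception):
    """Stand-in for the library's rejection exception."""


# Hypothetical approval predicate: reject any input containing "rm -rf".
def approve(input_str: str) -> bool:
    return "rm -rf" not in input_str


class ApprovalGate:
    """Minimal stand-in for the handler's on_tool_start logic."""

    def __init__(self, approve_fn):
        self._approve = approve_fn

    async def on_tool_start(self, serialized: dict, input_str: str, **kwargs) -> None:
        # Mirror the documented contract: raise when the reviewer rejects the input.
        if not self._approve(input_str):
            raise HumanRejectedError(
                f"Inputs {input_str!r} to tool {serialized} were rejected."
            )


async def demo() -> str:
    gate = ApprovalGate(approve)
    await gate.on_tool_start({"name": "terminal"}, "ls -la")  # approved, no raise
    try:
        await gate.on_tool_start({"name": "terminal"}, "rm -rf /")
    except HumanRejectedError:
        return "rejected"
    return "approved"


print(asyncio.run(demo()))  # rejected
```

The real handler additionally supports a should_check-style filter so only selected tool calls prompt the reviewer; this sketch applies the predicate to every call.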
langchain_community.callbacks.streamlit.streamlit_callback_handler.LLMThoughtState¶
class langchain_community.callbacks.streamlit.streamlit_callback_handler.LLMThoughtState(value)[source]¶
Enumerator of the LLMThought state.
THINKING = 'THINKING'¶
RUNNING_TOOL = 'RUNNING_TOOL'¶
COMPLETE = 'COMPLETE'¶ | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.streamlit.streamlit_callback_handler.LLMThoughtState.html |
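The three members describe the lifecycle of a rendered "thought" in the Streamlit UI. The stand-alone sketch below mirrors that enum and one plausible set of transitions between the states; the transition events ("tool_start", "tool_end", "llm_end") are assumptions for illustration, not the callback handler's actual internals.

```python
from enum import Enum


# Stand-alone mirror of LLMThoughtState's three members (not imported from langchain).
class LLMThoughtState(Enum):
    THINKING = "THINKING"
    RUNNING_TOOL = "RUNNING_TOOL"
    COMPLETE = "COMPLETE"


# Hypothetical lifecycle: a thought starts THINKING, may run a tool, then completes.
def next_state(state: LLMThoughtState, event: str) -> LLMThoughtState:
    transitions = {
        (LLMThoughtState.THINKING, "tool_start"): LLMThoughtState.RUNNING_TOOL,
        (LLMThoughtState.RUNNING_TOOL, "tool_end"): LLMThoughtState.THINKING,
        (LLMThoughtState.THINKING, "llm_end"): LLMThoughtState.COMPLETE,
    }
    # Unknown (state, event) pairs leave the state unchanged.
    return transitions.get((state, event), state)


state = LLMThoughtState.THINKING
state = next_state(state, "tool_start")  # RUNNING_TOOL
state = next_state(state, "tool_end")    # THINKING
state = next_state(state, "llm_end")     # COMPLETE
print(state.value)  # COMPLETE
```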
langchain_community.callbacks.whylabs_callback.WhyLabsCallbackHandler¶
class langchain_community.callbacks.whylabs_callback.WhyLabsCallbackHandler(logger: Logger, handler: Any)[source]¶
Callback Handler for logging to WhyLabs. This callback handler utilizes
langkit to extract features from the prompts & responses when interacting with
an LLM. These features can be used to guardrail, evaluate, and observe interactions
over time to detect issues relating to hallucinations, prompt engineering,
or output validation. LangKit is an LLM monitoring toolkit developed by WhyLabs.
Here are some examples of what can be monitored with LangKit:
* Text Quality
- readability score
- complexity and grade scores
* Text Relevance
- similarity scores between prompt/responses
- similarity scores against user-defined themes
- topic classification
* Security and Privacy
- patterns: count of strings matching a user-defined regex pattern group
- jailbreaks: similarity scores with respect to known jailbreak attempts
- prompt injection: similarity scores with respect to known prompt attacks
- refusals: similarity scores with respect to known LLM refusal responses
* Sentiment and Toxicity
- sentiment analysis
- toxicity analysis
For more information, see https://docs.whylabs.ai/docs/language-model-monitoring
or check out the LangKit repo here: https://github.com/whylabs/langkit
Parameters
api_key (Optional[str]) – WhyLabs API key. Optional because the preferred
way to specify the API key is with environment variable
WHYLABS_API_KEY.
org_id (Optional[str]) – WhyLabs organization id to write profiles to.
Optional because the preferred way to specify the organization id is
with environment variable WHYLABS_DEFAULT_ORG_ID.
dataset_id (Optional[str]) – WhyLabs dataset id to write profiles to.
Optional because the preferred way to specify the dataset id is | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.whylabs_callback.WhyLabsCallbackHandler.html |
with environment variable WHYLABS_DEFAULT_DATASET_ID.
sentiment (bool) – Whether to enable sentiment analysis. Defaults to False.
toxicity (bool) – Whether to enable toxicity analysis. Defaults to False.
themes (bool) – Whether to enable theme analysis. Defaults to False.
logger (Logger) –
handler (Any) –
Initiate the rolling logger.
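The handler wraps a rolling whylogs logger: records accumulate into the current profile, flush() writes that profile out, and close() flushes once more before shutdown. The stand-alone sketch below illustrates that lifecycle with stdlib only; every name in it is hypothetical, since the real handler delegates to whylogs/LangKit.

```python
class RollingProfileLogger:
    """Hypothetical stand-in for the rolling-logger lifecycle used by the handler."""

    def __init__(self) -> None:
        self._profile: list[dict] = []          # records in the current profile
        self._written: list[list[dict]] = []    # profiles already written out
        self.closed = False

    def log(self, record: dict) -> None:
        if self.closed:
            raise RuntimeError("logger is closed")
        self._profile.append(record)

    def flush(self) -> None:
        # Explicitly write the current profile, then start a fresh one.
        if self._profile:
            self._written.append(self._profile)
            self._profile = []

    def close(self) -> None:
        # Write out any remaining profile before exiting.
        self.flush()
        self.closed = True


logger = RollingProfileLogger()
logger.log({"prompt": "hello", "response": "hi"})
logger.flush()
logger.log({"prompt": "bye", "response": "goodbye"})
logger.close()
print(len(logger._written))  # 2
```

In real use the equivalent calls are the handler's own flush() and close() methods documented below, invoked after the LLM interactions you want profiled.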
Attributes
ignore_agent
Whether to ignore agent callbacks.
ignore_chain
Whether to ignore chain callbacks.
ignore_chat_model
Whether to ignore chat model callbacks.
ignore_llm
Whether to ignore LLM callbacks.
ignore_retriever
Whether to ignore retriever callbacks.
ignore_retry
Whether to ignore retry callbacks.
raise_error
run_inline
Methods
__init__(logger, handler)
Initiate the rolling logger.
close()
Close any loggers to allow writing out of any profiles before exiting.
flush()
Explicitly write current profile if using a rolling logger.
from_params(*[, api_key, org_id, ...])
Instantiate whylogs Logger from params.
on_agent_action(action, *, run_id[, ...])
Run on agent action.
on_agent_finish(finish, *, run_id[, ...])
Run on agent end.
on_chain_end(outputs, *, run_id[, parent_run_id])
Run when chain ends running.
on_chain_error(error, *, run_id[, parent_run_id])
Run when chain errors.
on_chain_start(serialized, inputs, *, run_id)
Run when chain starts running.
on_chat_model_start(serialized, messages, *, ...)
Run when a chat model starts running. | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.whylabs_callback.WhyLabsCallbackHandler.html |
on_llm_end(response, *, run_id[, parent_run_id])
Run when LLM ends running.
on_llm_error(error, *, run_id[, parent_run_id])
Run when LLM errors.
on_llm_new_token(token, *[, chunk, ...])
Run on new LLM token.
on_llm_start(serialized, prompts, *, run_id)
Run when LLM starts running.
on_retriever_end(documents, *, run_id[, ...])
Run when Retriever ends running.
on_retriever_error(error, *, run_id[, ...])
Run when Retriever errors.
on_retriever_start(serialized, query, *, run_id)
Run when Retriever starts running.
on_retry(retry_state, *, run_id[, parent_run_id])
Run on a retry event.
on_text(text, *, run_id[, parent_run_id])
Run on arbitrary text.
on_tool_end(output, *, run_id[, parent_run_id])
Run when tool ends running.
on_tool_error(error, *, run_id[, parent_run_id])
Run when tool errors.
on_tool_start(serialized, input_str, *, run_id)
Run when tool starts running.
__init__(logger: Logger, handler: Any)[source]¶
Initiate the rolling logger.
Parameters
logger (Logger) –
handler (Any) – | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.whylabs_callback.WhyLabsCallbackHandler.html |
close() → None[source]¶
Close any loggers to allow writing out of any profiles before exiting.
Return type
None
flush() → None[source]¶
Explicitly write current profile if using a rolling logger.
Return type
None
classmethod from_params(*, api_key: Optional[str] = None, org_id: Optional[str] = None, dataset_id: Optional[str] = None, sentiment: bool = False, toxicity: bool = False, themes: bool = False, logger: Optional[Logger] = None) → WhyLabsCallbackHandler[source]¶
Instantiate whylogs Logger from params.
Parameters
api_key (Optional[str]) – WhyLabs API key. Optional because the preferred
way to specify the API key is with environment variable
WHYLABS_API_KEY.
org_id (Optional[str]) – WhyLabs organization id to write profiles to.
If not set must be specified in environment variable
WHYLABS_DEFAULT_ORG_ID.
dataset_id (Optional[str]) – The model or dataset this callback is gathering
telemetry for. If not set must be specified in environment variable
WHYLABS_DEFAULT_DATASET_ID.
sentiment (bool) – If True will initialize a model to perform
sentiment analysis compound score. Defaults to False and will not gather
this metric.
toxicity (bool) – If True will initialize a model to score
toxicity. Defaults to False and will not gather this metric.
themes (bool) – If True will initialize a model to calculate
distance to configured themes. Defaults to None and will not gather this
metric.
logger (Optional[Logger]) – If specified will bind the configured logger as
the telemetry gathering agent. Defaults to LangKit schema with periodic
WhyLabs writer.
Return type | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.whylabs_callback.WhyLabsCallbackHandler.html |
WhyLabsCallbackHandler
on_agent_action(action: AgentAction, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run on agent action.
Parameters
action (AgentAction) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_agent_finish(finish: AgentFinish, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run on agent end.
Parameters
finish (AgentFinish) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_chain_end(outputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when chain ends running.
Parameters
outputs (Dict[str, Any]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_chain_error(error: BaseException, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when chain errors.
Parameters
error (BaseException) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.whylabs_callback.WhyLabsCallbackHandler.html |
on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when chain starts running.
Parameters
serialized (Dict[str, Any]) –
inputs (Dict[str, Any]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when a chat model starts running.
ATTENTION: This method is called for chat models. If you’re implementing a handler for a non-chat model, you should use on_llm_start instead.
Parameters
serialized (Dict[str, Any]) –
messages (List[List[BaseMessage]]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
on_llm_end(response: LLMResult, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when LLM ends running.
Parameters
response (LLMResult) – | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.whylabs_callback.WhyLabsCallbackHandler.html |
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_llm_error(error: BaseException, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when LLM errors.
Parameters
error (BaseException) – The error that occurred.
kwargs (Any) – Additional keyword arguments.
response (LLMResult): The response which was generated before the error occurred.
run_id (UUID) –
parent_run_id (Optional[UUID]) –
Return type
Any
on_llm_new_token(token: str, *, chunk: Optional[Union[GenerationChunk, ChatGenerationChunk]] = None, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run on new LLM token. Only available when streaming is enabled.
Parameters
token (str) – The new token.
chunk (GenerationChunk | ChatGenerationChunk) – The new generated chunk, containing content and other information.
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_llm_start(serialized: Dict[str, Any], prompts: List[str], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when LLM starts running. | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.whylabs_callback.WhyLabsCallbackHandler.html |
ATTENTION: This method is called for non-chat models (regular LLMs). If you’re implementing a handler for a chat model, you should use on_chat_model_start instead.
Parameters
serialized (Dict[str, Any]) –
prompts (List[str]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
on_retriever_end(documents: Sequence[Document], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when Retriever ends running.
Parameters
documents (Sequence[Document]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_retriever_error(error: BaseException, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when Retriever errors.
Parameters
error (BaseException) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when Retriever starts running.
Parameters
serialized (Dict[str, Any]) –
query (str) –
run_id (UUID) – | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.whylabs_callback.WhyLabsCallbackHandler.html |
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
on_retry(retry_state: RetryCallState, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run on a retry event.
Parameters
retry_state (RetryCallState) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_text(text: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run on arbitrary text.
Parameters
text (str) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_tool_end(output: Any, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when tool ends running.
Parameters
output (Any) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_tool_error(error: BaseException, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when tool errors.
Parameters
error (BaseException) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.whylabs_callback.WhyLabsCallbackHandler.html |
on_tool_start(serialized: Dict[str, Any], input_str: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inputs: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when tool starts running.
Parameters
serialized (Dict[str, Any]) –
input_str (str) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
inputs (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
Examples using WhyLabsCallbackHandler¶
WhyLabs | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.whylabs_callback.WhyLabsCallbackHandler.html |
langchain_core.callbacks.manager.CallbackManagerForRetrieverRun¶
class langchain_core.callbacks.manager.CallbackManagerForRetrieverRun(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None)[source]¶
Callback manager for retriever run.
Initialize the run manager.
Parameters
run_id (UUID) – The ID of the run.
handlers (List[BaseCallbackHandler]) – The list of handlers.
inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers.
parent_run_id (UUID, optional) – The ID of the parent run.
Defaults to None.
tags (Optional[List[str]]) – The list of tags.
inheritable_tags (Optional[List[str]]) – The list of inheritable tags.
metadata (Optional[Dict[str, Any]]) – The metadata.
inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata.
Methods
__init__(*, run_id, handlers, ...[, ...])
Initialize the run manager.
get_child([tag])
Get a child callback manager.
get_noop_manager()
Return a manager that doesn't perform any operations.
on_retriever_end(documents, **kwargs)
Run when retriever ends running.
on_retriever_error(error, **kwargs)
Run when retriever errors.
on_retry(retry_state, **kwargs)
Run on a retry event.
on_text(text, **kwargs)
Run when text is received. | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.manager.CallbackManagerForRetrieverRun.html |
__init__(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None) → None¶
Initialize the run manager.
Parameters
run_id (UUID) – The ID of the run.
handlers (List[BaseCallbackHandler]) – The list of handlers.
inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers.
parent_run_id (UUID, optional) – The ID of the parent run.
Defaults to None.
tags (Optional[List[str]]) – The list of tags.
inheritable_tags (Optional[List[str]]) – The list of inheritable tags.
metadata (Optional[Dict[str, Any]]) – The metadata.
inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata.
Return type
None
get_child(tag: Optional[str] = None) → CallbackManager¶
Get a child callback manager.
Parameters
tag (str, optional) – The tag for the child callback manager.
Defaults to None.
Returns
The child callback manager.
Return type
CallbackManager
classmethod get_noop_manager() → BRM¶
Return a manager that doesn’t perform any operations.
Returns
The noop manager.
Return type
BaseRunManager
on_retriever_end(documents: Sequence[Document], **kwargs: Any) → None[source]¶
Run when retriever ends running.
Parameters
documents (Sequence[Document]) –
kwargs (Any) –
Return type
None | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.manager.CallbackManagerForRetrieverRun.html |
on_retriever_error(error: BaseException, **kwargs: Any) → None[source]¶
Run when retriever errors.
Parameters
error (BaseException) –
kwargs (Any) –
Return type
None
on_retry(retry_state: RetryCallState, **kwargs: Any) → None¶
Run on a retry event.
Parameters
retry_state (RetryCallState) –
kwargs (Any) –
Return type
None
on_text(text: str, **kwargs: Any) → Any¶
Run when text is received.
Parameters
text (str) – The received text.
kwargs (Any) –
Returns
The result of the callback.
Return type
Any
Examples using CallbackManagerForRetrieverRun¶
How to add scores to retriever results
How to create a custom Retriever | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.manager.CallbackManagerForRetrieverRun.html |
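In the custom-retriever pattern referenced above, the retrieval method receives a run manager and reports results through on_retriever_end (or failures through on_retriever_error). The sketch below stands in for that flow with a stub manager so it runs without LangChain; both classes and the corpus are hypothetical.

```python
from typing import Any, List, Sequence


# Stub standing in for CallbackManagerForRetrieverRun: it only records
# which callbacks fired and with what payload.
class StubRetrieverRunManager:
    def __init__(self) -> None:
        self.events: List[tuple] = []

    def on_retriever_end(self, documents: Sequence[Any], **kwargs: Any) -> None:
        self.events.append(("end", list(documents)))

    def on_retriever_error(self, error: BaseException, **kwargs: Any) -> None:
        self.events.append(("error", error))


# Hypothetical custom retriever: the retrieval method takes the run manager
# and notifies it when retrieval finishes, mirroring the documented pattern.
class KeywordRetriever:
    def __init__(self, corpus: List[str]) -> None:
        self.corpus = corpus

    def _get_relevant_documents(self, query: str, *, run_manager) -> List[str]:
        try:
            docs = [d for d in self.corpus if query.lower() in d.lower()]
        except Exception as e:
            run_manager.on_retriever_error(e)
            raise
        run_manager.on_retriever_end(docs)
        return docs


manager = StubRetrieverRunManager()
retriever = KeywordRetriever(["Callbacks in LangChain", "Retrievers return documents"])
docs = retriever._get_relevant_documents("retrievers", run_manager=manager)
print(docs)  # ['Retrievers return documents']
```

A real implementation would subclass BaseRetriever and return Document objects; the callback flow, however, is the same.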
langchain_core.callbacks.manager.AsyncCallbackManager¶
class langchain_core.callbacks.manager.AsyncCallbackManager(handlers: List[BaseCallbackHandler], inheritable_handlers: Optional[List[BaseCallbackHandler]] = None, parent_run_id: Optional[UUID] = None, *, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None)[source]¶
Async callback manager that handles callbacks from LangChain.
Initialize callback manager.
Attributes
is_async
Return whether the handler is async.
Methods
__init__(handlers[, inheritable_handlers, ...])
Initialize callback manager.
add_handler(handler[, inherit])
Add a handler to the callback manager.
add_metadata(metadata[, inherit])
add_tags(tags[, inherit])
configure([inheritable_callbacks, ...])
Configure the async callback manager.
copy()
Copy the callback manager.
on_chain_start(serialized, inputs[, run_id])
Run when chain starts running.
on_chat_model_start(serialized, messages[, ...])
Run when LLM starts running.
on_llm_start(serialized, prompts[, run_id])
Run when LLM starts running.
on_retriever_start(serialized, query[, ...])
Run when retriever starts running.
on_tool_start(serialized, input_str[, ...])
Run when tool starts running.
remove_handler(handler)
Remove a handler from the callback manager.
remove_metadata(keys)
remove_tags(tags)
set_handler(handler[, inherit])
Set handler as the only handler on the callback manager.
set_handlers(handlers[, inherit])
Set handlers as the only handlers on the callback manager.
Parameters | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.manager.AsyncCallbackManager.html |
handlers (List[BaseCallbackHandler]) –
inheritable_handlers (Optional[List[BaseCallbackHandler]]) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
inheritable_tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
inheritable_metadata (Optional[Dict[str, Any]]) –
__init__(handlers: List[BaseCallbackHandler], inheritable_handlers: Optional[List[BaseCallbackHandler]] = None, parent_run_id: Optional[UUID] = None, *, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None) → None¶
Initialize callback manager.
Parameters
handlers (List[BaseCallbackHandler]) –
inheritable_handlers (Optional[List[BaseCallbackHandler]]) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
inheritable_tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
inheritable_metadata (Optional[Dict[str, Any]]) –
Return type
None
add_handler(handler: BaseCallbackHandler, inherit: bool = True) → None¶
Add a handler to the callback manager.
Parameters
handler (BaseCallbackHandler) –
inherit (bool) –
Return type
None
add_metadata(metadata: Dict[str, Any], inherit: bool = True) → None¶
Parameters
metadata (Dict[str, Any]) –
inherit (bool) –
Return type
None
add_tags(tags: List[str], inherit: bool = True) → None¶
Parameters
tags (List[str]) – | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.manager.AsyncCallbackManager.html |
inherit (bool) –
Return type
None
classmethod configure(inheritable_callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, local_callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, verbose: bool = False, inheritable_tags: Optional[List[str]] = None, local_tags: Optional[List[str]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None, local_metadata: Optional[Dict[str, Any]] = None) → AsyncCallbackManager[source]¶
Configure the async callback manager.
Parameters
inheritable_callbacks (Optional[Callbacks], optional) – The inheritable
callbacks. Defaults to None.
local_callbacks (Optional[Callbacks], optional) – The local callbacks.
Defaults to None.
verbose (bool, optional) – Whether to enable verbose mode. Defaults to False.
inheritable_tags (Optional[List[str]], optional) – The inheritable tags.
Defaults to None.
local_tags (Optional[List[str]], optional) – The local tags.
Defaults to None.
inheritable_metadata (Optional[Dict[str, Any]], optional) – The inheritable
metadata. Defaults to None.
local_metadata (Optional[Dict[str, Any]], optional) – The local metadata.
Defaults to None.
Returns
The configured async callback manager.
Return type
AsyncCallbackManager
copy() → T¶
Copy the callback manager.
Parameters
self (T) –
Return type
T
async on_chain_start(serialized: Dict[str, Any], inputs: Union[Dict[str, Any], Any], run_id: Optional[UUID] = None, **kwargs: Any) → AsyncCallbackManagerForChainRun[source]¶
Run when chain starts running.
Parameters
serialized (Dict[str, Any]) – The serialized chain. | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.manager.AsyncCallbackManager.html |
inputs (Union[Dict[str, Any], Any]) – The inputs to the chain.
run_id (UUID, optional) – The ID of the run. Defaults to None.
kwargs (Any) –
Returns
The async callback manager for the chain run.
Return type
AsyncCallbackManagerForChainRun
async on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], run_id: Optional[UUID] = None, **kwargs: Any) → List[AsyncCallbackManagerForLLMRun][source]¶
Run when LLM starts running.
Parameters
serialized (Dict[str, Any]) – The serialized LLM.
messages (List[List[BaseMessage]]) – The list of messages.
run_id (UUID, optional) – The ID of the run. Defaults to None.
kwargs (Any) –
Returns
The list of async callback managers, one for each LLM Run
corresponding to each inner message list.
Return type
List[AsyncCallbackManagerForLLMRun]
async on_llm_start(serialized: Dict[str, Any], prompts: List[str], run_id: Optional[UUID] = None, **kwargs: Any) → List[AsyncCallbackManagerForLLMRun][source]¶
Run when LLM starts running.
Parameters
serialized (Dict[str, Any]) – The serialized LLM.
prompts (List[str]) – The list of prompts.
run_id (UUID, optional) – The ID of the run. Defaults to None.
kwargs (Any) –
Returns
The list of async callback managers, one for each LLM Run corresponding
to each prompt.
Return type
List[AsyncCallbackManagerForLLMRun] | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.manager.AsyncCallbackManager.html |
async on_retriever_start(serialized: Dict[str, Any], query: str, run_id: Optional[UUID] = None, parent_run_id: Optional[UUID] = None, **kwargs: Any) → AsyncCallbackManagerForRetrieverRun[source]¶
Run when retriever starts running.
Parameters
serialized (Dict[str, Any]) –
query (str) –
run_id (Optional[UUID]) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
AsyncCallbackManagerForRetrieverRun
async on_tool_start(serialized: Dict[str, Any], input_str: str, run_id: Optional[UUID] = None, parent_run_id: Optional[UUID] = None, **kwargs: Any) → AsyncCallbackManagerForToolRun[source]¶
Run when tool starts running.
Parameters
serialized (Dict[str, Any]) – The serialized tool.
input_str (str) – The input to the tool.
run_id (UUID, optional) – The ID of the run. Defaults to None.
parent_run_id (UUID, optional) – The ID of the parent run.
Defaults to None.
kwargs (Any) –
Returns
The async callback manager for the tool run.
Return type
AsyncCallbackManagerForToolRun
remove_handler(handler: BaseCallbackHandler) → None¶
Remove a handler from the callback manager.
Parameters
handler (BaseCallbackHandler) –
Return type
None
remove_metadata(keys: List[str]) → None¶
Parameters
keys (List[str]) –
Return type
None
remove_tags(tags: List[str]) → None¶
Parameters
tags (List[str]) –
Return type
None | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.manager.AsyncCallbackManager.html |
set_handler(handler: BaseCallbackHandler, inherit: bool = True) → None¶
Set handler as the only handler on the callback manager.
Parameters
handler (BaseCallbackHandler) –
inherit (bool) –
Return type
None
set_handlers(handlers: List[BaseCallbackHandler], inherit: bool = True) → None¶
Set handlers as the only handlers on the callback manager.
Parameters
handlers (List[BaseCallbackHandler]) –
inherit (bool) –
Return type
None
langchain_community.callbacks.flyte_callback.FlyteCallbackHandler¶
class langchain_community.callbacks.flyte_callback.FlyteCallbackHandler[source]¶
Callback handler that is used within a Flyte task.
Initialize callback handler.
Attributes
always_verbose
Whether to call verbose callbacks even if verbose is False.
ignore_agent
Whether to ignore agent callbacks.
ignore_chain
Whether to ignore chain callbacks.
ignore_chat_model
Whether to ignore chat model callbacks.
ignore_llm
Whether to ignore LLM callbacks.
ignore_retriever
Whether to ignore retriever callbacks.
ignore_retry
Whether to ignore retry callbacks.
raise_error
run_inline
Methods
__init__()
Initialize callback handler.
get_custom_callback_meta()
on_agent_action(action, **kwargs)
Run on agent action.
on_agent_finish(finish, **kwargs)
Run when agent ends running.
on_chain_end(outputs, **kwargs)
Run when chain ends running.
on_chain_error(error, **kwargs)
Run when chain errors.
on_chain_start(serialized, inputs, **kwargs)
Run when chain starts running.
on_chat_model_start(serialized, messages, *, ...)
Run when a chat model starts running.
on_llm_end(response, **kwargs)
Run when LLM ends running.
on_llm_error(error, **kwargs)
Run when LLM errors.
on_llm_new_token(token, **kwargs)
Run when LLM generates a new token.
on_llm_start(serialized, prompts, **kwargs)
Run when LLM starts.
on_retriever_end(documents, *, run_id[, ...])
Run when Retriever ends running.
on_retriever_error(error, *, run_id[, ...])
Run when Retriever errors.
on_retriever_start(serialized, query, *, run_id)
Run when Retriever starts running.
on_retry(retry_state, *, run_id[, parent_run_id])
Run on a retry event.
on_text(text, **kwargs)
Run when agent is ending.
on_tool_end(output, **kwargs)
Run when tool ends running.
on_tool_error(error, **kwargs)
Run when tool errors.
on_tool_start(serialized, input_str, **kwargs)
Run when tool starts running.
reset_callback_meta()
Reset the callback metadata.
__init__() → None[source]¶
Initialize callback handler.
Return type
None
get_custom_callback_meta() → Dict[str, Any]¶
Return type
Dict[str, Any]
on_agent_action(action: AgentAction, **kwargs: Any) → Any[source]¶
Run on agent action.
Parameters
action (AgentAction) –
kwargs (Any) –
Return type
Any
on_agent_finish(finish: AgentFinish, **kwargs: Any) → None[source]¶
Run when agent ends running.
Parameters
finish (AgentFinish) –
kwargs (Any) –
Return type
None
on_chain_end(outputs: Dict[str, Any], **kwargs: Any) → None[source]¶
Run when chain ends running.
Parameters
outputs (Dict[str, Any]) –
kwargs (Any) –
Return type
None
on_chain_error(error: BaseException, **kwargs: Any) → None[source]¶
Run when chain errors.
Parameters
error (BaseException) –
kwargs (Any) –
Return type
None
on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], **kwargs: Any) → None[source]¶
Run when chain starts running.
Parameters
serialized (Dict[str, Any]) –
inputs (Dict[str, Any]) –
kwargs (Any) –
Return type
None
on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when a chat model starts running.
ATTENTION: This method is called for chat models. If you’re implementing a handler for a non-chat model, you should use on_llm_start instead.
Parameters
serialized (Dict[str, Any]) –
messages (List[List[BaseMessage]]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
on_llm_end(response: LLMResult, **kwargs: Any) → None[source]¶
Run when LLM ends running.
Parameters
response (LLMResult) –
kwargs (Any) –
Return type
None
on_llm_error(error: BaseException, **kwargs: Any) → None[source]¶
Run when LLM errors.
Parameters
error (BaseException) –
kwargs (Any) –
Return type
None
on_llm_new_token(token: str, **kwargs: Any) → None[source]¶
Run when LLM generates a new token.
Parameters
token (str) –
kwargs (Any) –
Return type
None
on_llm_start(serialized: Dict[str, Any], prompts: List[str], **kwargs: Any) → None[source]¶
Run when LLM starts.
Parameters
serialized (Dict[str, Any]) –
prompts (List[str]) –
kwargs (Any) –
Return type
None
on_retriever_end(documents: Sequence[Document], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when Retriever ends running.
Parameters
documents (Sequence[Document]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_retriever_error(error: BaseException, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when Retriever errors.
Parameters
error (BaseException) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when Retriever starts running.
Parameters
serialized (Dict[str, Any]) –
query (str) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
on_retry(retry_state: RetryCallState, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run on a retry event.
Parameters
retry_state (RetryCallState) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_text(text: str, **kwargs: Any) → None[source]¶
Run when agent is ending.
Parameters
text (str) –
kwargs (Any) –
Return type
None
on_tool_end(output: str, **kwargs: Any) → None[source]¶
Run when tool ends running.
Parameters
output (str) –
kwargs (Any) –
Return type
None
on_tool_error(error: BaseException, **kwargs: Any) → None[source]¶
Run when tool errors.
Parameters
error (BaseException) –
kwargs (Any) –
Return type
None
on_tool_start(serialized: Dict[str, Any], input_str: str, **kwargs: Any) → None[source]¶
Run when tool starts running.
Parameters
serialized (Dict[str, Any]) –
input_str (str) –
kwargs (Any) –
Return type
None
reset_callback_meta() → None¶
Reset the callback metadata.
Return type
None
Examples using FlyteCallbackHandler¶
Flyte
langchain_core.callbacks.manager.atrace_as_chain_group¶
langchain_core.callbacks.manager.atrace_as_chain_group(group_name: str, callback_manager: Optional[AsyncCallbackManager] = None, *, inputs: Optional[Dict[str, Any]] = None, project_name: Optional[str] = None, example_id: Optional[Union[str, UUID]] = None, run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None) → AsyncGenerator[AsyncCallbackManagerForChainGroup, None][source]¶
Get an async callback manager for a chain group in a context manager.
Useful for grouping different async calls together as a single run even if
they aren’t composed in a single chain.
Parameters
group_name (str) – The name of the chain group.
callback_manager (AsyncCallbackManager, optional) – The async callback manager to use,
which manages tracing and other callback behavior.
project_name (str, optional) – The name of the project.
Defaults to None.
example_id (str or UUID, optional) – The ID of the example.
Defaults to None.
run_id (UUID, optional) – The ID of the run.
tags (List[str], optional) – The inheritable tags to apply to all runs.
Defaults to None.
metadata (Dict[str, Any], optional) – The metadata to apply to all runs.
Defaults to None.
inputs (Optional[Dict[str, Any]]) –
Returns
The async callback manager for the chain group.
Return type
AsyncCallbackManager
Note: must have LANGCHAIN_TRACING_V2 env var set to true to see the trace in LangSmith.
Example
llm_input = "Foo"
async with atrace_as_chain_group("group_name", inputs={"input": llm_input}) as manager:
    # Use the async callback manager for the chain group
    res = await llm.ainvoke(llm_input, {"callbacks": manager})
    await manager.on_chain_end({"output": res})
langchain_core.callbacks.manager.BaseRunManager¶
class langchain_core.callbacks.manager.BaseRunManager(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None)[source]¶
Base class for run manager (a bound callback manager).
Initialize the run manager.
Parameters
run_id (UUID) – The ID of the run.
handlers (List[BaseCallbackHandler]) – The list of handlers.
inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers.
parent_run_id (UUID, optional) – The ID of the parent run.
Defaults to None.
tags (Optional[List[str]]) – The list of tags.
inheritable_tags (Optional[List[str]]) – The list of inheritable tags.
metadata (Optional[Dict[str, Any]]) – The metadata.
inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata.
Methods
__init__(*, run_id, handlers, ...[, ...])
Initialize the run manager.
get_noop_manager()
Return a manager that doesn't perform any operations.
on_retry(retry_state, *, run_id[, parent_run_id])
Run on a retry event.
on_text(text, *, run_id[, parent_run_id])
Run on arbitrary text.
__init__(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None) → None[source]¶
Initialize the run manager.
Parameters
run_id (UUID) – The ID of the run.
handlers (List[BaseCallbackHandler]) – The list of handlers.
inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers.
parent_run_id (UUID, optional) – The ID of the parent run.
Defaults to None.
tags (Optional[List[str]]) – The list of tags.
inheritable_tags (Optional[List[str]]) – The list of inheritable tags.
metadata (Optional[Dict[str, Any]]) – The metadata.
inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata.
Return type
None
classmethod get_noop_manager() → BRM[source]¶
Return a manager that doesn’t perform any operations.
Returns
The noop manager.
Return type
BaseRunManager
on_retry(retry_state: RetryCallState, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run on a retry event.
Parameters
retry_state (RetryCallState) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_text(text: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run on arbitrary text.
Parameters
text (str) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
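The run-manager pattern described above — a run ID bound to a list of handlers, with each event fanned out to every handler — can be sketched in plain Python. MiniRunManager and CollectingHandler below are illustrative stand-ins under assumed names, not LangChain classes:

```python
from dataclasses import dataclass, field
from typing import Any, List
from uuid import UUID, uuid4


@dataclass
class MiniRunManager:
    """Illustrative stand-in for BaseRunManager: a run ID bound to handlers."""
    run_id: UUID
    handlers: List[Any] = field(default_factory=list)
    tags: List[str] = field(default_factory=list)

    def on_text(self, text: str, **kwargs: Any) -> None:
        # Fan the event out to every handler that implements on_text.
        for handler in self.handlers:
            hook = getattr(handler, "on_text", None)
            if hook is not None:
                hook(text, run_id=self.run_id, **kwargs)


class CollectingHandler:
    """Records every text event it receives."""
    def __init__(self) -> None:
        self.seen: List[str] = []

    def on_text(self, text: str, **kwargs: Any) -> None:
        self.seen.append(text)


handler = CollectingHandler()
manager = MiniRunManager(run_id=uuid4(), handlers=[handler])
manager.on_text("hello")
```

The real BaseRunManager additionally tracks inheritable handlers, tags, and metadata so they can be passed down to child runs.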
langchain_community.callbacks.tracers.comet.import_comet_llm_api¶
langchain_community.callbacks.tracers.comet.import_comet_llm_api() → SimpleNamespace[source]¶
Import comet_llm api and raise an error if it is not installed.
Return type
SimpleNamespace
langchain_core.callbacks.base.BaseCallbackManager¶
class langchain_core.callbacks.base.BaseCallbackManager(handlers: List[BaseCallbackHandler], inheritable_handlers: Optional[List[BaseCallbackHandler]] = None, parent_run_id: Optional[UUID] = None, *, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None)[source]¶
Base callback manager that handles callbacks from LangChain.
Initialize callback manager.
Attributes
is_async
Whether the callback manager is async.
Methods
__init__(handlers[, inheritable_handlers, ...])
Initialize callback manager.
add_handler(handler[, inherit])
Add a handler to the callback manager.
add_metadata(metadata[, inherit])
add_tags(tags[, inherit])
copy()
Copy the callback manager.
on_chain_start(serialized, inputs, *, run_id)
Run when chain starts running.
on_chat_model_start(serialized, messages, *, ...)
Run when a chat model starts running.
on_llm_start(serialized, prompts, *, run_id)
Run when LLM starts running.
on_retriever_start(serialized, query, *, run_id)
Run when Retriever starts running.
on_tool_start(serialized, input_str, *, run_id)
Run when tool starts running.
remove_handler(handler)
Remove a handler from the callback manager.
remove_metadata(keys)
remove_tags(tags)
set_handler(handler[, inherit])
Set handler as the only handler on the callback manager.
set_handlers(handlers[, inherit])
Set handlers as the only handlers on the callback manager.
Parameters
handlers (List[BaseCallbackHandler]) –
inheritable_handlers (Optional[List[BaseCallbackHandler]]) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
inheritable_tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
inheritable_metadata (Optional[Dict[str, Any]]) –
__init__(handlers: List[BaseCallbackHandler], inheritable_handlers: Optional[List[BaseCallbackHandler]] = None, parent_run_id: Optional[UUID] = None, *, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None) → None[source]¶
Initialize callback manager.
Parameters
handlers (List[BaseCallbackHandler]) –
inheritable_handlers (Optional[List[BaseCallbackHandler]]) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
inheritable_tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
inheritable_metadata (Optional[Dict[str, Any]]) –
Return type
None
add_handler(handler: BaseCallbackHandler, inherit: bool = True) → None[source]¶
Add a handler to the callback manager.
Parameters
handler (BaseCallbackHandler) –
inherit (bool) –
Return type
None
add_metadata(metadata: Dict[str, Any], inherit: bool = True) → None[source]¶
Parameters
metadata (Dict[str, Any]) –
inherit (bool) –
Return type
None
add_tags(tags: List[str], inherit: bool = True) → None[source]¶
Parameters
tags (List[str]) –
inherit (bool) –
Return type
None
copy() → T[source]¶
Copy the callback manager.
Parameters
self (T) –
Return type
T
on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when chain starts running.
Parameters
serialized (Dict[str, Any]) –
inputs (Dict[str, Any]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when a chat model starts running.
ATTENTION: This method is called for chat models. If you’re implementing a handler for a non-chat model, you should use on_llm_start instead.
Parameters
serialized (Dict[str, Any]) –
messages (List[List[BaseMessage]]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
on_llm_start(serialized: Dict[str, Any], prompts: List[str], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when LLM starts running.
ATTENTION: This method is called for non-chat models (regular LLMs). If you’re implementing a handler for a chat model, you should use on_chat_model_start instead.
Parameters
serialized (Dict[str, Any]) –
prompts (List[str]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when Retriever starts running.
Parameters
serialized (Dict[str, Any]) –
query (str) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
on_tool_start(serialized: Dict[str, Any], input_str: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inputs: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when tool starts running.
Parameters
serialized (Dict[str, Any]) –
input_str (str) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
inputs (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
remove_handler(handler: BaseCallbackHandler) → None[source]¶
Remove a handler from the callback manager.
Parameters
handler (BaseCallbackHandler) –
Return type
None
remove_metadata(keys: List[str]) → None[source]¶
Parameters
keys (List[str]) –
Return type
None
remove_tags(tags: List[str]) → None[source]¶
Parameters
tags (List[str]) –
Return type
None
set_handler(handler: BaseCallbackHandler, inherit: bool = True) → None[source]¶
Set handler as the only handler on the callback manager.
Parameters
handler (BaseCallbackHandler) –
inherit (bool) –
Return type
None
set_handlers(handlers: List[BaseCallbackHandler], inherit: bool = True) → None[source]¶
Set handlers as the only handlers on the callback manager.
Parameters
handlers (List[BaseCallbackHandler]) –
inherit (bool) –
Return type
None
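The add_handler/set_handler semantics above — a handler is always registered locally, and additionally marked inheritable for child runs when inherit=True — can be sketched as follows. MiniCallbackManager is an illustrative stand-in, not the LangChain implementation:

```python
from typing import List, Optional


class MiniCallbackManager:
    """Illustrative sketch of BaseCallbackManager's handler bookkeeping."""

    def __init__(
        self,
        handlers: List[object],
        inheritable_handlers: Optional[List[object]] = None,
    ) -> None:
        self.handlers = list(handlers)
        self.inheritable_handlers = list(inheritable_handlers or [])

    def add_handler(self, handler: object, inherit: bool = True) -> None:
        # Always registered locally; propagated to child runs only
        # when inherit=True.
        if handler not in self.handlers:
            self.handlers.append(handler)
        if inherit and handler not in self.inheritable_handlers:
            self.inheritable_handlers.append(handler)

    def set_handler(self, handler: object, inherit: bool = True) -> None:
        # Replace all existing handlers with a single handler.
        self.handlers = []
        self.inheritable_handlers = []
        self.add_handler(handler, inherit)


local_only = object()
shared = object()
mgr = MiniCallbackManager(handlers=[])
mgr.add_handler(shared, inherit=True)
mgr.add_handler(local_only, inherit=False)
```

In this sketch, `shared` ends up in both lists while `local_only` stays local, mirroring how inheritable handlers flow down to child run managers.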
langchain_community.callbacks.mlflow_callback.get_text_complexity_metrics¶
langchain_community.callbacks.mlflow_callback.get_text_complexity_metrics() → List[str][source]¶
Get the text complexity metrics from textstat.
Return type
List[str]
langchain_community.callbacks.utils.load_json¶
langchain_community.callbacks.utils.load_json(json_path: Union[str, Path]) → str[source]¶
Load json file to a string.
Parameters
json_path (Union[str, Path]) – The path to the json file.
Returns
The string representation of the json file.
Return type
(str)
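As a rough sketch of what load_json does — read a JSON file from a str or Path and return its raw text — here is an illustrative equivalent. load_json_sketch is a hypothetical stand-in, not the library function, and the real helper may differ in details such as encoding handling:

```python
import json
import os
import tempfile
from pathlib import Path
from typing import Union


def load_json_sketch(json_path: Union[str, Path]) -> str:
    """Illustrative equivalent of load_json: return the raw contents
    of a JSON file as a string."""
    return Path(json_path).read_text(encoding="utf-8")


# Round-trip demo against a temporary file.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"prompt": "Foo"}, f)
    tmp_path = f.name

raw = load_json_sketch(tmp_path)
os.remove(tmp_path)
```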
langchain_community.callbacks.uptrain_callback.import_uptrain¶
langchain_community.callbacks.uptrain_callback.import_uptrain() → Any[source]¶
Import the uptrain package.
Return type
Any
langchain_core.callbacks.manager.AsyncCallbackManagerForChainRun¶
class langchain_core.callbacks.manager.AsyncCallbackManagerForChainRun(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None)[source]¶
Async callback manager for chain run.
Initialize the run manager.
Parameters
run_id (UUID) – The ID of the run.
handlers (List[BaseCallbackHandler]) – The list of handlers.
inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers.
parent_run_id (UUID, optional) – The ID of the parent run.
Defaults to None.
tags (Optional[List[str]]) – The list of tags.
inheritable_tags (Optional[List[str]]) – The list of inheritable tags.
metadata (Optional[Dict[str, Any]]) – The metadata.
inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata.
Methods
__init__(*, run_id, handlers, ...[, ...])
Initialize the run manager.
get_child([tag])
Get a child callback manager.
get_noop_manager()
Return a manager that doesn't perform any operations.
get_sync()
Get the equivalent sync RunManager.
on_agent_action(action, **kwargs)
Run when agent action is received.
on_agent_finish(finish, **kwargs)
Run when agent finish is received.
on_chain_end(outputs, **kwargs)
Run when chain ends running.
on_chain_error(error, **kwargs)
Run when chain errors.
on_retry(retry_state, **kwargs)
Run on a retry event.
on_text(text, **kwargs)
Run when text is received.
__init__(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None) → None¶
Initialize the run manager.
Parameters
run_id (UUID) – The ID of the run.
handlers (List[BaseCallbackHandler]) – The list of handlers.
inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers.
parent_run_id (UUID, optional) – The ID of the parent run.
Defaults to None.
tags (Optional[List[str]]) – The list of tags.
inheritable_tags (Optional[List[str]]) – The list of inheritable tags.
metadata (Optional[Dict[str, Any]]) – The metadata.
inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata.
Return type
None
get_child(tag: Optional[str] = None) → AsyncCallbackManager¶
Get a child callback manager.
Parameters
tag (str, optional) – The tag for the child callback manager.
Defaults to None.
Returns
The child callback manager.
Return type
AsyncCallbackManager
classmethod get_noop_manager() → BRM¶
Return a manager that doesn’t perform any operations.
Returns
The noop manager.
Return type
BaseRunManager
get_sync() → CallbackManagerForChainRun[source]¶
Get the equivalent sync RunManager.
Returns
The sync RunManager.
Return type
CallbackManagerForChainRun
async on_agent_action(action: AgentAction, **kwargs: Any) → Any[source]¶
Run when agent action is received.
Parameters
action (AgentAction) – The agent action.
kwargs (Any) –
Returns
The result of the callback.
Return type
Any
async on_agent_finish(finish: AgentFinish, **kwargs: Any) → Any[source]¶
Run when agent finish is received.
Parameters
finish (AgentFinish) – The agent finish.
kwargs (Any) –
Returns
The result of the callback.
Return type
Any
async on_chain_end(outputs: Union[Dict[str, Any], Any], **kwargs: Any) → None[source]¶
Run when chain ends running.
Parameters
outputs (Union[Dict[str, Any], Any]) – The outputs of the chain.
kwargs (Any) –
Return type
None
async on_chain_error(error: BaseException, **kwargs: Any) → None[source]¶
Run when chain errors.
Parameters
error (Exception or KeyboardInterrupt) – The error.
kwargs (Any) –
Return type
None
async on_retry(retry_state: RetryCallState, **kwargs: Any) → None¶
Run on a retry event.
Parameters
retry_state (RetryCallState) –
kwargs (Any) –
Return type
None
async on_text(text: str, **kwargs: Any) → Any¶
Run when text is received.
Parameters
text (str) – The received text.
kwargs (Any) –
Returns
The result of the callback.
Return type
Any
langchain_community.callbacks.utils.flatten_dict¶
langchain_community.callbacks.utils.flatten_dict(nested_dict: Dict[str, Any], parent_key: str = '', sep: str = '_') → Dict[str, Any][source]¶
Flatten a nested dictionary into a flat dictionary.
Parameters
nested_dict (dict) – The nested dictionary to flatten.
parent_key (str) – The prefix to prepend to the keys of the flattened dict.
sep (str) – The separator to use between the parent key and the key of the
flattened dictionary.
Returns
A flat dictionary.
Return type
(dict)
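The flattening behavior described above can be illustrated with a minimal re-implementation. flatten_dict_sketch is a hypothetical stand-in for the library function, written from the parameter descriptions:

```python
from typing import Any, Dict


def flatten_dict_sketch(
    nested: Dict[str, Any], parent_key: str = "", sep: str = "_"
) -> Dict[str, Any]:
    """Illustrative re-implementation of flatten_dict's semantics:
    nested keys are joined with `sep` into a single flat level."""
    items: Dict[str, Any] = {}
    for key, value in nested.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            # Recurse, carrying the joined key as the new prefix.
            items.update(flatten_dict_sketch(value, new_key, sep))
        else:
            items[new_key] = value
    return items


flat = flatten_dict_sketch({"llm": {"name": "gpt", "params": {"temp": 0.2}}})
```

Flattening like this is what lets callback handlers log nested run metadata as simple key/value pairs.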
langchain_core.callbacks.manager.CallbackManager¶
class langchain_core.callbacks.manager.CallbackManager(handlers: List[BaseCallbackHandler], inheritable_handlers: Optional[List[BaseCallbackHandler]] = None, parent_run_id: Optional[UUID] = None, *, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None)[source]¶
Callback manager that handles callbacks from LangChain.
Initialize callback manager.
Attributes
is_async
Whether the callback manager is async.
Methods
__init__(handlers[, inheritable_handlers, ...])
Initialize callback manager.
add_handler(handler[, inherit])
Add a handler to the callback manager.
add_metadata(metadata[, inherit])
add_tags(tags[, inherit])
configure([inheritable_callbacks, ...])
Configure the callback manager.
copy()
Copy the callback manager.
on_chain_start(serialized, inputs[, run_id])
Run when chain starts running.
on_chat_model_start(serialized, messages[, ...])
Run when LLM starts running.
on_llm_start(serialized, prompts[, run_id])
Run when LLM starts running.
on_retriever_start(serialized, query[, ...])
Run when retriever starts running.
on_tool_start(serialized, input_str[, ...])
Run when tool starts running.
remove_handler(handler)
Remove a handler from the callback manager.
remove_metadata(keys)
remove_tags(tags)
set_handler(handler[, inherit])
Set handler as the only handler on the callback manager.
set_handlers(handlers[, inherit])
Set handlers as the only handlers on the callback manager.
Parameters
handlers (List[BaseCallbackHandler]) –
inheritable_handlers (Optional[List[BaseCallbackHandler]]) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
inheritable_tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
inheritable_metadata (Optional[Dict[str, Any]]) –
__init__(handlers: List[BaseCallbackHandler], inheritable_handlers: Optional[List[BaseCallbackHandler]] = None, parent_run_id: Optional[UUID] = None, *, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None) → None¶
Initialize callback manager.
Parameters
handlers (List[BaseCallbackHandler]) –
inheritable_handlers (Optional[List[BaseCallbackHandler]]) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
inheritable_tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
inheritable_metadata (Optional[Dict[str, Any]]) –
Return type
None
add_handler(handler: BaseCallbackHandler, inherit: bool = True) → None¶
Add a handler to the callback manager.
Parameters
handler (BaseCallbackHandler) –
inherit (bool) –
Return type
None
add_metadata(metadata: Dict[str, Any], inherit: bool = True) → None¶
Parameters
metadata (Dict[str, Any]) –
inherit (bool) –
Return type
None
add_tags(tags: List[str], inherit: bool = True) → None¶
Parameters
tags (List[str]) –
inherit (bool) –
Return type
None
classmethod configure(inheritable_callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, local_callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, verbose: bool = False, inheritable_tags: Optional[List[str]] = None, local_tags: Optional[List[str]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None, local_metadata: Optional[Dict[str, Any]] = None) → CallbackManager[source]¶
Configure the callback manager.
Parameters
inheritable_callbacks (Optional[Callbacks], optional) – The inheritable
callbacks. Defaults to None.
local_callbacks (Optional[Callbacks], optional) – The local callbacks.
Defaults to None.
verbose (bool, optional) – Whether to enable verbose mode. Defaults to False.
inheritable_tags (Optional[List[str]], optional) – The inheritable tags.
Defaults to None.
local_tags (Optional[List[str]], optional) – The local tags.
Defaults to None.
inheritable_metadata (Optional[Dict[str, Any]], optional) – The inheritable
metadata. Defaults to None.
local_metadata (Optional[Dict[str, Any]], optional) – The local metadata.
Defaults to None.
Returns
The configured callback manager.
Return type
CallbackManager
copy() → T¶
Copy the callback manager.
Parameters
self (T) –
Return type
T
on_chain_start(serialized: Dict[str, Any], inputs: Union[Dict[str, Any], Any], run_id: Optional[UUID] = None, **kwargs: Any) → CallbackManagerForChainRun[source]¶
Run when chain starts running.
Parameters
serialized (Dict[str, Any]) – The serialized chain. | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.manager.CallbackManager.html |
inputs (Union[Dict[str, Any], Any]) – The inputs to the chain.
run_id (UUID, optional) – The ID of the run. Defaults to None.
kwargs (Any) –
Returns
The callback manager for the chain run.
Return type
CallbackManagerForChainRun
on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], run_id: Optional[UUID] = None, **kwargs: Any) → List[CallbackManagerForLLMRun][source]¶
Run when LLM starts running.
Parameters
serialized (Dict[str, Any]) – The serialized LLM.
messages (List[List[BaseMessage]]) – The list of messages.
run_id (UUID, optional) – The ID of the run. Defaults to None.
kwargs (Any) –
Returns
A callback manager for each list of messages as an LLM run.
Return type
List[CallbackManagerForLLMRun]
on_llm_start(serialized: Dict[str, Any], prompts: List[str], run_id: Optional[UUID] = None, **kwargs: Any) → List[CallbackManagerForLLMRun][source]¶
Run when LLM starts running.
Parameters
serialized (Dict[str, Any]) – The serialized LLM.
prompts (List[str]) – The list of prompts.
run_id (UUID, optional) – The ID of the run. Defaults to None.
kwargs (Any) –
Returns
A callback manager for each prompt as an LLM run.
Return type
List[CallbackManagerForLLMRun] | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.manager.CallbackManager.html |
on_retriever_start(serialized: Dict[str, Any], query: str, run_id: Optional[UUID] = None, parent_run_id: Optional[UUID] = None, **kwargs: Any) → CallbackManagerForRetrieverRun[source]¶
Run when retriever starts running.
Parameters
serialized (Dict[str, Any]) –
query (str) –
run_id (Optional[UUID]) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
CallbackManagerForRetrieverRun
on_tool_start(serialized: Dict[str, Any], input_str: str, run_id: Optional[UUID] = None, parent_run_id: Optional[UUID] = None, inputs: Optional[Dict[str, Any]] = None, **kwargs: Any) → CallbackManagerForToolRun[source]¶
Run when tool starts running.
Parameters
serialized (Dict[str, Any]) – Serialized representation of the tool.
input_str (str) – The input to the tool as a string.
Non-string inputs are cast to strings.
run_id (Optional[UUID]) – ID for the run. Defaults to None.
parent_run_id (Optional[UUID]) – The ID of the parent run. Defaults to None.
inputs (Optional[Dict[str, Any]]) – The original input to the tool if provided.
Recommended for usage instead of input_str when the original
input is needed.
If provided, the inputs are expected to be formatted as a dict.
The keys will correspond to the named-arguments in the tool.
kwargs (Any) –
Returns
The callback manager for the tool run.
Return type
CallbackManagerForToolRun
remove_handler(handler: BaseCallbackHandler) → None¶
Remove a handler from the callback manager. | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.manager.CallbackManager.html |
Parameters
handler (BaseCallbackHandler) –
Return type
None
remove_metadata(keys: List[str]) → None¶
Remove metadata from the callback manager.
Parameters
keys (List[str]) –
Return type
None
remove_tags(tags: List[str]) → None¶
Remove tags from the callback manager.
Parameters
tags (List[str]) –
Return type
None
set_handler(handler: BaseCallbackHandler, inherit: bool = True) → None¶
Set handler as the only handler on the callback manager.
Parameters
handler (BaseCallbackHandler) –
inherit (bool) –
Return type
None
set_handlers(handlers: List[BaseCallbackHandler], inherit: bool = True) → None¶
Set handlers as the only handlers on the callback manager.
Parameters
handlers (List[BaseCallbackHandler]) –
inherit (bool) –
Return type
None
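The handler-management methods above (add_handler, remove_handler, set_handler, set_handlers) all maintain the two handler lists with the inherit flag controlling whether a handler is also inheritable. A minimal sketch of that bookkeeping, as a stand-in rather than langchain's BaseCallbackManager:

```python
from typing import Any, List


class HandlerRegistry:
    """Sketch of add/remove/set_handlers with an inherit flag
    (illustrative, not langchain's BaseCallbackManager)."""

    def __init__(self) -> None:
        self.handlers: List[Any] = []
        self.inheritable_handlers: List[Any] = []

    def add_handler(self, handler: Any, inherit: bool = True) -> None:
        if handler not in self.handlers:
            self.handlers.append(handler)
        if inherit and handler not in self.inheritable_handlers:
            self.inheritable_handlers.append(handler)

    def remove_handler(self, handler: Any) -> None:
        # Remove the handler from both lists.
        if handler in self.handlers:
            self.handlers.remove(handler)
        if handler in self.inheritable_handlers:
            self.inheritable_handlers.remove(handler)

    def set_handlers(self, handlers: List[Any], inherit: bool = True) -> None:
        # Replace all existing handlers with the given list.
        self.handlers = []
        self.inheritable_handlers = []
        for h in handlers:
            self.add_handler(h, inherit)

    def set_handler(self, handler: Any, inherit: bool = True) -> None:
        # set_handler is set_handlers with a single-element list.
        self.set_handlers([handler], inherit)
```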
Examples using CallbackManager¶
ChatLiteLLM
ChatLiteLLMRouter
GPTRouter
Llama.cpp
Run LLMs locally
Titan Takeoff
ZHIPU AI | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.manager.CallbackManager.html |
langchain_community.callbacks.mlflow_callback.mlflow_callback_metrics¶
langchain_community.callbacks.mlflow_callback.mlflow_callback_metrics() → List[str][source]¶
Get the metrics to log to MLFlow.
Return type
List[str] | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.mlflow_callback.mlflow_callback_metrics.html |
langchain_community.callbacks.labelstudio_callback.LabelStudioMode¶
class langchain_community.callbacks.labelstudio_callback.LabelStudioMode(value)[source]¶
Label Studio mode enumerator.
PROMPT = 'prompt'¶
CHAT = 'chat'¶ | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.labelstudio_callback.LabelStudioMode.html |
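The enumerator maps directly onto Python's Enum; a minimal equivalent sketch, including lookup of a member from its string value:

```python
from enum import Enum


class LabelStudioModeSketch(Enum):
    """Sketch of the Label Studio mode enumerator shown above."""

    PROMPT = "prompt"
    CHAT = "chat"


# A member can be selected from its string value by calling the enum:
mode = LabelStudioModeSketch("chat")
```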
langchain_community.callbacks.tracers.wandb.WandbRunArgs¶
class langchain_community.callbacks.tracers.wandb.WandbRunArgs[source]¶
Arguments for the WandbTracer.
job_type: Optional[str]¶
dir: Optional[StrPath]¶
config: Union[Dict, str, None]¶
project: Optional[str]¶
entity: Optional[str]¶
reinit: Optional[bool]¶
tags: Optional[Sequence]¶
group: Optional[str]¶
name: Optional[str]¶
notes: Optional[str]¶
magic: Optional[Union[dict, str, bool]]¶
config_exclude_keys: Optional[List[str]]¶
config_include_keys: Optional[List[str]]¶
anonymous: Optional[str]¶
mode: Optional[str]¶
allow_val_change: Optional[bool]¶
resume: Optional[Union[bool, str]]¶
force: Optional[bool]¶
tensorboard: Optional[bool]¶
sync_tensorboard: Optional[bool]¶
monitor_gym: Optional[bool]¶
save_code: Optional[bool]¶
id: Optional[str]¶
settings: Union[WBSettings, Dict[str, Any], None]¶ | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.tracers.wandb.WandbRunArgs.html |
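These fields collect keyword arguments that the tracer forwards to a Weights & Biases run. The behaviour can be sketched with a `total=False` TypedDict, so any subset of keys may be supplied; the sketch below covers only a subset of the fields and the TypedDict form is an assumption for illustration:

```python
from typing import Any, Dict, Optional, Sequence, TypedDict, Union


class WandbRunArgsSketch(TypedDict, total=False):
    """Subset sketch of WandbRunArgs; every key is optional (total=False)."""

    job_type: Optional[str]
    project: Optional[str]
    entity: Optional[str]
    tags: Optional[Sequence]
    name: Optional[str]
    notes: Optional[str]
    id: Optional[str]
    config: Union[Dict, str, None]


# Only the keys you need have to be present:
args: WandbRunArgsSketch = {"project": "my-project", "tags": ["langchain"]}
```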
langchain_community.callbacks.arthur_callback.ArthurCallbackHandler¶
class langchain_community.callbacks.arthur_callback.ArthurCallbackHandler(arthur_model: ArthurModel)[source]¶
Callback Handler that logs to Arthur platform.
Arthur helps enterprise teams optimize model operations
and performance at scale. The Arthur API tracks model
performance, explainability, and fairness across tabular,
NLP, and CV models. Our API is model- and platform-agnostic,
and continuously scales with complex and dynamic enterprise needs.
To learn more about Arthur, visit our website at
https://www.arthur.ai/ or read the Arthur docs at
https://docs.arthur.ai/
Initialize callback handler.
Attributes
ignore_agent
Whether to ignore agent callbacks.
ignore_chain
Whether to ignore chain callbacks.
ignore_chat_model
Whether to ignore chat model callbacks.
ignore_llm
Whether to ignore LLM callbacks.
ignore_retriever
Whether to ignore retriever callbacks.
ignore_retry
Whether to ignore retry callbacks.
raise_error
run_inline
Methods
__init__(arthur_model)
Initialize callback handler.
from_credentials(model_id[, arthur_url, ...])
Initialize callback handler from Arthur credentials.
on_agent_action(action, **kwargs)
Do nothing when agent takes a specific action.
on_agent_finish(finish, **kwargs)
Do nothing
on_chain_end(outputs, **kwargs)
On chain end, do nothing.
on_chain_error(error, **kwargs)
Do nothing when LLM chain outputs an error.
on_chain_start(serialized, inputs, **kwargs)
On chain start, do nothing.
on_chat_model_start(serialized, messages, *, ...)
Run when a chat model starts running.
on_llm_end(response, **kwargs)
On LLM end, send data to Arthur. | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.arthur_callback.ArthurCallbackHandler.html |
on_llm_error(error, **kwargs)
Do nothing when LLM outputs an error.
on_llm_new_token(token, **kwargs)
On new token, pass.
on_llm_start(serialized, prompts, **kwargs)
On LLM start, save the input prompts
on_retriever_end(documents, *, run_id[, ...])
Run when Retriever ends running.
on_retriever_error(error, *, run_id[, ...])
Run when Retriever errors.
on_retriever_start(serialized, query, *, run_id)
Run when Retriever starts running.
on_retry(retry_state, *, run_id[, parent_run_id])
Run on a retry event.
on_text(text, **kwargs)
Do nothing
on_tool_end(output[, observation_prefix, ...])
Do nothing when tool ends.
on_tool_error(error, **kwargs)
Do nothing when tool outputs an error.
on_tool_start(serialized, input_str, **kwargs)
Do nothing when tool starts.
Parameters
arthur_model (ArthurModel) –
__init__(arthur_model: ArthurModel) → None[source]¶
Initialize callback handler.
Parameters
arthur_model (ArthurModel) –
Return type
None
classmethod from_credentials(model_id: str, arthur_url: Optional[str] = 'https://app.arthur.ai', arthur_login: Optional[str] = None, arthur_password: Optional[str] = None) → ArthurCallbackHandler[source]¶
Initialize callback handler from Arthur credentials.
Parameters
model_id (str) – The ID of the arthur model to log to. | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.arthur_callback.ArthurCallbackHandler.html |
arthur_url (str, optional) – The URL of the Arthur instance to log to.
Defaults to “https://app.arthur.ai”.
arthur_login (str, optional) – The login to use to connect to Arthur.
Defaults to None.
arthur_password (str, optional) – The password to use to connect to
Arthur. Defaults to None.
Returns
The initialized callback handler.
Return type
ArthurCallbackHandler
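from_credentials follows the classmethod-factory pattern: authenticate against a default (overridable) URL, then return a constructed handler. The sketch below shows that pattern only — it makes no real Arthur connection, and the `HandlerSketch` class is hypothetical:

```python
from typing import Optional


class HandlerSketch:
    """Stand-in showing the from_credentials factory pattern
    (no real Arthur connection is made here)."""

    def __init__(self, model: str) -> None:
        self.model = model

    @classmethod
    def from_credentials(
        cls,
        model_id: str,
        arthur_url: Optional[str] = "https://app.arthur.ai",
        arthur_login: Optional[str] = None,
        arthur_password: Optional[str] = None,
    ) -> "HandlerSketch":
        # The real handler authenticates against arthur_url and retrieves
        # the model; here we just record the target for illustration.
        target = f"{arthur_url}/models/{model_id}"
        return cls(model=target)
```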
on_agent_action(action: AgentAction, **kwargs: Any) → Any[source]¶
Do nothing when agent takes a specific action.
Parameters
action (AgentAction) –
kwargs (Any) –
Return type
Any
on_agent_finish(finish: AgentFinish, **kwargs: Any) → None[source]¶
Do nothing.
Parameters
finish (AgentFinish) –
kwargs (Any) –
Return type
None
on_chain_end(outputs: Dict[str, Any], **kwargs: Any) → None[source]¶
On chain end, do nothing.
Parameters
outputs (Dict[str, Any]) –
kwargs (Any) –
Return type
None
on_chain_error(error: BaseException, **kwargs: Any) → None[source]¶
Do nothing when LLM chain outputs an error.
Parameters
error (BaseException) –
kwargs (Any) –
Return type
None
on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], **kwargs: Any) → None[source]¶
On chain start, do nothing.
Parameters
serialized (Dict[str, Any]) –
inputs (Dict[str, Any]) –
kwargs (Any) –
Return type
None | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.arthur_callback.ArthurCallbackHandler.html |
on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when a chat model starts running.
ATTENTION: This method is called for chat models. If you’re implementing a handler for a non-chat model, you should use on_llm_start instead.
Parameters
serialized (Dict[str, Any]) –
messages (List[List[BaseMessage]]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
on_llm_end(response: LLMResult, **kwargs: Any) → None[source]¶
On LLM end, send data to Arthur.
Parameters
response (LLMResult) –
kwargs (Any) –
Return type
None
on_llm_error(error: BaseException, **kwargs: Any) → None[source]¶
Do nothing when LLM outputs an error.
Parameters
error (BaseException) –
kwargs (Any) –
Return type
None
on_llm_new_token(token: str, **kwargs: Any) → None[source]¶
On new token, pass.
Parameters
token (str) –
kwargs (Any) –
Return type
None
on_llm_start(serialized: Dict[str, Any], prompts: List[str], **kwargs: Any) → None[source]¶
On LLM start, save the input prompts.
Parameters | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.arthur_callback.ArthurCallbackHandler.html |
serialized (Dict[str, Any]) –
prompts (List[str]) –
kwargs (Any) –
Return type
None
on_retriever_end(documents: Sequence[Document], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when Retriever ends running.
Parameters
documents (Sequence[Document]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_retriever_error(error: BaseException, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when Retriever errors.
Parameters
error (BaseException) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when Retriever starts running.
Parameters
serialized (Dict[str, Any]) –
query (str) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
on_retry(retry_state: RetryCallState, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶ | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.arthur_callback.ArthurCallbackHandler.html |
Run on a retry event.
Parameters
retry_state (RetryCallState) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_text(text: str, **kwargs: Any) → None[source]¶
Do nothing.
Parameters
text (str) –
kwargs (Any) –
Return type
None
on_tool_end(output: Any, observation_prefix: Optional[str] = None, llm_prefix: Optional[str] = None, **kwargs: Any) → None[source]¶
Do nothing when tool ends.
Parameters
output (Any) –
observation_prefix (Optional[str]) –
llm_prefix (Optional[str]) –
kwargs (Any) –
Return type
None
on_tool_error(error: BaseException, **kwargs: Any) → None[source]¶
Do nothing when tool outputs an error.
Parameters
error (BaseException) –
kwargs (Any) –
Return type
None
on_tool_start(serialized: Dict[str, Any], input_str: str, **kwargs: Any) → None[source]¶
Do nothing when tool starts.
Parameters
serialized (Dict[str, Any]) –
input_str (str) –
kwargs (Any) –
Return type
None
Examples using ArthurCallbackHandler¶
Arthur | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.arthur_callback.ArthurCallbackHandler.html |
langchain_core.callbacks.streaming_stdout.StreamingStdOutCallbackHandler¶
class langchain_core.callbacks.streaming_stdout.StreamingStdOutCallbackHandler[source]¶
Callback handler for streaming. Only works with LLMs that support streaming.
Attributes
ignore_agent
Whether to ignore agent callbacks.
ignore_chain
Whether to ignore chain callbacks.
ignore_chat_model
Whether to ignore chat model callbacks.
ignore_llm
Whether to ignore LLM callbacks.
ignore_retriever
Whether to ignore retriever callbacks.
ignore_retry
Whether to ignore retry callbacks.
raise_error
run_inline
Methods
__init__()
on_agent_action(action, **kwargs)
Run on agent action.
on_agent_finish(finish, **kwargs)
Run on agent end.
on_chain_end(outputs, **kwargs)
Run when chain ends running.
on_chain_error(error, **kwargs)
Run when chain errors.
on_chain_start(serialized, inputs, **kwargs)
Run when chain starts running.
on_chat_model_start(serialized, messages, ...)
Run when LLM starts running.
on_llm_end(response, **kwargs)
Run when LLM ends running.
on_llm_error(error, **kwargs)
Run when LLM errors.
on_llm_new_token(token, **kwargs)
Run on new LLM token.
on_llm_start(serialized, prompts, **kwargs)
Run when LLM starts running.
on_retriever_end(documents, *, run_id[, ...])
Run when Retriever ends running.
on_retriever_error(error, *, run_id[, ...])
Run when Retriever errors.
on_retriever_start(serialized, query, *, run_id)
Run when Retriever starts running. | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.streaming_stdout.StreamingStdOutCallbackHandler.html |
on_retry(retry_state, *, run_id[, parent_run_id])
Run on a retry event.
on_text(text, **kwargs)
Run on arbitrary text.
on_tool_end(output, **kwargs)
Run when tool ends running.
on_tool_error(error, **kwargs)
Run when tool errors.
on_tool_start(serialized, input_str, **kwargs)
Run when tool starts running.
__init__()¶
on_agent_action(action: AgentAction, **kwargs: Any) → Any[source]¶
Run on agent action.
Parameters
action (AgentAction) –
kwargs (Any) –
Return type
Any
on_agent_finish(finish: AgentFinish, **kwargs: Any) → None[source]¶
Run on agent end.
Parameters
finish (AgentFinish) –
kwargs (Any) –
Return type
None
on_chain_end(outputs: Dict[str, Any], **kwargs: Any) → None[source]¶
Run when chain ends running.
Parameters
outputs (Dict[str, Any]) –
kwargs (Any) –
Return type
None
on_chain_error(error: BaseException, **kwargs: Any) → None[source]¶
Run when chain errors.
Parameters
error (BaseException) –
kwargs (Any) –
Return type
None
on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], **kwargs: Any) → None[source]¶
Run when chain starts running.
Parameters
serialized (Dict[str, Any]) –
inputs (Dict[str, Any]) –
kwargs (Any) –
Return type
None | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.streaming_stdout.StreamingStdOutCallbackHandler.html |
on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], **kwargs: Any) → None[source]¶
Run when LLM starts running.
Parameters
serialized (Dict[str, Any]) –
messages (List[List[BaseMessage]]) –
kwargs (Any) –
Return type
None
on_llm_end(response: LLMResult, **kwargs: Any) → None[source]¶
Run when LLM ends running.
Parameters
response (LLMResult) –
kwargs (Any) –
Return type
None
on_llm_error(error: BaseException, **kwargs: Any) → None[source]¶
Run when LLM errors.
Parameters
error (BaseException) –
kwargs (Any) –
Return type
None
on_llm_new_token(token: str, **kwargs: Any) → None[source]¶
Run on new LLM token. Only available when streaming is enabled.
Parameters
token (str) –
kwargs (Any) –
Return type
None
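on_llm_new_token is where the streaming happens: each token is written straight to stdout as it arrives. A minimal sketch of that behaviour (a stand-in, not the langchain class):

```python
import sys
from typing import Any


class StdOutTokenSketch:
    """Minimal sketch of a streaming-to-stdout token callback."""

    def on_llm_new_token(self, token: str, **kwargs: Any) -> None:
        # Write and flush each token immediately so output streams live
        # instead of waiting for the full completion.
        sys.stdout.write(token)
        sys.stdout.flush()


handler = StdOutTokenSketch()
for tok in ["Hello", ", ", "world"]:
    handler.on_llm_new_token(tok)
```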
on_llm_start(serialized: Dict[str, Any], prompts: List[str], **kwargs: Any) → None[source]¶
Run when LLM starts running.
Parameters
serialized (Dict[str, Any]) –
prompts (List[str]) –
kwargs (Any) –
Return type
None
on_retriever_end(documents: Sequence[Document], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when Retriever ends running.
Parameters
documents (Sequence[Document]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.streaming_stdout.StreamingStdOutCallbackHandler.html |
Any
on_retriever_error(error: BaseException, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when Retriever errors.
Parameters
error (BaseException) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when Retriever starts running.
Parameters
serialized (Dict[str, Any]) –
query (str) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
on_retry(retry_state: RetryCallState, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run on a retry event.
Parameters
retry_state (RetryCallState) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_text(text: str, **kwargs: Any) → None[source]¶
Run on arbitrary text.
Parameters
text (str) –
kwargs (Any) –
Return type
None
on_tool_end(output: Any, **kwargs: Any) → None[source]¶
Run when tool ends running.
Parameters | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.streaming_stdout.StreamingStdOutCallbackHandler.html |
output (Any) –
kwargs (Any) –
Return type
None
on_tool_error(error: BaseException, **kwargs: Any) → None[source]¶
Run when tool errors.
Parameters
error (BaseException) –
kwargs (Any) –
Return type
None
on_tool_start(serialized: Dict[str, Any], input_str: str, **kwargs: Any) → None[source]¶
Run when tool starts running.
Parameters
serialized (Dict[str, Any]) –
input_str (str) –
kwargs (Any) –
Return type
None
Examples using StreamingStdOutCallbackHandler¶
Arthur
Bedrock
C Transformers
Chat Over Documents with Vectara
ChatEverlyAI
ChatLiteLLM
ChatLiteLLMRouter
DeepInfra
Eden AI
ExLlamaV2
GPT4All
GPTRouter
Huggingface Endpoints
Llama.cpp
Replicate
Run LLMs locally
TextGen
Titan Takeoff
Yuan2.0
ZHIPU AI | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.streaming_stdout.StreamingStdOutCallbackHandler.html |
langchain_community.callbacks.flyte_callback.import_flytekit¶
langchain_community.callbacks.flyte_callback.import_flytekit() → Tuple[flytekit, renderer][source]¶
Import flytekit and flytekitplugins-deck-standard.
Return type
Tuple[flytekit, renderer] | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.flyte_callback.import_flytekit.html |
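import_flytekit follows the common guarded-optional-import pattern: attempt the import and, on failure, raise an error that names the package to install. A generalized sketch of that pattern (the helper name is hypothetical):

```python
import importlib
from types import ModuleType


def import_optional(module_name: str, pip_name: str) -> ModuleType:
    """Import a module or raise an actionable error (illustrative pattern)."""
    try:
        return importlib.import_module(module_name)
    except ImportError as exc:
        raise ImportError(
            f"Could not import {module_name}. "
            f"Please install it with `pip install {pip_name}`."
        ) from exc
```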
langchain_core.callbacks.manager.RunManager¶
class langchain_core.callbacks.manager.RunManager(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None)[source]¶
Sync Run Manager.
Initialize the run manager.
Parameters
run_id (UUID) – The ID of the run.
handlers (List[BaseCallbackHandler]) – The list of handlers.
inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers.
parent_run_id (UUID, optional) – The ID of the parent run.
Defaults to None.
tags (Optional[List[str]]) – The list of tags.
inheritable_tags (Optional[List[str]]) – The list of inheritable tags.
metadata (Optional[Dict[str, Any]]) – The metadata.
inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata.
Methods
__init__(*, run_id, handlers, ...[, ...])
Initialize the run manager.
get_noop_manager()
Return a manager that doesn't perform any operations.
on_retry(retry_state, **kwargs)
Run on a retry event.
on_text(text, **kwargs)
Run when text is received. | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.manager.RunManager.html |
__init__(*, run_id: UUID, handlers: List[BaseCallbackHandler], inheritable_handlers: List[BaseCallbackHandler], parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, inheritable_tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, inheritable_metadata: Optional[Dict[str, Any]] = None) → None¶
Initialize the run manager.
Parameters
run_id (UUID) – The ID of the run.
handlers (List[BaseCallbackHandler]) – The list of handlers.
inheritable_handlers (List[BaseCallbackHandler]) – The list of inheritable handlers.
parent_run_id (UUID, optional) – The ID of the parent run.
Defaults to None.
tags (Optional[List[str]]) – The list of tags.
inheritable_tags (Optional[List[str]]) – The list of inheritable tags.
metadata (Optional[Dict[str, Any]]) – The metadata.
inheritable_metadata (Optional[Dict[str, Any]]) – The inheritable metadata.
Return type
None
classmethod get_noop_manager() → BRM¶
Return a manager that doesn’t perform any operations.
Returns
The noop manager.
Return type
BaseRunManager
on_retry(retry_state: RetryCallState, **kwargs: Any) → None[source]¶
Run on a retry event.
Parameters
retry_state (RetryCallState) –
kwargs (Any) –
Return type
None
on_text(text: str, **kwargs: Any) → Any[source]¶
Run when text is received.
Parameters
text (str) – The received text.
kwargs (Any) –
Returns
The result of the callback.
Return type
Any | https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.manager.RunManager.html |
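on_text, like the other run-manager events, fans the call out to every registered handler, and get_noop_manager is simply a manager with no handlers. A sketch of that dispatch loop (a stand-in, not langchain's RunManager):

```python
from typing import Any, List


class RunManagerSketch:
    """Stand-in showing how a run manager dispatches an event to handlers."""

    def __init__(self, handlers: List[Any]) -> None:
        self.handlers = handlers

    def on_text(self, text: str, **kwargs: Any) -> List[Any]:
        # Forward the event to each handler that implements the hook.
        results = []
        for handler in self.handlers:
            hook = getattr(handler, "on_text", None)
            if hook is not None:
                results.append(hook(text, **kwargs))
        return results

    @classmethod
    def get_noop_manager(cls) -> "RunManagerSketch":
        # A manager with no handlers performs no operations.
        return cls(handlers=[])
```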
langchain_community.callbacks.streamlit.mutable_expander.ChildRecord¶
class langchain_community.callbacks.streamlit.mutable_expander.ChildRecord(type: ChildType, kwargs: Dict[str, Any], dg: DeltaGenerator)[source]¶
Child record as a NamedTuple.
Create new instance of ChildRecord(type, kwargs, dg)
Attributes
dg
Alias for field number 2
kwargs
Alias for field number 1
type
Alias for field number 0
Methods
__init__()
count(value, /)
Return number of occurrences of value.
index(value[, start, stop])
Return first index of value.
Parameters
type (ChildType) –
kwargs (Dict[str, Any]) –
dg (DeltaGenerator) –
__init__()¶
count(value, /)¶
Return number of occurrences of value.
index(value, start=0, stop=9223372036854775807, /)¶
Return first index of value.
Raises ValueError if the value is not present. | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.streamlit.mutable_expander.ChildRecord.html |
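Since ChildRecord is a NamedTuple, its fields are both attributes and positional elements, and it inherits tuple methods such as count and index. A sketch with DeltaGenerator replaced by Any so it stays self-contained:

```python
from typing import Any, Dict, NamedTuple


class ChildRecordSketch(NamedTuple):
    """Sketch of the ChildRecord NamedTuple (DeltaGenerator stubbed as Any)."""

    type: str
    kwargs: Dict[str, Any]
    dg: Any


# Fields are addressable by name or by position:
rec = ChildRecordSketch(type="markdown", kwargs={"body": "hi"}, dg=None)
```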
langchain_community.callbacks.manager.get_bedrock_anthropic_callback¶
langchain_community.callbacks.manager.get_bedrock_anthropic_callback() → Generator[BedrockAnthropicTokenUsageCallbackHandler, None, None][source]¶
Get the Bedrock anthropic callback handler in a context manager, which conveniently exposes token and cost information.
Returns
The Bedrock anthropic callback handler.
Return type
BedrockAnthropicTokenUsageCallbackHandler
Example
>>> with get_bedrock_anthropic_callback() as cb:
... # Use the Bedrock anthropic callback handler
Examples using get_bedrock_anthropic_callback¶
How to track token usage in ChatModels | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.manager.get_bedrock_anthropic_callback.html |
langchain_community.callbacks.sagemaker_callback.SageMakerCallbackHandler¶
class langchain_community.callbacks.sagemaker_callback.SageMakerCallbackHandler(run: Any)[source]¶
Callback Handler that logs prompt artifacts and metrics to SageMaker Experiments.
Parameters
run (sagemaker.experiments.run.Run) – Run object where the experiment is logged.
Initialize callback handler.
Attributes
ignore_agent
Whether to ignore agent callbacks.
ignore_chain
Whether to ignore chain callbacks.
ignore_chat_model
Whether to ignore chat model callbacks.
ignore_llm
Whether to ignore LLM callbacks.
ignore_retriever
Whether to ignore retriever callbacks.
ignore_retry
Whether to ignore retry callbacks.
raise_error
run_inline
Methods
__init__(run)
Initialize callback handler.
flush_tracker()
Reset the steps and delete the temporary local directory.
jsonf(data, data_dir, filename[, is_output])
To log the input data as json file artifact.
on_agent_action(action, **kwargs)
Run on agent action.
on_agent_finish(finish, **kwargs)
Run when agent ends running.
on_chain_end(outputs, **kwargs)
Run when chain ends running.
on_chain_error(error, **kwargs)
Run when chain errors.
on_chain_start(serialized, inputs, **kwargs)
Run when chain starts running.
on_chat_model_start(serialized, messages, *, ...)
Run when a chat model starts running.
on_llm_end(response, **kwargs)
Run when LLM ends running.
on_llm_error(error, **kwargs)
Run when LLM errors.
on_llm_new_token(token, **kwargs)
Run when LLM generates a new token.
on_llm_start(serialized, prompts, **kwargs) | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.sagemaker_callback.SageMakerCallbackHandler.html |
Run when LLM starts.
on_retriever_end(documents, *, run_id[, ...])
Run when Retriever ends running.
on_retriever_error(error, *, run_id[, ...])
Run when Retriever errors.
on_retriever_start(serialized, query, *, run_id)
Run when Retriever starts running.
on_retry(retry_state, *, run_id[, parent_run_id])
Run on a retry event.
on_text(text, **kwargs)
Run when agent is ending.
on_tool_end(output, **kwargs)
Run when tool ends running.
on_tool_error(error, **kwargs)
Run when tool errors.
on_tool_start(serialized, input_str, **kwargs)
Run when tool starts running.
__init__(run: Any) → None[source]¶
Initialize callback handler.
Parameters
run (Any) –
Return type
None
flush_tracker() → None[source]¶
Reset the steps and delete the temporary local directory.
Return type
None
jsonf(data: Dict[str, Any], data_dir: str, filename: str, is_output: Optional[bool] = True) → None[source]¶
To log the input data as json file artifact.
Parameters
data (Dict[str, Any]) –
data_dir (str) –
filename (str) –
is_output (Optional[bool]) –
Return type
None
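jsonf serializes a payload to a JSON file artifact under a data directory. A self-contained sketch of that behaviour; the real method additionally logs the file to the SageMaker Run, which is omitted here:

```python
import json
import os
from typing import Any, Dict


def jsonf_sketch(data: Dict[str, Any], data_dir: str, filename: str) -> str:
    """Write `data` as <data_dir>/<filename>.json and return the path
    (illustrative; artifact logging to the Run is omitted)."""
    os.makedirs(data_dir, exist_ok=True)
    path = os.path.join(data_dir, f"{filename}.json")
    with open(path, "w", encoding="utf-8") as f:
        json.dump(data, f, indent=2)
    return path
```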
on_agent_action(action: AgentAction, **kwargs: Any) → Any[source]¶
Run on agent action.
Parameters
action (AgentAction) –
kwargs (Any) –
Return type
Any | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.sagemaker_callback.SageMakerCallbackHandler.html |
on_agent_finish(finish: AgentFinish, **kwargs: Any) → None[source]¶
Run when agent ends running.
Parameters
finish (AgentFinish) –
kwargs (Any) –
Return type
None
on_chain_end(outputs: Dict[str, Any], **kwargs: Any) → None[source]¶
Run when chain ends running.
Parameters
outputs (Dict[str, Any]) –
kwargs (Any) –
Return type
None
on_chain_error(error: BaseException, **kwargs: Any) → None[source]¶
Run when chain errors.
Parameters
error (BaseException) –
kwargs (Any) –
Return type
None
on_chain_start(serialized: Dict[str, Any], inputs: Dict[str, Any], **kwargs: Any) → None[source]¶
Run when chain starts running.
Parameters
serialized (Dict[str, Any]) –
inputs (Dict[str, Any]) –
kwargs (Any) –
Return type
None
on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when a chat model starts running.
ATTENTION: This method is called for chat models. If you’re implementing a handler for a non-chat model, you should use on_llm_start instead.
Parameters
serialized (Dict[str, Any]) –
messages (List[List[BaseMessage]]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) – | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.sagemaker_callback.SageMakerCallbackHandler.html |
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
on_llm_end(response: LLMResult, **kwargs: Any) → None[source]¶
Run when LLM ends running.
Parameters
response (LLMResult) –
kwargs (Any) –
Return type
None
on_llm_error(error: BaseException, **kwargs: Any) → None[source]¶
Run when LLM errors.
Parameters
error (BaseException) –
kwargs (Any) –
Return type
None
on_llm_new_token(token: str, **kwargs: Any) → None[source]¶
Run when LLM generates a new token.
Parameters
token (str) –
kwargs (Any) –
Return type
None
on_llm_start(serialized: Dict[str, Any], prompts: List[str], **kwargs: Any) → None[source]¶
Run when LLM starts.
Parameters
serialized (Dict[str, Any]) –
prompts (List[str]) –
kwargs (Any) –
Return type
None
on_retriever_end(documents: Sequence[Document], *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when Retriever ends running.
Parameters
documents (Sequence[Document]) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_retriever_error(error: BaseException, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run when Retriever errors.
Parameters | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.sagemaker_callback.SageMakerCallbackHandler.html |
error (BaseException) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_retriever_start(serialized: Dict[str, Any], query: str, *, run_id: UUID, parent_run_id: Optional[UUID] = None, tags: Optional[List[str]] = None, metadata: Optional[Dict[str, Any]] = None, **kwargs: Any) → Any¶
Run when Retriever starts running.
Parameters
serialized (Dict[str, Any]) –
query (str) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
tags (Optional[List[str]]) –
metadata (Optional[Dict[str, Any]]) –
kwargs (Any) –
Return type
Any
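The retriever hooks pair up naturally: on_retriever_start receives the query, and on_retriever_end receives the retrieved documents. A sketch that logs each retrieval's query and result count (RetrieverLogger is a hypothetical name, and plain dicts stand in for real Document objects):

```python
from uuid import uuid4


class RetrieverLogger:
    """Hypothetical sketch: record each retrieval's query and result count."""

    def __init__(self) -> None:
        self.log = []

    def on_retriever_start(self, serialized, query, *, run_id, **kwargs):
        # Capture the query as retrieval begins.
        self.log.append(("start", query))

    def on_retriever_end(self, documents, *, run_id, **kwargs):
        # Capture how many documents came back.
        self.log.append(("end", len(documents)))


logger = RetrieverLogger()
logger.on_retriever_start({}, "what does on_retry do?", run_id=uuid4())
logger.on_retriever_end(
    [{"page_content": "doc 1"}, {"page_content": "doc 2"}], run_id=uuid4()
)
print(logger.log)  # [('start', 'what does on_retry do?'), ('end', 2)]
```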
on_retry(retry_state: RetryCallState, *, run_id: UUID, parent_run_id: Optional[UUID] = None, **kwargs: Any) → Any¶
Run on a retry event.
Parameters
retry_state (RetryCallState) –
run_id (UUID) –
parent_run_id (Optional[UUID]) –
kwargs (Any) –
Return type
Any
on_text(text: str, **kwargs: Any) → None[source]¶
Run when agent is ending.
Parameters
text (str) –
kwargs (Any) –
Return type
None
on_tool_end(output: Any, **kwargs: Any) → None[source]¶
Run when tool ends running.
Parameters
output (Any) –
kwargs (Any) –
Return type
None
on_tool_error(error: BaseException, **kwargs: Any) → None[source]¶
Run when tool errors.
Parameters | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.sagemaker_callback.SageMakerCallbackHandler.html |
error (BaseException) –
kwargs (Any) –
Return type
None
on_tool_start(serialized: Dict[str, Any], input_str: str, **kwargs: Any) → None[source]¶
Run when tool starts running.
Parameters
serialized (Dict[str, Any]) –
input_str (str) –
kwargs (Any) –
Return type
None
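Because on_tool_start, on_tool_end, and on_tool_error bracket each tool invocation, they can be used together to time tool calls. A minimal sketch (ToolTimer is a hypothetical name; a real handler would subclass BaseCallbackHandler):

```python
import time


class ToolTimer:
    """Hypothetical sketch: measure wall-clock time of each tool call."""

    def __init__(self) -> None:
        self._started_at = None
        self.durations: list[float] = []

    def on_tool_start(self, serialized, input_str, **kwargs) -> None:
        self._started_at = time.monotonic()

    def on_tool_end(self, output, **kwargs) -> None:
        if self._started_at is not None:
            self.durations.append(time.monotonic() - self._started_at)
            self._started_at = None

    def on_tool_error(self, error, **kwargs) -> None:
        # Discard the timer on failure so a later call doesn't misreport.
        self._started_at = None


timer = ToolTimer()
timer.on_tool_start({}, "2 + 2")
timer.on_tool_end("4")
print(len(timer.durations))  # 1
```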
Examples using SageMakerCallbackHandler¶
AWS
SageMaker Tracking | https://api.python.langchain.com/en/latest/callbacks/langchain_community.callbacks.sagemaker_callback.SageMakerCallbackHandler.html |