🤗 Transformers integration API reference
API reference for the callback and utilities of the Neptune-🤗 Transformers integration.
You can log model training metadata by passing a Neptune callback to the Trainer callbacks.
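A minimal sketch of this setup is shown below; the model, dataset, and training arguments (`model`, `train_dataset`, `training_args`) are placeholders, and the callback reads your credentials from the `NEPTUNE_API_TOKEN` and `NEPTUNE_PROJECT` environment variables:

from transformers import Trainer
from transformers.integrations import NeptuneCallback

trainer = Trainer(
    model=model,  # placeholder: your model
    args=training_args,  # placeholder: your TrainingArguments (with report_to="none")
    train_dataset=train_dataset,  # placeholder: your training dataset
    callbacks=[NeptuneCallback()],  # credentials are read from environment variables
)
trainer.train()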
NeptuneCallback()
Creates a Neptune callback that you pass to the `callbacks` argument of the `Trainer` constructor.
Parameters
| Name | Type | Default | Description |
|---|---|---|---|
| api_token | str, optional | None | Neptune API token obtained upon registration. You can leave this argument out if you have saved your token to the `NEPTUNE_API_TOKEN` environment variable (strongly recommended). |
| project | str, optional | None | Name of an existing Neptune project, in the form `"workspace-name/project-name"`. In Neptune, you can copy the name from the project settings → Details & privacy. If `None`, the value of the `NEPTUNE_PROJECT` environment variable is used. If you just want to try logging anonymously, you can use the public project `"common/huggingface-integration"`. |
| name | str, optional | None | Custom name for the run. |
| base_namespace | str, optional | "finetuning" | In the Neptune run, the root namespace (folder) that will contain all of the logged metadata. |
| run | Run, optional | None | Pass a Neptune run object if you want to continue logging to an existing run. See Resume a run and Pass object between functions. |
| log_parameters | bool, optional | True | Log all Trainer arguments and model parameters provided by the Trainer. |
| log_checkpoints | str, optional | None | If `"same"`, uploads checkpoints whenever they are saved by the Trainer. If `"last"`, uploads only the most recently saved checkpoint. If `"best"`, uploads the best checkpoint (among the ones saved by the Trainer). If `None`, does not upload checkpoints. |
| **neptune_run_kwargs | (optional) | - | Additional keyword arguments to be passed directly to the `init_run()` function when a new run is created. |
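For example, to continue logging to an existing run, you can open the run with the Neptune client and pass it to the callback through the `run` argument. This is a sketch with a placeholder project name and run ID; it assumes the `neptune` package's `init_run()` function and its `with_id` argument for resuming:

import neptune
from transformers.integrations import NeptuneCallback

# Resume an existing run by its ID (placeholder values)
run = neptune.init_run(
    project="workspace-name/project-name",
    with_id="NLI-7",
)

# The callback continues logging to the resumed run
neptune_callback = NeptuneCallback(run=run)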
Example
from transformers import Trainer, TrainingArguments
from transformers.integrations import NeptuneCallback

# Create the Neptune callback
neptune_callback = NeptuneCallback(
    name="timid-rhino",
    description="DistilBERT fine-tuned on GLUE/MRPC",
    tags=["args-callback", "fine-tune", "MRPC"],  # tags help you manage runs
    base_namespace="custom_name",  # the default is "finetuning"
    log_checkpoints="best",  # other options are "last", "same", and None
    capture_hardware_metrics=False,  # additional kwargs for the Neptune run
)

# Create the training arguments
training_args = TrainingArguments(
    "quick-training-distilbert-mrpc",
    evaluation_strategy="steps",
    eval_steps=20,
    report_to="none",  # avoids creating several callbacks (see the note below)
)

# Pass the Neptune callback to the Trainer
trainer = Trainer(
    model,  # your pretrained or initialized model
    training_args,
    callbacks=[neptune_callback],
)
trainer.train()
To avoid creating several callbacks, set the `report_to` argument to `"none"`.
get_run()
Returns the current Neptune run.
Parameters
| Name | Type | Default | Description |
|---|---|---|---|
| trainer | transformers.Trainer | - | The transformers Trainer instance. |
Returns
`Run` object that can be used for logging.
Examples
trainer.train()
# Logging additional metadata after training
run = NeptuneCallback.get_run(trainer)
run["predictions"] = ...
See also Logging additional metadata after training in the Neptune-Transformers integration guide.
.run
If you have a `NeptuneCallback` instance available, you can access the Neptune run with the `.run` property:
trainer.train()
# Logging additional metadata after training
neptune_callback.run["predictions"] = ...
See also
`NeptuneCallback` in the 🤗 Transformers API reference