Modified file: `docs/source/en/model_doc/ctrl.md`
<!--Copyright 2020 The HuggingFace Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
specific language governing permissions and limitations under the License.

⚠️ Note that this file is in Markdown but contains specific syntax for our doc-builder (similar to MDX) that may not be
rendered properly in your Markdown viewer.

-->

# CTRL

<div class="flex flex-wrap space-x-1">
<img alt="PyTorch" src="https://img.shields.io/badge/PyTorch-DE3412?style=flat&logo=pytorch&logoColor=white">
<img alt="TensorFlow" src="https://img.shields.io/badge/TensorFlow-FF6F00?style=flat&logo=tensorflow&logoColor=white">
</div>


CTRL (Conditional Transformer Language Model) is a large language model developed by Salesforce Research that enables **controllable text generation**.
What makes it unique is its use of **control codes**, special prefixes such as `Reviews:`, `Books:`, or `Legal:`, that guide the model to produce text in specific domains or styles.
CTRL was proposed in [CTRL: A Conditional Transformer Language Model for Controllable Generation](https://huggingface.co/papers/1909.05858) by Nitish Shirish Keskar*, Bryan McCann*, Lav R. Varshney, Caiming Xiong and
Richard Socher. It is a causal (unidirectional) transformer pretrained with a language modeling objective on ~140 GB of text, with the first token of every sequence reserved as a control code (such as Links, Books, or Wikipedia).

CTRL was trained on a large corpus of structured data sources, including Wikipedia, web data, Amazon reviews, and more.

The abstract from the paper is the following:

*Large-scale language models show promising text generation capabilities, but users cannot easily control particular
aspects of the generated text. We release CTRL, a 1.63 billion-parameter conditional transformer language model,
trained to condition on control codes that govern style, content, and task-specific behavior. Control codes were
derived from structure that naturally co-occurs with raw text, preserving the advantages of unsupervised learning while
providing more explicit control over text generation. These codes also allow CTRL to predict which parts of the
training data are most likely given a sequence. This provides a potential method for analyzing large amounts of data
via model-based source attribution.*
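
As a concrete illustration of the source attribution idea mentioned in the abstract, the sketch below scores the same sentence under a few candidate control codes and treats the lowest language modeling loss as the most likely source domain. The candidate codes and the sentence here are illustrative, not prescriptive.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("ctrl")
model = AutoModelForCausalLM.from_pretrained("ctrl")
model.eval()

text = "The acting was wooden and the plot made no sense."
for code in ["Reviews", "Books", "Wikipedia"]:  # illustrative subset of control codes
    input_ids = tokenizer(f"{code} {text}", return_tensors="pt").input_ids
    with torch.no_grad():
        loss = model(input_ids, labels=input_ids).loss  # mean negative log-likelihood
    print(f"{code}: {loss.item():.2f}")  # lower loss = more likely source domain
```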

You can find all the original CTRL checkpoints under the [CTRL model page on Hugging Face](https://huggingface.co/ctrl).

This model was contributed by [keskarnitishr](https://huggingface.co/keskarnitishr). The original code can be found
[here](https://github.com/salesforce/ctrl).

> [!TIP]
> Click on the [CTRL](https://huggingface.co/ctrl) model in the right sidebar for more examples of how to apply CTRL to different text generation tasks.

## Usage

CTRL uses control codes to steer generation: start the prompt with a control code such as `Reviews:`, `Books:`, or `Links` so that the model produces coherent text in the corresponding domain. The examples below show how to generate text with [`Pipeline`], the `AutoModel` classes, and the command line; further tips are collected in the [Notes](#notes) section.
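
To see which domains are available, you can inspect the tokenizer's control-code mapping. This is a quick check, assuming the `control_codes` attribute exposed by [`CTRLTokenizer`] in your installed version:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("ctrl")
print(sorted(tokenizer.control_codes.keys()))  # e.g. 'Books', 'Links', 'Reviews', 'Wikipedia', ...
```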

<hfoptions id="usage">
<hfoption id="Pipeline">

```python
from transformers import pipeline

generator = pipeline("text-generation", model="ctrl")
output = generator("Reviews: This product was", max_length=50, do_sample=True)
print(output[0]["generated_text"])
```

</hfoption>
<hfoption id="AutoModel">

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("ctrl")
model = AutoModelForCausalLM.from_pretrained("ctrl")

inputs = tokenizer("Books: Once upon a time", return_tensors="pt")
outputs = model.generate(**inputs, max_length=50, do_sample=True)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

</hfoption>
<hfoption id="transformers-cli">

```bash
# pipe the prompt to the run command (drop --device 0 to run on CPU)
echo -e "Legal: The contract states" | transformers-cli run --task text-generation --model ctrl --device 0
```

</hfoption>
<hfoption id="Quantization">

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig

# 8-bit quantization requires the bitsandbytes library
quantization_config = BitsAndBytesConfig(load_in_8bit=True)

tokenizer = AutoTokenizer.from_pretrained("ctrl")
model = AutoModelForCausalLM.from_pretrained(
    "ctrl",
    quantization_config=quantization_config,
    device_map="auto",
)
```

</hfoption>
</hfoptions>

<!-- The attention visualizer section is omitted because CTRL does not currently support attention mask visualization. -->
## Notes

- CTRL relies on **control codes** to guide generation toward specific domains such as reviews, books, or legal text. Starting the prompt with an appropriate prefix such as `Books:` or `Reviews:` is crucial for coherent output; refer to the [original implementation](https://github.com/salesforce/ctrl) for the full list of codes.
- CTRL uses absolute position embeddings, so it is usually advised to pad the inputs on the right rather than the left.
- CTRL was trained with a causal language modeling (CLM) objective and is therefore powerful at predicting the next token in a sequence.
- The PyTorch models accept `past_key_values`, the previously computed key/value attention pairs, as input; the TensorFlow models accept `past`. Reusing these values prevents the model from recomputing attention over tokens it has already processed during generation. See the [`forward`](model_doc/ctrl#transformers.CTRLModel.forward) method for more information, and the sketch after this list for an example.
- CTRL is **not compatible** with attention visualization tools.
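
The following is a minimal sketch of token-by-token greedy decoding that reuses `past_key_values` so each step only processes the newest token; the prompt and generation length are illustrative.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("ctrl")
model = AutoModelForCausalLM.from_pretrained("ctrl")

inputs = tokenizer("Reviews: This product was", return_tensors="pt")
input_ids = inputs["input_ids"]
past_key_values = None
generated = input_ids

for _ in range(20):  # generate 20 tokens greedily
    with torch.no_grad():
        outputs = model(input_ids=input_ids, past_key_values=past_key_values, use_cache=True)
    # cached keys/values cover the earlier tokens, so only the new token is fed next step
    past_key_values = outputs.past_key_values
    next_token = outputs.logits[:, -1, :].argmax(dim=-1, keepdim=True)
    generated = torch.cat([generated, next_token], dim=-1)
    input_ids = next_token

print(tokenizer.decode(generated[0], skip_special_tokens=True))
```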
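
[`CTRLForSequenceClassification`] classifies with the last non-padding token, as other causal models like GPT-2 do. Below is a minimal sketch with a hypothetical two-label setup; the classification head is randomly initialized here and would need fine-tuning before its predictions mean anything.

```python
import torch
from transformers import AutoTokenizer, CTRLForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("ctrl")
# num_labels=2 is an illustrative choice, not part of the pretrained checkpoint
model = CTRLForSequenceClassification.from_pretrained("ctrl", num_labels=2)

inputs = tokenizer("Reviews: This product was terrible", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # classification uses the last token's hidden state
print(logits.softmax(dim=-1))
```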

## Resources

- [Text classification task guide](../tasks/sequence_classification)
- [Causal language modeling task guide](../tasks/language_modeling)
- [CTRL paper (ArXiv)](https://arxiv.org/abs/1909.05858)
- [Salesforce CTRL GitHub](https://github.com/salesforce/ctrl)
- [CTRL on Hugging Face](https://huggingface.co/ctrl)

## CTRLConfig

[[autodoc]] CTRLConfig

## CTRLTokenizer

[[autodoc]] CTRLTokenizer
    - save_vocabulary

<frameworkcontent>
<pt>

## CTRLModel

[[autodoc]] CTRLModel
    - forward

## CTRLLMHeadModel

[[autodoc]] CTRLLMHeadModel
    - forward

## CTRLForSequenceClassification

[[autodoc]] CTRLForSequenceClassification
    - forward

</pt>
<tf>

## TFCTRLModel

[[autodoc]] TFCTRLModel
    - call

## TFCTRLLMHeadModel

[[autodoc]] TFCTRLLMHeadModel
    - call

## TFCTRLForSequenceClassification

[[autodoc]] TFCTRLForSequenceClassification
    - call

</tf>
</frameworkcontent>
New file: `docs/source/en/model_doc/ctrl.rst`
CTRL
====

.. autoclass:: transformers.CTRLConfig
    :members:
    :undoc-members:

.. autoclass:: transformers.CTRLForSequenceClassification
    :members:
    :undoc-members:

.. autoclass:: transformers.CTRLLMHeadModel
    :members:
    :undoc-members:

.. autoclass:: transformers.CTRLModel
    :members:
    :undoc-members:

.. autoclass:: transformers.CTRLTokenizer
    :members:
    :undoc-members:

.. autoclass:: transformers.TFCTRLForSequenceClassification
    :members:
    :undoc-members:

.. autoclass:: transformers.TFCTRLLMHeadModel
    :members:
    :undoc-members:

.. autoclass:: transformers.TFCTRLModel
    :members:
    :undoc-members: