Started by Peter Daniels, Sep 22, 2021

Open
Is there a maximum sequence length for the output of a transformer?

0 VIEWS 0 LIKES 0 DISLIKES SHARE

There's just one thing I can't find an answer to: when feeding the output back into the transformer, we process it much like the input (with added masks), so is there also a sequence-length limit on that side?

Even BERT has an input limit of 512 tokens, so transformers are clearly limited in how much they can take in. Is there something that lets the output grow as long as desired, or is there a fixed maximum length?

In case I wasn't clear enough: does the network keep generating words indefinitely until it emits the `<end>` token, or is there a token limit on the output?
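For reference, the decoding loop being asked about is commonly written with both stopping conditions at once: generation ends at an `<end>` token *or* when a hard cap on new tokens is reached, whichever comes first. Below is a minimal sketch; `END_TOKEN`, `toy_next_token`, and the cap value are illustrative assumptions, with a toy stand-in for the transformer forward pass:

```python
END_TOKEN = 0          # hypothetical id for the <end> token
MAX_NEW_TOKENS = 512   # hard cap, analogous to a model's positional limit

def toy_next_token(sequence):
    """Stand-in for a transformer forward pass: returns the next token id.
    This toy version counts down so that <end> is eventually produced."""
    return max(END_TOKEN, 10 - len(sequence))

def generate(prompt, max_new_tokens=MAX_NEW_TOKENS):
    sequence = list(prompt)
    for _ in range(max_new_tokens):   # length cap: the loop cannot run forever
        token = toy_next_token(sequence)
        sequence.append(token)        # feed the output back in as new input
        if token == END_TOKEN:        # early stop on the <end> token
            break
    return sequence

out = generate([5, 4, 3])
```

So in practice the answer is "both": the model is trained to emit an end token, but decoders also enforce a maximum output length as a safety net (and because positional encodings only cover so many positions).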

0 Replies

Developed and maintained by Wikiance