.. role:: hidden
    :class: hidden-section

.. module:: fairseq.models

.. _Models:

Models
======

A Model defines the neural network's ``forward()`` method and encapsulates all
of the learnable parameters in the network. Each model also provides a set of
named *architectures* that define the precise network configuration (e.g.,
embedding dimension, number of layers, etc.).

Both the model type and architecture are selected via the ``--arch``
command-line argument. Once selected, a model may expose additional command-line
arguments for further configuration.
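
The set of model types and named architectures accepted by ``--arch`` is built
up by the registration decorators described under "Adding new models" below. As
a minimal sketch (assuming the module-level ``MODEL_REGISTRY`` and
``ARCH_MODEL_REGISTRY`` dictionaries that those decorators populate; these are
internal details and may differ between versions), the available choices can be
listed with:

.. code-block:: python

    from fairseq.models import ARCH_MODEL_REGISTRY, MODEL_REGISTRY

    # Model types, e.g. 'fconv', 'lstm', 'transformer'.
    print(sorted(MODEL_REGISTRY.keys()))

    # Named architectures selectable via --arch, e.g. 'transformer_iwslt_de_en'.
    print(sorted(ARCH_MODEL_REGISTRY.keys()))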

.. note::

    All fairseq Models extend :class:`BaseFairseqModel`, which in turn extends
    :class:`torch.nn.Module`. Thus any fairseq Model can be used as a
    stand-alone Module in other PyTorch code.
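
For illustration, the sketch below defines a hypothetical ``ToyModel`` (not a
real fairseq model) and uses it exactly like any other PyTorch module:

.. code-block:: python

    import torch
    import torch.nn as nn

    from fairseq.models import BaseFairseqModel

    # BaseFairseqModel subclasses nn.Module, so fairseq models work with
    # ordinary PyTorch machinery (optimizers, .cuda(), nn.Sequential, ...).
    assert issubclass(BaseFairseqModel, nn.Module)

    class ToyModel(BaseFairseqModel):
        """Hypothetical toy model, for illustration only."""

        def __init__(self):
            super().__init__()
            self.proj = nn.Linear(8, 8)

        def forward(self, x):
            return self.proj(x)

    model = ToyModel()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    out = model(torch.randn(2, 8))  # a plain forward pass, as with any Module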

Convolutional Neural Networks (CNN)
-----------------------------------

.. module:: fairseq.models.fconv

.. autoclass:: fairseq.models.fconv.FConvModel
    :members:

.. autoclass:: fairseq.models.fconv.FConvEncoder
    :members:
    :undoc-members:

.. autoclass:: fairseq.models.fconv.FConvDecoder
    :members:

Long Short-Term Memory (LSTM) networks
--------------------------------------

.. module:: fairseq.models.lstm

.. autoclass:: fairseq.models.lstm.LSTMModel
    :members:

.. autoclass:: fairseq.models.lstm.LSTMEncoder
    :members:

.. autoclass:: fairseq.models.lstm.LSTMDecoder
    :members:

Transformer (self-attention) networks
-------------------------------------

.. module:: fairseq.models.transformer

.. autoclass:: fairseq.models.transformer.TransformerModel
    :members:

.. autoclass:: fairseq.models.transformer.TransformerEncoder
    :members:

.. autoclass:: fairseq.models.transformer.TransformerEncoderLayer
    :members:

.. autoclass:: fairseq.models.transformer.TransformerDecoder
    :members:

.. autoclass:: fairseq.models.transformer.TransformerDecoderLayer
    :members:

Adding new models
-----------------

.. currentmodule:: fairseq.models

.. autofunction:: fairseq.models.register_model

.. autofunction:: fairseq.models.register_model_architecture

.. autoclass:: fairseq.models.BaseFairseqModel
    :members:
    :undoc-members:

.. autoclass:: fairseq.models.FairseqModel
    :members:
    :undoc-members:

.. autoclass:: fairseq.models.FairseqLanguageModel
    :members:
    :undoc-members:

.. autoclass:: fairseq.models.FairseqEncoder
    :members:

.. autoclass:: fairseq.models.CompositeEncoder
    :members:

.. autoclass:: fairseq.models.FairseqDecoder
    :members:
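
Putting these pieces together, registering a new model might look roughly like
the following sketch (``ToySeq2Seq``, ``ToyEncoder``, ``ToyDecoder`` and the
``--toy-embed-dim`` option are made-up names for illustration, and the task is
assumed to expose ``source_dictionary``/``target_dictionary`` as translation
tasks do):

.. code-block:: python

    import torch.nn as nn

    from fairseq.models import (
        FairseqDecoder,
        FairseqEncoder,
        FairseqModel,
        register_model,
        register_model_architecture,
    )

    class ToyEncoder(FairseqEncoder):
        def __init__(self, dictionary, embed_dim):
            super().__init__(dictionary)
            self.embed = nn.Embedding(len(dictionary), embed_dim)

        def forward(self, src_tokens, src_lengths):
            return {'encoder_out': self.embed(src_tokens)}

    class ToyDecoder(FairseqDecoder):
        def __init__(self, dictionary, embed_dim):
            super().__init__(dictionary)
            self.embed = nn.Embedding(len(dictionary), embed_dim)
            self.output = nn.Linear(embed_dim, len(dictionary))

        def forward(self, prev_output_tokens, encoder_out):
            # Condition each target position on a pooled summary of the source.
            x = self.embed(prev_output_tokens)
            x = x + encoder_out['encoder_out'].mean(dim=1, keepdim=True)
            return self.output(x), None

    @register_model('toy_seq2seq')  # registers the model type
    class ToySeq2Seq(FairseqModel):

        @staticmethod
        def add_args(parser):
            # Extra command-line options exposed once this model is selected.
            parser.add_argument('--toy-embed-dim', type=int, metavar='N',
                                help='embedding dimension')

        @classmethod
        def build_model(cls, args, task):
            encoder = ToyEncoder(task.source_dictionary, args.toy_embed_dim)
            decoder = ToyDecoder(task.target_dictionary, args.toy_embed_dim)
            return cls(encoder, decoder)

    @register_model_architecture('toy_seq2seq', 'toy_seq2seq_base')
    def toy_seq2seq_base(args):
        # A named architecture only fills in default hyperparameters.
        args.toy_embed_dim = getattr(args, 'toy_embed_dim', 64)

The registered architecture name (here ``toy_seq2seq_base``) is what gets
passed to ``--arch``.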

.. _Incremental decoding:

Incremental decoding
--------------------

.. autoclass:: fairseq.models.FairseqIncrementalDecoder
    :members:
    :undoc-members:
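
As a rough sketch of how an incremental decoder is driven at inference time
(``decoder`` is assumed to be a built :class:`FairseqIncrementalDecoder` and
``encoder_out`` the matching encoder output, both supplied by the caller; the
exact return signature varies between decoders):

.. code-block:: python

    import torch

    def greedy_decode(decoder, encoder_out, bos_idx, max_len):
        """Greedily decode one token at a time."""
        # The decoder caches per-step state (e.g. attention keys/values) in
        # this dictionary, so each call only processes the newest token.
        incremental_state = {}
        tokens = torch.full((1, 1), bos_idx, dtype=torch.long)
        for _ in range(max_len):
            logits, _ = decoder(tokens[:, -1:], encoder_out,
                                incremental_state=incremental_state)
            next_token = logits[:, -1, :].argmax(dim=-1, keepdim=True)
            tokens = torch.cat([tokens, next_token], dim=1)
        return tokens

During beam search, ``reorder_incremental_state()`` is used to shuffle the
cached state whenever the beams are reordered.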