Commit a81063b

DOC: Add API documents for 0.3 release
1 parent 6e6d2a5 commit a81063b

114 files changed (+25439 additions, 0 deletions)

Lines changed: 123 additions & 0 deletions
@@ -0,0 +1,123 @@
redirects:
- from: /api_docs/python/tfa/image/distance_transform/euclidean_dist_transform
  to: /api_docs/python/tfa/image/euclidean_dist_transform
- from: /api_docs/python/tfa/image/distort_image_ops/adjust_hsv_in_yiq
  to: /api_docs/python/tfa/image/adjust_hsv_in_yiq
- from: /api_docs/python/tfa/image/distort_image_ops/random_hsv_in_yiq
  to: /api_docs/python/tfa/image/random_hsv_in_yiq
- from: /api_docs/python/tfa/image/filters/median_filter2d
  to: /api_docs/python/tfa/image/median_filter2d
- from: /api_docs/python/tfa/image/transform_ops/rotate
  to: /api_docs/python/tfa/image/rotate
- from: /api_docs/python/tfa/image/transform_ops/transform
  to: /api_docs/python/tfa/image/transform
- from: /api_docs/python/tfa/layers/maxout/Maxout
  to: /api_docs/python/tfa/layers/Maxout
- from: /api_docs/python/tfa/layers/normalizations/GroupNormalization
  to: /api_docs/python/tfa/layers/GroupNormalization
- from: /api_docs/python/tfa/layers/normalizations/InstanceNormalization
  to: /api_docs/python/tfa/layers/InstanceNormalization
- from: /api_docs/python/tfa/layers/poincare/PoincareNormalize
  to: /api_docs/python/tfa/layers/PoincareNormalize
- from: /api_docs/python/tfa/layers/sparsemax/Sparsemax
  to: /api_docs/python/tfa/layers/Sparsemax
- from: /api_docs/python/tfa/layers/sparsemax/sparsemax
  to: /api_docs/python/tfa/activations/sparsemax
- from: /api_docs/python/tfa/layers/wrappers/WeightNormalization
  to: /api_docs/python/tfa/layers/WeightNormalization
- from: /api_docs/python/tfa/losses/contrastive/ContrastiveLoss
  to: /api_docs/python/tfa/losses/ContrastiveLoss
- from: /api_docs/python/tfa/losses/contrastive/contrastive_loss
  to: /api_docs/python/tfa/losses/contrastive_loss
- from: /api_docs/python/tfa/losses/focal_loss/SigmoidFocalCrossEntropy
  to: /api_docs/python/tfa/losses/SigmoidFocalCrossEntropy
- from: /api_docs/python/tfa/losses/focal_loss/sigmoid_focal_crossentropy
  to: /api_docs/python/tfa/losses/sigmoid_focal_crossentropy
- from: /api_docs/python/tfa/losses/lifted/LiftedStructLoss
  to: /api_docs/python/tfa/losses/LiftedStructLoss
- from: /api_docs/python/tfa/losses/lifted/lifted_struct_loss
  to: /api_docs/python/tfa/losses/lifted_struct_loss
- from: /api_docs/python/tfa/losses/triplet/TripletSemiHardLoss
  to: /api_docs/python/tfa/losses/TripletSemiHardLoss
- from: /api_docs/python/tfa/losses/triplet/triplet_semihard_loss
  to: /api_docs/python/tfa/losses/triplet_semihard_loss
- from: /api_docs/python/tfa/optimizers/lazy_adam/LazyAdam
  to: /api_docs/python/tfa/optimizers/LazyAdam
- from: /api_docs/python/tfa/optimizers/moving_average/MovingAverage
  to: /api_docs/python/tfa/optimizers/MovingAverage
- from: /api_docs/python/tfa/optimizers/weight_decay_optimizers/AdamW
  to: /api_docs/python/tfa/optimizers/AdamW
- from: /api_docs/python/tfa/optimizers/weight_decay_optimizers/SGDW
  to: /api_docs/python/tfa/optimizers/SGDW
- from: /api_docs/python/tfa/optimizers/weight_decay_optimizers/extend_with_decoupled_weight_decay
  to: /api_docs/python/tfa/optimizers/extend_with_decoupled_weight_decay
- from: /api_docs/python/tfa/rnn/cell/LayerNormLSTMCell
  to: /api_docs/python/tfa/rnn/LayerNormLSTMCell
- from: /api_docs/python/tfa/rnn/cell/NASCell
  to: /api_docs/python/tfa/rnn/NASCell
- from: /api_docs/python/tfa/seq2seq/attention_wrapper/AttentionMechanism
  to: /api_docs/python/tfa/seq2seq/AttentionMechanism
- from: /api_docs/python/tfa/seq2seq/attention_wrapper/AttentionWrapper
  to: /api_docs/python/tfa/seq2seq/AttentionWrapper
- from: /api_docs/python/tfa/seq2seq/attention_wrapper/AttentionWrapperState
  to: /api_docs/python/tfa/seq2seq/AttentionWrapperState
- from: /api_docs/python/tfa/seq2seq/attention_wrapper/BahdanauAttention
  to: /api_docs/python/tfa/seq2seq/BahdanauAttention
- from: /api_docs/python/tfa/seq2seq/attention_wrapper/BahdanauMonotonicAttention
  to: /api_docs/python/tfa/seq2seq/BahdanauMonotonicAttention
- from: /api_docs/python/tfa/seq2seq/attention_wrapper/LuongAttention
  to: /api_docs/python/tfa/seq2seq/LuongAttention
- from: /api_docs/python/tfa/seq2seq/attention_wrapper/LuongMonotonicAttention
  to: /api_docs/python/tfa/seq2seq/LuongMonotonicAttention
- from: /api_docs/python/tfa/seq2seq/attention_wrapper/hardmax
  to: /api_docs/python/tfa/seq2seq/hardmax
- from: /api_docs/python/tfa/seq2seq/attention_wrapper/monotonic_attention
  to: /api_docs/python/tfa/seq2seq/monotonic_attention
- from: /api_docs/python/tfa/seq2seq/attention_wrapper/safe_cumprod
  to: /api_docs/python/tfa/seq2seq/safe_cumprod
- from: /api_docs/python/tfa/seq2seq/basic_decoder/BasicDecoder
  to: /api_docs/python/tfa/seq2seq/BasicDecoder
- from: /api_docs/python/tfa/seq2seq/basic_decoder/BasicDecoderOutput
  to: /api_docs/python/tfa/seq2seq/BasicDecoderOutput
- from: /api_docs/python/tfa/seq2seq/beam_search_decoder/BeamSearchDecoder
  to: /api_docs/python/tfa/seq2seq/BeamSearchDecoder
- from: /api_docs/python/tfa/seq2seq/beam_search_decoder/BeamSearchDecoderOutput
  to: /api_docs/python/tfa/seq2seq/BeamSearchDecoderOutput
- from: /api_docs/python/tfa/seq2seq/beam_search_decoder/BeamSearchDecoderState
  to: /api_docs/python/tfa/seq2seq/BeamSearchDecoderState
- from: /api_docs/python/tfa/seq2seq/beam_search_decoder/FinalBeamSearchDecoderOutput
  to: /api_docs/python/tfa/seq2seq/FinalBeamSearchDecoderOutput
- from: /api_docs/python/tfa/seq2seq/beam_search_decoder/gather_tree_from_array
  to: /api_docs/python/tfa/seq2seq/gather_tree_from_array
- from: /api_docs/python/tfa/seq2seq/beam_search_decoder/tile_batch
  to: /api_docs/python/tfa/seq2seq/tile_batch
- from: /api_docs/python/tfa/seq2seq/decoder/BaseDecoder
  to: /api_docs/python/tfa/seq2seq/BaseDecoder
- from: /api_docs/python/tfa/seq2seq/decoder/Decoder
  to: /api_docs/python/tfa/seq2seq/Decoder
- from: /api_docs/python/tfa/seq2seq/decoder/dynamic_decode
  to: /api_docs/python/tfa/seq2seq/dynamic_decode
- from: /api_docs/python/tfa/seq2seq/loss/SequenceLoss
  to: /api_docs/python/tfa/seq2seq/SequenceLoss
- from: /api_docs/python/tfa/seq2seq/loss/sequence_loss
  to: /api_docs/python/tfa/seq2seq/sequence_loss
- from: /api_docs/python/tfa/seq2seq/sampler/CustomSampler
  to: /api_docs/python/tfa/seq2seq/CustomSampler
- from: /api_docs/python/tfa/seq2seq/sampler/GreedyEmbeddingSampler
  to: /api_docs/python/tfa/seq2seq/GreedyEmbeddingSampler
- from: /api_docs/python/tfa/seq2seq/sampler/InferenceSampler
  to: /api_docs/python/tfa/seq2seq/InferenceSampler
- from: /api_docs/python/tfa/seq2seq/sampler/SampleEmbeddingSampler
  to: /api_docs/python/tfa/seq2seq/SampleEmbeddingSampler
- from: /api_docs/python/tfa/seq2seq/sampler/Sampler
  to: /api_docs/python/tfa/seq2seq/Sampler
- from: /api_docs/python/tfa/seq2seq/sampler/ScheduledEmbeddingTrainingSampler
  to: /api_docs/python/tfa/seq2seq/ScheduledEmbeddingTrainingSampler
- from: /api_docs/python/tfa/seq2seq/sampler/ScheduledOutputTrainingSampler
  to: /api_docs/python/tfa/seq2seq/ScheduledOutputTrainingSampler
- from: /api_docs/python/tfa/seq2seq/sampler/TrainingSampler
  to: /api_docs/python/tfa/seq2seq/TrainingSampler
- from: /api_docs/python/tfa/text/skip_gram_ops/skip_gram_sample
  to: /api_docs/python/tfa/text/skip_gram_sample
- from: /api_docs/python/tfa/text/skip_gram_ops/skip_gram_sample_with_text_vocab
  to: /api_docs/python/tfa/text/skip_gram_sample_with_text_vocab
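Context for the redirects above: each entry maps an old per-submodule documentation URL (e.g. tfa/image/transform_ops/rotate) to the flattened path under which the symbol is exported in the 0.3 docs (e.g. tfa/image/rotate). A minimal sketch of what that flattened namespace looks like in code, assuming TensorFlow Addons and a compatible TensorFlow 2.x are installed; the tensors and parameter values below are illustrative placeholders, not part of this commit:

    import tensorflow as tf
    import tensorflow_addons as tfa

    # Symbols are addressed through the flattened namespace the new doc paths point to,
    # e.g. tfa.image.rotate rather than tfa.image.transform_ops.rotate.
    images = tf.zeros([1, 64, 64, 3])                # illustrative placeholder batch
    rotated = tfa.image.rotate(images, angles=0.5)   # rotate by 0.5 radians

    # The same pattern applies to the layer and activation redirects above.
    group_norm = tfa.layers.GroupNormalization(groups=1)
    probs = tfa.activations.sparsemax(tf.constant([[1.0, 2.0, 3.0]]))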
