[Doc]: Fix typo in fused_moe layer (#29731)

Signed-off-by: BowTen <bowten@qq.com>
2025-11-30 14:29:13 +08:00
committed by GitHub
parent 66b5840287
commit 9381b5cde0

@@ -1422,7 +1422,7 @@ class FusedMoE(CustomOp):
     # do nothing.
     return p
-    # Do not update the layer paramater as the layer's MoE operations would
+    # Do not update the layer parameter as the layer's MoE operations would
     # expect the parameter's tensor to the same shape / stride. Instead,
     # make a new torch.nn.Parameter that is used just in the context of
     # EPLB.
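
The comment touched by this commit describes a pattern worth spelling out: during EPLB (expert parallelism load balancing) the rebalanced weights are wrapped in a fresh torch.nn.Parameter rather than written into the registered parameter, so the layer's MoE kernels still see the original tensor shape and stride. A minimal sketch of that idea (the helper name is illustrative, not vLLM's actual API):

```python
import torch

def make_eplb_param(p: torch.nn.Parameter,
                    rebalanced: torch.Tensor) -> torch.nn.Parameter:
    """Hypothetical helper: wrap rebalanced expert weights in a new
    Parameter instead of mutating ``p`` in place, so code that relies
    on ``p``'s original shape / stride keeps working."""
    # The registered parameter is left untouched; only the EPLB code
    # path uses the returned Parameter.
    return torch.nn.Parameter(rebalanced, requires_grad=False)
```

The key property is that the original parameter and the EPLB parameter are distinct tensors: resizing or re-striding the registered one would break any kernel that captured its layout.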