fix: TransformerEngineBaseModule quantizers init values type (#2927)
muutot wants to merge 1 commit into NVIDIA:main
Conversation
Signed-off-by: Muu <koimuu@163.com>
Greptile Summary

This PR corrects the initial placeholder type of the values in self.quantizers.

Confidence Score: 5/5 — Safe to merge. This is a single-line corrective change with no functional risk. The only changed line corrects the initial type of two placeholder values that are always overwritten before use. No logic paths access these values as dicts, and the list type matches the actual runtime type produced by make_quantizers(). No files require special attention.
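A minimal sketch of the fix described above (class and method names are illustrative stand-ins, not the actual TransformerEngine source):

```python
# Hypothetical sketch of the one-line fix. The class, method, and quantizer
# values here are illustrative stand-ins, not the real TransformerEngine code.

class BaseModuleSketch:
    def __init__(self):
        # Before the fix, the placeholders were dicts ({}); the fix makes
        # them lists ([]), matching what make_quantizers() later assigns.
        self.quantizers = {"scaling_fwd": [], "scaling_bwd": []}

    def set_meta_tensor(self, fwd, make_quantizers):
        # The placeholder is always overwritten with a real list before use.
        key = "scaling_fwd" if fwd else "scaling_bwd"
        self.quantizers[key] = make_quantizers()


module = BaseModuleSketch()
assert isinstance(module.quantizers["scaling_fwd"], list)  # placeholder now matches runtime type
module.set_meta_tensor(fwd=True, make_quantizers=lambda: ["q0", "q1"])
assert module.quantizers["scaling_fwd"][1] == "q1"  # integer indexing into a list
```

Since the placeholders are always replaced before use, the change is about type consistency rather than behavior: the declared initial type now matches the list that is actually stored at runtime.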
Flowchart

```mermaid
%%{init: {'theme': 'neutral'}}%%
flowchart TD
    A["__init__\nself.quantizers = {scaling_fwd: [], scaling_bwd: []}"] --> B["prepare_forward / init_fp8_metadata"]
    B --> C["init_fp8_meta_tensors(recipe)"]
    C --> D["set_meta_tensor(fwd=True)"]
    C --> E["set_meta_tensor(fwd=False)"]
    D --> F["recipe_state = RecipeState.create(...)"]
    E --> G["recipe_state = RecipeState.create(...)"]
    F --> H["self.quantizers['scaling_fwd'] = recipe_state.make_quantizers() → list"]
    G --> I["self.quantizers['scaling_bwd'] = recipe_state.make_quantizers() → list"]
    H --> J["reset_parameters\nself.quantizers['scaling_fwd'][fp8_meta_index] ← integer index"]
    I --> J
    H --> K["adjust_amax_history_length\nself.quantizers[meta_key] = ...make_quantizers()"]
```
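The lifecycle in the flowchart can be sketched as follows (RecipeStateStub and the quantizer strings are stand-ins for the real RecipeState API, not the actual implementation):

```python
# Hedged sketch of the flowchart's lifecycle: list placeholders in __init__,
# overwritten by make_quantizers() lists, then indexed by integer.
# RecipeStateStub is a stand-in for the real RecipeState.

class RecipeStateStub:
    def make_quantizers(self):
        # In the flowchart, make_quantizers() returns a list of quantizers.
        return [f"quantizer_{i}" for i in range(3)]


class ModuleFlow:
    def __init__(self):
        # List placeholders, as in the corrected __init__.
        self.quantizers = {"scaling_fwd": [], "scaling_bwd": []}

    def init_fp8_meta_tensors(self):
        self.set_meta_tensor(fwd=True)
        self.set_meta_tensor(fwd=False)

    def set_meta_tensor(self, fwd):
        recipe_state = RecipeStateStub()
        key = "scaling_fwd" if fwd else "scaling_bwd"
        self.quantizers[key] = recipe_state.make_quantizers()

    def reset_parameters(self, fp8_meta_index):
        # Integer indexing into the list, as in the reset_parameters step.
        return self.quantizers["scaling_fwd"][fp8_meta_index]


flow = ModuleFlow()
flow.init_fp8_meta_tensors()
assert flow.reset_parameters(1) == "quantizer_1"
```

Because every path through the flowchart replaces the placeholder with a list before any indexing happens, a dict placeholder never caused a runtime error; the fix simply makes the initial type agree with the type used everywhere else.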
Reviews (1): Last reviewed commit: "fix: TransformerEngineBaseModule quantiz..."
/te-ci
Description
Correct the types of the initial placeholder values in TransformerEngineBaseModule's quantizers dict.
Type of change
Changes

Please list the changes introduced in this PR:
- Change the initial placeholder values of self.quantizers from dicts ({}) to lists ([]), matching the list type returned by make_quantizers().
Checklist: