
Repository graph

Git revisions
  • GLM-130B
  • MoE
  • adamop
  • adapt_env
  • args
  • beamsearch
  • bert
  • bert_large
  • bert_new
  • cait
  • chatglm-rotary
  • clever_dataset
  • clip
  • cogvideo
  • config
  • cp_support
  • cross
  • dev
  • dev_fixnan
  • develop
  • v0.1.10
[Commit graph (rendered with Raphaël 2.2.0); the date axis runs from 15 Jun back through the previous year. Commit messages, newest first:]
  • add cogview2
  • Merge pull request #44 from THUDM/bert_large
  • support roberta
  • bert_large
  • bert_large
  • support bert large
  • Merge pull request #43 from THUDM/main_from
  • v0.2 update readme
  • save ckpt create model_config.json
  • layernorm-order and tokenizer-type args
  • Merge branch 'main_from' of github.com:THUDM/SwissArmyTransformer into main_from
  • make training_main tokenizer-free, load hf
  • merge main and resolve conflict
  • update examples to new layernorm args
  • add pre/post/sandwich options
  • Merge branch 'main_from' of github.com:THUDM/SwissArmyTransformer into main_from
  • move init distributed and seed to get_args
  • adapt yolos to new version
  • adapt clip to new version
  • adapt cait to new version
  • adapt deit to new version
  • adapt vit to new version
  • add model type args
  • adapt bert to new version
  • change log
  • Merge branch 'main_from' of github.com:THUDM/SwissArmyTransformer into main_from
  • fix lock and update_args
  • tmp_from_pretrain
  • update model name and url
  • merge
  • new param format
  • reformat args & add default zero-stage
  • update from_pretrained for more models
  • move transformer.py to model and out, make ops folder
  • split hooks out
  • mixout_exp
  • mixout_exp
  • mixout_exp
  • Merge branch 'mixout' into finetune
  • tmp
  • mixout
  • mixout
  • mixout
  • new version
  • Merge branch 'main' of https://github.com/THUDM/SwissArmyTransformer into main_from