2025-10-09 19:23:33 - experiment_save_merged_model - INFO - Starting merged model save process
2025-10-09 19:23:33 - experiment_save_merged_model - INFO - Arguments: {'lambdas_path': '/work/gj26/b20042/LLM-AdaMerge/outputs/deepseek-7b/task-wise/cross_entropy-ep3-lr0001-10%dataset-lambda10_wo_scheduler/llm_adamerge_lambdas.json', 'model_config': '/work/gj26/b20042/LLM-AdaMerge/outputs/deepseek-7b/task-wise/cross_entropy-ep3-lr0001-10%dataset-lambda10_wo_scheduler/model_config.yaml', 'output_dir': '/work/gj26/b20042/LLM-AdaMerge/mergekit/outputs/deepseek-7b/llmadamerge/task-wise-cross_entropy-lr0001-ep3-10%dataset/lambda10_wo_scheduler', 'model_name': 'merged-model', 'push_to_hub': False, 'hub_repo_id': 'lejelly/deepseek-ep3-10%dataset-taskwise-lambda10', 'private': False, 'device': 'cuda', 'debug': False}
2025-10-09 19:23:33 - experiment_save_merged_model - INFO - Loading lambdas from /work/gj26/b20042/LLM-AdaMerge/outputs/deepseek-7b/task-wise/cross_entropy-ep3-lr0001-10%dataset-lambda10_wo_scheduler/llm_adamerge_lambdas.json
2025-10-09 19:23:33 - experiment_save_merged_model - INFO - Auto-detected parameter-wise merge from JSON structure
2025-10-09 19:23:33 - experiment_save_merged_model - INFO - Merge type: parameter_wise
2025-10-09 19:23:33 - experiment_save_merged_model - INFO - [Initial] Memory Usage:
2025-10-09 19:23:33 - experiment_save_merged_model - INFO - Process: 0.54 GB (0.3%)
2025-10-09 19:23:33 - experiment_save_merged_model - INFO - System: 8.01 GB / 212.49 GB (8.4%)
2025-10-09 19:23:33 - experiment_save_merged_model - INFO - Available: 194.66 GB
2025-10-09 19:23:33 - experiment_save_merged_model - INFO - GPU 0: Allocated: 0.00 GB, Reserved: 0.00 GB, Total: 94.50 GB
2025-10-09 19:23:33 - experiment_save_merged_model - INFO - Loading models
2025-10-09 19:23:49 - experiment_save_merged_model - INFO - [After loading models] Memory Usage:
2025-10-09 19:23:49 - experiment_save_merged_model - INFO - Process: 38.80 GB (18.3%)
2025-10-09 19:23:49 - experiment_save_merged_model - INFO - System: 47.15 GB / 212.49 GB (30.1%)
2025-10-09 19:23:49 - experiment_save_merged_model - INFO - Available: 148.54 GB
2025-10-09 19:23:49 - experiment_save_merged_model - INFO - GPU 0: Allocated: 0.00 GB, Reserved: 0.00 GB, Total: 94.50 GB
2025-10-09 19:23:49 - experiment_save_merged_model - INFO - Initializing parameter_wise AdaMerge
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - Loading learned lambdas
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - Deleting original models to free memory (task vectors already computed)
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - [Before deleting models] Memory Usage:
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - Process: 64.58 GB (30.4%)
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - System: 75.45 GB / 212.49 GB (43.4%)
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - Available: 120.18 GB
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - GPU 0: Allocated: 0.00 GB, Reserved: 0.00 GB, Total: 94.50 GB
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - Clearing model_loader references
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - Deleting model variables
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - Running garbage collection
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - [After deleting models and GC] Memory Usage:
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - Process: 38.82 GB (18.3%)
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - System: 49.69 GB / 212.49 GB (31.3%)
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - Available: 145.95 GB
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - GPU 0: Allocated: 0.00 GB, Reserved: 0.00 GB, Total: 94.50 GB
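The repeated "Memory Usage" blocks above suggest a small reporting helper built on psutil and torch.cuda. The sketch below is a hypothetical reconstruction of such a helper based only on the log format; the name log_memory_usage and its arguments are assumptions, not the script's actual code.

```python
import psutil
import torch

def log_memory_usage(logger, stage):
    # Hypothetical reconstruction of the "[<stage>] Memory Usage:" reporter;
    # inferred from the log lines above, not from the LLM-AdaMerge source.
    gib = 1024 ** 3
    rss = psutil.Process().memory_info().rss   # resident set size of this process
    vm = psutil.virtual_memory()                # system-wide RAM statistics
    logger.info(f"[{stage}] Memory Usage:")
    logger.info(f"Process: {rss / gib:.2f} GB ({rss / vm.total * 100:.1f}%)")
    logger.info(f"System: {vm.used / gib:.2f} GB / {vm.total / gib:.2f} GB ({vm.percent:.1f}%)")
    logger.info(f"Available: {vm.available / gib:.2f} GB")
    if torch.cuda.is_available():
        for i in range(torch.cuda.device_count()):
            alloc = torch.cuda.memory_allocated(i) / gib
            reserved = torch.cuda.memory_reserved(i) / gib
            total = torch.cuda.get_device_properties(i).total_memory / gib
            logger.info(
                f"GPU {i}: Allocated: {alloc:.2f} GB, "
                f"Reserved: {reserved:.2f} GB, Total: {total:.2f} GB"
            )
```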
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - [After loading lambdas] Memory Usage:
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - Process: 38.82 GB (18.3%)
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - System: 49.69 GB / 212.49 GB (31.3%)
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - Available: 145.95 GB
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - GPU 0: Allocated: 0.00 GB, Reserved: 0.00 GB, Total: 94.50 GB
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - Creating merged model with learned lambdas
2025-10-09 19:24:57 - experiment_save_merged_model - INFO - Using merge_models_for_save()
2025-10-09 19:26:50 - experiment_save_merged_model - INFO - [After merging models] Memory Usage:
2025-10-09 19:26:50 - experiment_save_merged_model - INFO - Process: 38.84 GB (18.3%)
2025-10-09 19:26:50 - experiment_save_merged_model - INFO - System: 76.41 GB / 212.49 GB (43.9%)
2025-10-09 19:26:50 - experiment_save_merged_model - INFO - Available: 119.22 GB
2025-10-09 19:26:50 - experiment_save_merged_model - INFO - GPU 0: Allocated: 12.87 GB, Reserved: 25.74 GB, Total: 94.50 GB
2025-10-09 19:26:50 - experiment_save_merged_model - INFO - Freeing memory from AdaMerge object (task vectors and base params no longer needed)
2025-10-09 19:26:50 - experiment_save_merged_model - INFO - Deleting task vectors
2025-10-09 19:26:50 - experiment_save_merged_model - INFO - Deleting base params
2025-10-09 19:26:50 - experiment_save_merged_model - INFO - Deleting functional model
2025-10-09 19:26:51 - experiment_save_merged_model - INFO - [After freeing AdaMerge memory] Memory Usage:
2025-10-09 19:26:51 - experiment_save_merged_model - INFO - Process: 0.19 GB (0.1%)
2025-10-09 19:26:51 - experiment_save_merged_model - INFO - System: 24.87 GB / 212.49 GB (19.6%)
2025-10-09 19:26:51 - experiment_save_merged_model - INFO - Available: 170.77 GB
2025-10-09 19:26:51 - experiment_save_merged_model - INFO - GPU 0: Allocated: 12.87 GB, Reserved: 12.87 GB, Total: 94.50 GB
2025-10-09 19:26:51 - experiment_save_merged_model - INFO - Saving merged model to /work/gj26/b20042/LLM-AdaMerge/mergekit/outputs/deepseek-7b/llmadamerge/task-wise-cross_entropy-lr0001-ep3-10%dataset/lambda10_wo_scheduler
2025-10-09 19:26:51 - experiment_save_merged_model - INFO - Moving merged model to CPU for saving
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - Successfully saved 3 safetensors files:
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - - model-00003-of-00003.safetensors (3674.14 MB)
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - - model-00002-of-00003.safetensors (4750.20 MB)
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - - model-00001-of-00003.safetensors (4756.17 MB)
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - [After saving model] Memory Usage:
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - Process: 13.07 GB (6.1%)
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - System: 21.58 GB / 212.49 GB (18.1%)
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - Available: 174.05 GB
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - GPU 0: Allocated: 0.00 GB, Reserved: 0.00 GB, Total: 94.50 GB
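The "Creating merged model with learned lambdas" / "Using merge_models_for_save()" step corresponds to task-arithmetic merging: merged = base + sum over tasks of lambda_{t,p} * tau_{t,p}, where tau_{t,p} is the task vector (finetuned minus base) for task t and parameter p. Below is a minimal sketch of that computation under assumed data layouts for the task vectors and parameter-wise lambdas; it is not the project's actual merge_models_for_save() implementation.

```python
import torch

@torch.no_grad()
def merge_with_learned_lambdas(base_state, task_vectors, lambdas):
    # Minimal sketch of a parameter-wise task-arithmetic merge.
    # Assumed layouts (not confirmed by the log):
    #   base_state:   {param_name: Tensor} from the pretrained base model
    #   task_vectors: {task_name: {param_name: Tensor}} (finetuned - base)
    #   lambdas:      {task_name: {param_name: float}} learned coefficients
    merged = {name: p.clone() for name, p in base_state.items()}
    for task, vectors in task_vectors.items():
        for name, delta in vectors.items():
            # merged_p += lambda_{task, p} * tau_{task, p}
            merged[name].add_(delta, alpha=float(lambdas[task][name]))
    return merged

# Typical use: load the result back into the base architecture before saving,
# e.g. model.load_state_dict(merged) on a model with matching parameter names.
```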
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - Saving tokenizer
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - Copied lambdas file to /work/gj26/b20042/LLM-AdaMerge/mergekit/outputs/deepseek-7b/llmadamerge/task-wise-cross_entropy-lr0001-ep3-10%dataset/lambda10_wo_scheduler/learned_lambdas.json
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - Creating model card
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - Cleaning up models
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - [After cleanup] Memory Usage:
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - Process: 13.07 GB (6.2%)
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - System: 21.59 GB / 212.49 GB (18.1%)
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - Available: 174.05 GB
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - GPU 0: Allocated: 0.00 GB, Reserved: 0.00 GB, Total: 94.50 GB
2025-10-09 19:27:44 - experiment_save_merged_model - INFO - Model saved successfully to /work/gj26/b20042/LLM-AdaMerge/mergekit/outputs/deepseek-7b/llmadamerge/task-wise-cross_entropy-lr0001-ep3-10%dataset/lambda10_wo_scheduler
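The final stage (sharded safetensors weights, tokenizer files, a learned_lambdas.json copy, and a model card) can be reproduced with standard Hugging Face and stdlib calls, roughly as sketched below. The ~5 GB shard size is only an assumption inferred from the three shard sizes in the log, and save_merged_artifacts is a hypothetical name, not the script's actual function.

```python
import shutil
from pathlib import Path

def save_merged_artifacts(model, tokenizer, lambdas_path, output_dir):
    # Hypothetical sketch of the save stage logged above; not the project's code.
    out = Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    # Sharded safetensors; a ~5 GB shard limit would match the 3-file split seen above (assumption).
    model.save_pretrained(out, safe_serialization=True, max_shard_size="5GB")
    tokenizer.save_pretrained(out)
    # Keep the learned coefficients next to the weights for reproducibility.
    shutil.copy(lambdas_path, out / "learned_lambdas.json")
    # Minimal model card placeholder; the real script likely writes richer metadata.
    (out / "README.md").write_text("# merged-model\n\nMerged with LLM-AdaMerge learned lambdas.\n")
```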