How to Load Model from Checkpoint Without the Class for Ensemble #20668
Unanswered
aditya0by0 asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment 1 reply
-
@mauvilsa, can you please help if you have any idea regarding the above?
-
Hello everyone,
I am looking for a way to load PyTorch Lightning models (and/or even plain torch models) from checkpoints and perform a forward pass, particularly in cases where I don't have access to the model class implementation saved in the checkpoint. I want to make this process more generic.
Currently, I am trying to use `load_from_checkpoint` on `LightningModule`, but I'm encountering issues when the model class is not available. The challenge is that I only have access to the checkpoint files, not the model class implementations. Here's an abstract example of my current implementation:
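Roughly, it looks like the sketch below, assuming hypothetical model classes (`ModelA`, `ModelB`) and placeholder checkpoint paths:

```python
import torch
import lightning.pytorch as pl

# Hypothetical model classes; in practice these live in other projects
# and are exactly what I do NOT have access to.
from some_project import ModelA, ModelB


class EnsembleModel(pl.LightningModule):
    def __init__(self, ckpt_path_a: str, ckpt_path_b: str):
        super().__init__()
        # load_from_checkpoint needs the concrete class, which is the problem:
        # each checkpoint must be paired with its own class implementation.
        self.models = torch.nn.ModuleList([
            ModelA.load_from_checkpoint(ckpt_path_a),
            ModelB.load_from_checkpoint(ckpt_path_b),
        ])

    def forward(self, x):
        # Average the ensemble members' predictions.
        return torch.stack([m(x) for m in self.models]).mean(dim=0)


# Placeholder paths, for illustration only.
ensemble = EnsembleModel("model_a.ckpt", "model_b.ckpt")
ensemble.eval()
```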
Problem & Question
In scenarios where I don’t have access to the class of the model saved in the checkpoint but still need to load it and perform a forward pass, how can I handle this? Is there a way to generalize the process so that the ensemble model can dynamically handle different types of models without explicitly referencing their classes?
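For context, without the class all I can really do is open the checkpoint as a plain dictionary. A minimal sketch (with a placeholder path) of what that gives: the weights are there, but nothing that reconstructs the architecture or `forward()` logic on its own.

```python
import torch

# Placeholder path for illustration.
ckpt = torch.load("model_a.ckpt", map_location="cpu")

# A Lightning checkpoint is a plain dict; it typically contains
# 'state_dict' (the parameter tensors) and often 'hyper_parameters',
# but no code for the model's architecture or forward pass.
print(ckpt.keys())
print(list(ckpt["state_dict"].keys())[:5])
```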
Any insights or best practices for making the model loading and inference process more flexible would be greatly appreciated!