Conversation

@cl676767 commented Sep 29, 2025

Please review and accept.

@@ -0,0 +1,66 @@
import torch
Collaborator:

Could you write a docstring at the start explaining how to use the training pipeline?
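A minimal sketch of what that docstring could look like. The class name `TrainingPipeline` and the method names in the example are assumptions, since only fragments of the class appear in this diff:

```python
class TrainingPipeline:
    """End-to-end training pipeline (sketch; class and method names are
    assumptions based on the fragments visible in this diff).

    Wraps a PyTorch model, builds an Adam optimizer over its parameters,
    trains for `epochs` epochs on an 80/20 train/test split, and persists
    weights with `torch.save(self.model.state_dict(), path)`.

    Example:
        pipeline = TrainingPipeline(model, lr=1e-3, epochs=10)
        pipeline.train(dataset)
        pipeline.save("weights.pt")
    """
```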

def save(self, path):
    torch.save(self.model.state_dict(), path)


No newline at end of file
Collaborator:

Could you also add a simple test to the test folder to make sure the code works?
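A sketch of such a test for the `save` method. To keep the sketch runnable without PyTorch installed, `DummyModel` stands in for a real module and `pickle.dump` stands in for `torch.save`; in the repo's test folder you would import the actual pipeline class and let it call `torch.save` directly:

```python
import os
import pickle
import tempfile
import unittest

class DummyModel:
    """Minimal stand-in for a torch module: only state_dict() is needed."""
    def state_dict(self):
        return {"weight": [0.1, 0.2, 0.3]}

def save_state(model, path):
    """Stand-in for the pipeline's save(); pickle replaces torch.save here
    so the sketch runs without PyTorch installed."""
    with open(path, "wb") as f:
        pickle.dump(model.state_dict(), f)

class TestSave(unittest.TestCase):
    def test_save_writes_loadable_state_dict(self):
        model = DummyModel()
        with tempfile.TemporaryDirectory() as tmp:
            path = os.path.join(tmp, "model.pt")
            save_state(model, path)
            # The file must exist and round-trip back to the same state dict.
            self.assertTrue(os.path.exists(path))
            with open(path, "rb") as f:
                state = pickle.load(f)
            self.assertEqual(state, model.state_dict())
```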

self.optimizer = optim.Adam(self.model.parameters(), lr=lr)
self.epochs = epochs

# train/test split (0.8 : 0.2)
Collaborator:

Could we abstract this split ratio into a default parameter?
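One way to do that, sketched below: the hard-coded 0.8 becomes a `train_split=0.8` default that callers can override. The helper name, the `seed` parameter, and the list-based split are illustrative assumptions, not the PR's actual API:

```python
import random

def train_test_split(data, train_split=0.8, seed=0):
    """Shuffle `data` and return (train, test) lists split at `train_split`.

    `train_split` replaces the hard-coded 0.8:0.2 ratio from the diff;
    `seed` makes the shuffle reproducible.
    """
    items = list(data)
    random.Random(seed).shuffle(items)
    cut = int(len(items) * train_split)
    return items[:cut], items[cut:]
```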
