This repository was archived by the owner on Jun 3, 2025. It is now read-only.

[Fix] Allow to create SparseAutoModelForCausalLM with trust_remote_code=True #2349

Merged
merged 3 commits into from
Jul 3, 2024

Conversation

dbogunowicz
Contributor

Feature Description

Now this executes properly:

```python
from sparseml.transformers import SparseAutoModelForCausalLM

model = SparseAutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3-mini-128k-instruct", trust_remote_code=True
)
print(model.__class__.__name__)
# >> 'Phi3ForCausalLM'
```

The hack was to temporarily rename the class so that the `from_pretrained` method could properly resolve the "remote code load" logic. The model returned by `from_pretrained` is unaffected afterwards: it carries the class name of the loaded model (as shown above).
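For illustration, here is a minimal, self-contained sketch of the "temporary rename" trick (this is not the actual SparseML implementation; the class and context-manager names here are hypothetical). The idea is to swap a class's `__name__` for the duration of a call and restore it afterwards, so downstream name-based dispatch sees the expected name:

```python
# Hypothetical sketch of temporarily renaming a class so that
# name-based resolution (e.g. remote-code loading) sees the
# expected class name; the original name is always restored.
from contextlib import contextmanager


@contextmanager
def temporary_class_name(cls, name):
    original = cls.__name__
    cls.__name__ = name
    try:
        yield cls
    finally:
        # Restore the original name so later uses are unaffected.
        cls.__name__ = original


class SparseAutoModelForCausalLM:  # stand-in for the real wrapper class
    pass


# Inside the context, the class masquerades under the borrowed name...
with temporary_class_name(SparseAutoModelForCausalLM, "AutoModelForCausalLM"):
    print(SparseAutoModelForCausalLM.__name__)  # AutoModelForCausalLM

# ...and outside it, the original name is back.
print(SparseAutoModelForCausalLM.__name__)  # SparseAutoModelForCausalLM
```

Because the restore happens in a `finally` block, the rename is undone even if loading raises, which keeps the hack invisible to callers.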

This PR needs to land as well, so that the functionality is fully enabled: neuralmagic/compressed-tensors#104

Member

@anmarques anmarques left a comment


I tested with microsoft/Phi-3-mini-128k-instruct, both the original model and after SparseML quantization, and it worked.

@anmarques anmarques merged commit 179fd90 into main Jul 3, 2024
16 of 18 checks passed
@anmarques anmarques deleted the fix/damian/sparseautomodel branch July 3, 2024 16:17

3 participants