Unit Tests for On Device Sampling #463
Changes from all commits
Diff view:

    @@ -235,7 +235,6 @@ def main(
             tokenizer,
             prompts=prompt,
             device_id=device_group,
    -        prompt=prompt,
             prompts_txt_file_path=prompts_txt_file_path,
             generation_len=generation_len,
         )

Reviewer: why?

Author: Some of the inference tests in the later stages of CI were failing. From the logs:

    prompts = 'My name is', device_id = None, runtime_ai100 = True
    kwargs = {'prompt': 'My name is', 'prompts_txt_file_path': 'examples/prompts.txt'}

    >       return QEfficient.cloud_ai_100_exec_kv(
                tokenizer,
                self.qpc_path,
                prompt=prompts,
                device_id=device_id,
                generation_len=generation_len,
                is_tlm=self.is_tlm,
                **kwargs,
            )
    E       TypeError: QEfficient.generation.text_generation_inference.cloud_ai_100_exec_kv() got multiple values for keyword argument 'prompt'

@quic-rishinr please feel free to comment.

Reviewer: As rightly shown, update infer to follow the same.
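For context on the error above: Python raises this TypeError whenever a keyword argument is supplied both explicitly and again through `**kwargs`. The sketch below is a minimal, self-contained illustration only; `exec_kv` and `run_main` are hypothetical stand-ins, not QEfficient APIs.

```python
# Minimal sketch of the duplicate-keyword collision described in the review thread.
# exec_kv and run_main are hypothetical stand-ins, not QEfficient APIs.

def exec_kv(tokenizer, qpc_path, prompt=None, device_id=None, **kwargs):
    """Stand-in for a generation entry point that also accepts extra kwargs."""
    return {"prompt": prompt, "device_id": device_id, **kwargs}


def run_main(tokenizer, qpc_path, prompts=None, device_id=None, **kwargs):
    """Stand-in for the caller: forwards its own kwargs down to exec_kv."""
    # If kwargs already contains 'prompt' (as in the failing CI run) and we also
    # pass prompt=prompts explicitly, Python raises TypeError at this call site.
    return exec_kv(
        tokenizer,
        qpc_path,
        prompt=prompts,
        device_id=device_id,
        **kwargs,
    )


if __name__ == "__main__":
    extra = {"prompt": "My name is", "prompts_txt_file_path": "examples/prompts.txt"}
    try:
        run_main("tokenizer", "qpc/", prompts="My name is", device_id=None, **extra)
    except TypeError as err:
        # Prints: exec_kv() got multiple values for keyword argument 'prompt'
        print(err)
```

The diff above applies the same idea on the caller side: dropping the redundant `prompt=prompt,` keyword means `prompt` no longer reaches `cloud_ai_100_exec_kv` by two different routes.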
Reviewer: why?
Author: Refer to my comment above: #463 (comment)