Spaces: huggingface / inference-playground · like 235 · Running on CPU Upgrade
App · Files · Community (11)
inference-playground / src / lib / components / InferencePlayground — 42.8 kB · 6 contributors · History: 120 commits

Latest commit 003aab5 by mishig (HF Staff): "correct highlgihting of selected model onmount" · over 1 year ago
InferencePlayground.svelte                    11.8 kB    format                                            over 1 year ago
InferencePlaygroundCodeSnippets.svelte        9.44 kB    format                                            over 1 year ago
InferencePlaygroundConversation.svelte        1.64 kB    snippets "showToken" feature                      over 1 year ago
InferencePlaygroundGenerationConfig.svelte    2.08 kB    handle when /api/model err                        over 1 year ago
InferencePlaygroundHFTokenModal.svelte        4.84 kB    format                                            over 1 year ago
InferencePlaygroundMessage.svelte             1.54 kB    order imports                                     over 1 year ago
InferencePlaygroundModelSelector.svelte       2.09 kB    correct highlgihting of selected model onmount    over 1 year ago
InferencePlaygroundModelSelectorModal.svelte  5.65 kB    correct highlgihting of selected model onmount    over 1 year ago
generationConfigSettings.ts                   933 Bytes  Rm advanced options for config                    over 1 year ago
inferencePlaygroundUtils.ts                   2.16 kB    make tokens count working for non-streaming as well  over 1 year ago
types.ts                                      607 Bytes  System message as part of Conversation            over 1 year ago