note: too expensive to run
#1 opened by MichaelBoll
better luck next year
jeety please
You cannot and will not run this model.
Beautiful
It runs on my laptop at a very usable tok/s. Great job, OpenAI!
20b can run on your laptop!
That's actually funny
benchmaxxed
if you want to play with 120b inference, here's working code using vLLM and a free H100: http://playground.tracto.ai/playground?pr=notebooks/bulk-inference-gpt-oss-120b
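If you'd rather try it on your own hardware, a minimal sketch using vLLM's OpenAI-compatible server (assumes a recent vLLM release with gpt-oss support and a GPU large enough for the 120b weights, e.g. the H100 mentioned above):

```shell
# Install vLLM (assumption: a recent build that supports gpt-oss)
pip install vllm

# Launch an OpenAI-compatible server for the 120b checkpoint
# (the 20b variant is the one people report running on laptops)
vllm serve openai/gpt-oss-120b

# Then query it via the standard chat completions endpoint
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "openai/gpt-oss-120b",
       "messages": [{"role": "user", "content": "Hello"}]}'
```

The linked playground notebook does bulk inference instead of serving; this is just the quickest way to poke at the model interactively.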