Request: DOI
#26 opened about 2 hours ago by sreeshanthpeddi

Request for 2000 Samples from training data for NVFP4 QUANTIZATION
#25 opened about 9 hours ago by jasonface

Prepare support transformers
#24 opened about 11 hours ago by rogeryoungh

what data and its volume were used to train the model? · 3
#21 opened 3 days ago by Pep0pi

230B vs 235B: Why no comparison against Qwen3-235B-A22B-Thinking-2507? · 1
#20 opened 3 days ago by rtzurtz

AWQ Please · 2
#18 opened 4 days ago by darkstar3537

GGUF support · 👍 ➕ 7 · 2
#17 opened 4 days ago by geboh67859

Why does it keep trying to connect to huggingface? · 1
#16 opened 4 days ago by surak

Was the training done with FP8 or BF16? · 1
#14 opened 4 days ago by mindkrypted

About the LCB evaluation · ➕ 2 · 2
#13 opened 4 days ago by sayhitoday

YES!! · 🚀 10 · 1
#12 opened 4 days ago by CyborgPaloma

When will transformers support Minimax-M2? · 👍 👀 7
#11 opened 4 days ago by zx-modelcloud

Speculative decoding
#9 opened 5 days ago by adsfdgfhgjhk11

No lightning attention? · 3
#8 opened 5 days ago by djuna