	Update blog link and text
app.py CHANGED
@@ -32,9 +32,9 @@ output_box = gr.Textbox(label="Lyrics by The Beatles and chosen language model:"
 # Layout and text above the App
 title='Beatles lyrics generator'
 description="<p style='text-align: center'>Multiple language models were fine-tuned on lyrics from The Beatles to generate Beatles-like text. Give it a try!</p>"
-article="""<p style='text-align: left'>
+article="""<p style='text-align: left'>These text generation models that output Beatles-like text were created by data scientists working for <a href='https://cmotions.nl/' target="_blank">Cmotions.</a>
         We tried several text generation models that we were able to load in Colab: a general <a href='https://huggingface.co/gpt2-medium' target='_blank'>GPT2-medium</a> model, the Eleuther AI small-sized GPT model <a href='https://huggingface.co/EleutherAI/gpt-neo-125M' target='_blank'>GPT-Neo</a> and the new kid on the block build by the <a href='https://bigscience.notion.site/BLOOM-BigScience-176B-Model-ad073ca07cdf479398d5f95d88e218c4' target='_blank'>Bigscience</a> initiative <a href='https://huggingface.co/bigscience/bloom-560m' target='_blank'>BLOOM 560m</a>.
-        Further we've put together a <a href='https://huggingface.co/datasets/cmotions/Beatles_lyrics' target='_blank'> Huggingface dataset</a> containing all known lyrics created by The Beatles. <a href='https://www.theanalyticslab.nl/blogs/' target='_blank'>
+        Further we've put together a <a href='https://huggingface.co/datasets/cmotions/Beatles_lyrics' target='_blank'> Huggingface dataset</a> containing all known lyrics created by The Beatles. Currently we are fine-tuning models and are evaluating the results. Once finished we will publish a blog at this <a href='https://www.theanalyticslab.nl/blogs/' target='_blank'>location</a> with all the steps we took including a Python notebook using Huggingface.
         The default output contains 100 tokens and has a repetition penalty of 1.0.
         </p>"""
 
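The article text above mentions a default repetition penalty of 1.0, which is a no-op. As a toy sketch of what that parameter means in common text-generation samplers (this mirrors the usual logits-processor behavior; the function name and values here are illustrative, not code from this Space): logits of tokens that already appeared are divided by the penalty when positive and multiplied by it when negative, so 1.0 leaves scores untouched and larger values discourage repetition.

```python
def apply_repetition_penalty(logits, generated_ids, penalty=1.0):
    """Toy repetition penalty: push down the scores of tokens that
    have already been generated. penalty == 1.0 changes nothing."""
    out = list(logits)
    for tid in set(generated_ids):
        if out[tid] > 0:
            out[tid] /= penalty   # positive logits shrink toward 0
        else:
            out[tid] *= penalty   # negative logits become more negative
    return out

# The default mentioned in the article text (1.0) leaves logits unchanged:
logits = [2.0, -1.0, 0.5]
assert apply_repetition_penalty(logits, [0, 1], penalty=1.0) == logits
```

With a penalty above 1.0, previously generated tokens become less likely on the next step, which is why the Space exposes it as a knob alongside the 100-token output length.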
