GPT-NeoX-20B – Open-source large language model by EleutherAI (Interview w/ co-founder Connor Leahy)



#eleuther #gptneo #gptj

EleutherAI announces GPT-NeoX-20B, a 20-billion-parameter open-source language model inspired by GPT-3. Connor joins me to discuss the training process, how the group got its hands on the necessary hardware, what the new model can do, and how anyone can try it out!

OUTLINE:
0:00 – Intro
1:00 – Start of interview
2:00 – How did you get all the hardware?
3:50 – What’s the scale of this model?
6:00 – A look into the experimental results
11:15 – Why are there GPT-Neo, GPT-J, and GPT-NeoX?
14:15 – How difficult is training these big models?
17:00 – Try out the model on GooseAI
19:00 – Final thoughts

Read the announcement:
Try out the model:
Check out EleutherAI:
Read the code:
Hardware sponsor:

Links:
TabNine Code Completion (Referral):
YouTube:
Twitter:
Discord:
BitChute:
LinkedIn:
BiliBili:

If you want to support me, the best thing to do is to share the content 🙂

If you want to support me financially (completely optional and voluntary, but a lot of people have asked for this):
SubscribeStar:
Patreon:
Bitcoin (BTC): bc1q49lsw3q325tr58ygf8sudx2dqfguclvngvy2cq
Ethereum (ETH): 0x7ad3513E3B8f66799f507Aa7874b1B0eBC7F85e2
Litecoin (LTC): LQW2TRyKYetVC8WjFkhpPhtpbDM4Vw7r9m
Monero (XMR): 4ACL8AGrEo5hAir8A9CeVrW8pEauWvnp1WnSDZxW7tziCDLhZAGsgzhRQABDnFy8yuM9fWJDviJPHKRjV4FWt19CJZN9D4n
