Featherless Feud: a dip into LLM-powered game development
rebuilding a TV classic
Aug 20 • Erik Cadieux and Wesley George
minmodmon: A quickstart to local RWKV
In April we launched our RWKV-based model, EagleX v2. EagleX goes toe-to-toe with modern transformers on performance, while being much cheaper to run…
Aug 14 • Layl Bongers
Model Support Summary + new this week
Nemo 12B and Qwen2 32B are the latest additions, for a total of 12 model families
Aug 3 • Wesley George
June 2024
🪶 Featherless.ai referral program
Invite a friend, and if they sign up for a plan, it's $10 off both of your next bills
Jun 24 • Recursal AI
🚀 Launching 🪶 Featherless.AI
Run any 🦙 model from Hugging Face, instantly.
Jun 24 • Recursal AI
May 2024
Featherless: an introduction
making every Hugging Face model available for inference, and why it matters
May 31 • Wesley George
April 2024
🦅 EagleX v2 : Soaring past LLaMA2 7B in both English and Multi-lang evals (RWKV-v5)
You have seen the teaser with the EagleX 1.7T; now it's here: the definitive version of the linear transformer, trained past LLaMA 2 7B.
Published on RWKV Open Source Development Blog • Apr 18
🦅 Eagle & 🐦 Finch - architecture paper is here
Available at your local arXiv
Published on RWKV Open Source Development Blog • Apr 10
Dear VCs, please stop throwing money at AI founders with no commercial plan besides AGI
Execute a commercial plan from day 0, not a year from now when the bank account is empty
Published on Tech Talk CTO • Apr 8
March 2024
🦅 EagleX v1 : Soaring past LLaMA 7B 2T in both English and Multi-lang evals (RWKV-v5)
A linear transformer has just crossed the gold standard in transformer models, LLaMA 7B, with fewer tokens trained, in both English and multi-lingual evals…
Mar 16 • Eugene Cheah
January 2024
Launching Eagle 7B - into our public demo and OpenRouter (till March 2024)
Bringing the world's strongest multi-lingual model to the world
Jan 30 • Eugene Cheah
🦅 Eagle 7B : Soaring past Transformers with 1 Trillion Tokens Across 100+ Languages (RWKV-v5)
A brand new era for the RWKV-v5 architecture and linear transformers has arrived, with the strongest multi-lingual model in open source today
Published on RWKV Open Source Development Blog • Jan 29