Sweet, thank you David
Can you train a Gemma 4 26B A4B model? I like your models, David. I like how you expand upon existing models, experiment, and make improvements. (I was a huge fan of your Dark Champion Llama series, where you had the creative idea of training multiple different Llamas and putting them together as an MoE model. That was one of my favorites for a long time.)

This Gemma 4 release is really awesome. The 26B A4B model runs so smoothly on my setup, and I was wondering if you could work your magic on this MoE model and see if you could make improvements on it. I know you're focused on the big one, the 31B, but this MoE model is so awesome it's hard to ignore. I like it because it's fast. It's really creative, it's accurate, and it's super fast. The quality of the answers it puts out is nearly that of the 70B+ to 100B+ parameter LLMs; it's that good already.

I was wondering if maybe you could do some training on this specific model and combine some really good uncensored ones (the Bartowski one is pretty good; it's what I'm currently using). Is there a way you could frankenmerge some of the best-performing ones of this 26B-a4b-it? I see there are a few ongoing attempts to train this Gemma 4 on Claude Opus 4.6 datasets; can you do that? The attempts I tried were broken, but most of your releases are pretty solid (at least 80% of the time; I know it's all a work in progress and all about experimenting around with stuff). Think you could do something with that model? Thanks, David
Google solves the RAM crisis by releasing frontier-AI-performance-tier Gemma 4 that takes up as little space as small language models. Q.E.D.: the other companies that were trying to scale up, rather than work on better compression like Google did, are now having to play catch-up to Google lol.
Anyways, thanks again, Google
I'm impressed: extremely intelligent models, frontier-level intelligence without being a resource hog like the 70B+ parameter models. Fast, responsive, intelligent, flexible, usable on multiple platforms, and a huge leap up from Gemma 3; I can really tell. This feels like bringing the 70B+ parameter models to home consumers. Good job, Google, and good job siding with the people instead of the big data centers for once. More of this, fewer data centers. As tech advances, the goal is to make it take up less space and be less obtrusive in our lives. This achieves that. Thank you, Google. For once, someone hit it out of the park. This levels the playing field and gives everyone with a halfway decent system access to AI. It makes it so everyone has access, not just rich people, and not just exploiters who rent out software as a service. Good job.