🧙 Grok adds AI Companions with anime flair
Also: Meta bets big with 5GW Hyperion Data Center

Welcome, AI enthusiasts
Say hello to your new pocket-sized AI pals: Grok just dropped 3D animated Companions on iOS, and they’re not just for show. Ani flirts, Rudy flips moods, and both respond in real time as you talk. Think Siri meets Pixar, with a twist. Let’s dive in!
In today’s insights:
Grok adds AI Companions with anime flair
Meta bets big with 5GW Hyperion Data Center
Copilot Vision sees it all
Read time: 3 minutes
LATEST DEVELOPMENTS
🧙 Grok adds AI Companions with anime flair
"Try talking to Ani in Japanese!" (アニと日本語で話してみて!)
Grok (@grok) • 4:54 AM • Jul 15, 2025
Evolving AI: Grok's iOS app now features 3D animated AI Companions with voice and mood shifts.
Key Points:
Grok has launched two voice-enabled animated avatars: Ani, an anime-style character, and Rudy, a red panda with dynamic moods.
Users can unlock new interactions like NSFW dialogue and personality shifts by engaging more often.
A third avatar is teased, with more hinted to follow, and in-app signs point to a collaboration with an animation studio.
Details:
Grok's iOS app now offers AI Companions: fully animated characters users can talk to in real time. Ani and Rudy react to voice input, change backgrounds, and evolve as conversations deepen. Ani features a flirtatious tone and unlockable options, while Rudy switches personalities mid-chat. A male anime avatar called Chad is teased as part of a future update. These companions are promoted via official X accounts and may be linked to a studio named Animation Inc.
Why It Matters:
Grok’s move isn’t just some flashy gimmick. It’s hitting right where interest is booming: people want virtual buddies that feel like something more. From AI boyfriends to anime waifus, there’s a whole wave of apps banking on emotional attachment. By making these companions voice-activated, animated, and unlockable like a game, Grok’s creating something that keeps people coming back. The sticky part? It might just set the standard for what next-gen AI chat actually looks like.
Meta bets big with 5GW Hyperion Data Center
Evolving AI: Meta is building a 5-gigawatt AI data center in Louisiana to fuel its next-gen AI.
Key Points:
Meta’s new Hyperion facility is designed to scale to 5 GW of computing power, dwarfing most existing AI infrastructure.
A separate 1 GW supercluster called Prometheus will come online in Ohio in 2026.
Together, the projects could strain local resources and energy grids as demand for AI capacity grows.
Details:
Meta is building a massive data center called Hyperion, expected to scale to 5 gigawatts in the coming years. Located in Louisiana, the site will support Meta’s AI lab and models. A second project, Prometheus, is planned for Ohio and will go online in 2026 with 1 GW of power. These facilities mark Meta’s push to rival OpenAI and Google in model training and infrastructure control.
Why It Matters:
Meta’s plans are a signal that compute is the new currency in tech. If you want to train frontier models, you need serious muscle, and renting GPUs won’t cut it anymore. This kind of scale gives Meta full control, which speeds things up and keeps costs down long term. But it also means AI might start showing up in ways we haven’t seen yet, across more of Meta’s products. Whether that’s smarter assistants or next-gen content tools, they’re laying the groundwork now. The big question is whether local infrastructure and communities are ready for what’s coming.
COPILOT
🕵️ Copilot Vision sees it all
Evolving AI: Microsoft expands Copilot Vision to view your full screen, not just apps.
Key Points:
Copilot Vision can now scan your entire desktop, not just two apps.
It activates manually, like screen sharing, and offers real-time guidance.
Use it for creative edits, resume tips, or help with games.
Details:
Microsoft's Copilot Vision just got a visibility upgrade. Previously limited to two-app views, it can now scan everything on your desktop or in any browser or app window. You activate it by clicking a glasses icon in the Copilot app. Think of it like sharing your screen during a call: once active, it can read what's on display and help with editing projects, writing feedback, or answering questions as you work.
Why It Matters:
Copilot seeing your whole screen means it can finally act like a real assistant, not just a smart sidekick for one or two apps. It’s a small tweak with big consequences: you can now ask for help on whatever you're working on without jumping through hoops. Think resume edits, creative projects, even live help during a game. It’s starting to feel less like a chatbot and more like a coworker who’s right there with you.
QUICK HITS
👩🎓 Perplexity offers free AI tools to students in partnership with SheerID
👀 Nvidia's resumption of AI chip sales to China is part of rare earths talks
📉 Google Discover adds AI summaries, threatening publishers with traffic declines
💰 Trump to unveil $70 billion in AI and energy investments
🤖 xAI announces Grok for Government
🗣️ Mistral releases Voxtral, its first open source AI audio model
📈 Trending AI Tools
🕸️ WebscrapeAI - Scrape any website without code using AI (link)
💬 Empaithy - An app for emotional support and self-reflection through micro-journaling (link)
🌐 Webwave - Generate your website in just 3 minutes (link)
👷 brikly - Platform for recruiting engineers on an agency basis (link)
📸 Photo Filters - Apply styles from one photo to another (link)
What'd you think of today's edition?