AI gives paralyzed patients robotic control
PLUS: Design and build a mobile app with Figma AI
Good morning, AI enthusiasts. UCLA engineers just turned a standard EEG cap into a mind-reading device powerful enough to control robotic limbs.
By pairing it with AI that interprets intent in real time, they've given paralyzed users abilities that used to require brain surgery — and a window into the near future of non-invasive assistive technology.
In today’s AI rundown:
AI helps paralyzed patients control robots
AI’s favorite buzzwords seep into everyday speech
Design and build mobile apps with Figma AI
MIT’s AI to predict flu vaccine success
4 new AI tools, community workflows, and more
LATEST DEVELOPMENTS
BRAIN-COMPUTER INTERFACES

Image source: UCLA
The Rundown: UCLA engineers just created a wearable brain-computer interface that uses AI to interpret EEG signals, enabling paralyzed users to control robotic arms with their thoughts and without any invasive surgery.
The details:
Researchers paired a custom EEG decoder with a camera-based AI to interpret a patient’s movement intent in real time.
They tested the BCI with four users, including one paralyzed participant who completed a robotic arm task in 6.5 minutes with AI assistance, a task they could not complete without it.
Participants moved cursors to targets and directed robotic arms to relocate blocks, completing both tasks nearly 4x faster with AI assistance.
The system used standard EEG caps, eliminating surgical risks while still achieving performance levels similar to the invasive alternatives.
Why it matters: Decades after the first brain implants, we're finally seeing non-invasive BCIs that actually work — with AI filling the gaps where brain signals fail. AI co-pilots will eventually help not just with robotic limbs but in wheelchairs, communication devices, and smart homes that anticipate needs before users even think them.
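For readers curious about what "AI filling the gaps" can look like in practice, below is a minimal, hypothetical Python sketch of shared control: a noisy velocity decoded from EEG is blended with an assistance vector pointing toward the target a camera-based model believes the user intends to reach. Nothing here is taken from the UCLA system; every function, variable, and parameter value is an illustrative stand-in.

```python
import numpy as np

def decode_eeg_velocity(eeg_window: np.ndarray) -> np.ndarray:
    """Stand-in for a trained EEG decoder: maps a window of EEG samples
    to a noisy 2-D velocity command. A real decoder would be a learned
    model; here it's faked with a tanh readout plus noise."""
    return np.tanh(eeg_window.mean(axis=0)[:2]) + np.random.normal(0.0, 0.2, size=2)

def assistance_vector(position: np.ndarray, inferred_target: np.ndarray) -> np.ndarray:
    """Vision-side 'co-pilot': a unit vector toward the target the camera
    model believes the user is reaching for."""
    direction = inferred_target - position
    norm = np.linalg.norm(direction)
    return direction / norm if norm > 1e-6 else np.zeros(2)

def shared_control_step(position, eeg_window, inferred_target, alpha=0.6, dt=0.05):
    """Blend decoded user intent with AI assistance.
    alpha = 0 is pure user control, alpha = 1 is pure AI control."""
    user_v = decode_eeg_velocity(eeg_window)
    assist_v = assistance_vector(position, inferred_target)
    blended = (1.0 - alpha) * user_v + alpha * assist_v
    return position + dt * blended

# Toy run: drive a cursor toward a target the AI has inferred from camera input.
pos, target = np.zeros(2), np.array([1.0, 0.5])
for _ in range(200):
    fake_eeg = np.random.normal(0.0, 1.0, size=(128, 8))  # 128 samples x 8 channels
    pos = shared_control_step(pos, fake_eeg, target)
print("final cursor position:", pos)
```

The blending weight is the interesting design choice: too much assistance and the user loses agency, too little and the noisy EEG signal alone makes tasks slow or impossible.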
TOGETHER WITH VANTA
The Rundown: In today's AI boom, shipping fast gets attention — but building trust gets results. Vanta and Mercury show how SOC 2, audit-ready financials, and risk controls have become day-one signals that fuel growth, not later-stage checkboxes.
In this virtual event, learn to:
Pass investor diligence with confidence and fewer follow-ups
Avoid security and procurement blockers before they stall deals
Stay ahead of compliance without hiring full teams
Register now to build enterprise-grade credibility without slowing down.
AI RESEARCH

Image source: Ideogram / The Rundown
The Rundown: A new study from Florida State University researchers found that AI-favored buzzwords have seen massive surges in podcast conversations since ChatGPT's 2022 launch, calling the linguistic changes a “seep-in effect.”
The details:
The study analyzed 22.1M words from unscripted content like podcasts, finding that 75% of AI-associated terms showed usage increases after ChatGPT's release.
The research tracked science and tech podcasts where hosts likely use ChatGPT regularly, making them early indicators of the linguistic changes.
Words flagged included “boast”, “meticulous”, and “delve”, with experts attributing them to AI training on large amounts of corporate and web content.
A separate German study found similar results, with words like “delve” and “meticulous” also seeing upticks in YouTube and podcast content.
Why it matters: A few years is all it took for AI to start rewiring how humans talk to each other. Today, it's buzzwords creeping into podcasts, but tomorrow expect AI's fingerprints everywhere — from web designs converging on similar AI-generated patterns to developers writing most of their code with agentic platforms.
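To make the "seep-in effect" concrete, here's a rough, hypothetical Python sketch of the kind of frequency comparison such studies rely on: count how often flagged words appear per million words in transcripts dated before and after ChatGPT's launch. The word list and the two tiny example "corpora" below are placeholders, not the FSU data or code.

```python
import re
from collections import Counter

# A partial, illustrative list of words the study associates with AI writing.
FLAGGED = {"delve", "boast", "meticulous"}

def per_million(transcripts: list[str]) -> dict[str, float]:
    """Frequency of each flagged word per million words in a corpus."""
    counts, total = Counter(), 0
    for text in transcripts:
        words = re.findall(r"[a-z']+", text.lower())
        total += len(words)
        counts.update(w for w in words if w in FLAGGED)
    return {w: 1e6 * counts[w] / max(total, 1) for w in FLAGGED}

# Placeholder corpora; in the study these would be podcast transcripts
# dated before vs. after ChatGPT's November 2022 release.
pre_2023 = ["We talked about the new phone and what the camera can do."]
post_2023 = ["Let's delve into the meticulous design choices they boast about."]

before, after = per_million(pre_2023), per_million(post_2023)
for word in sorted(FLAGGED):
    print(f"{word}: {before[word]:.1f} -> {after[word]:.1f} per million words")
```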
AI TRAINING
The Rundown: In this tutorial, you will learn how to use Figma AI to design a complete mobile app from simple text prompts, turning ideas into interactive prototypes with working buttons and states in seconds.
Step-by-step:
Go to Figma.com, sign in, and click "Make" in the top menu — you'll see a chat box asking "What do you want to make?"
Type your app idea or start with a template: "Create an app for tennis players to find community courts, track stats, and share activity, similar to Strava but for tennis"
Refine with specific design directions: "Clean white background, deep green text, clay beige accents, small pops of neon yellow" — Figma adjusts spacing, padding, and corners automatically
Add features by prompting: "Add calendar integration for booking courts via Cal.com" — Figma AI suggests Supabase for auth and creates logical button interactions
Preview in a new window, share with teammates, or export the design to Cursor for full-stack development
Pro tip: Think of Figma AI as your junior designer; the clearer your direction, the better the result. Each refinement gets you closer to production-ready designs that already have working interactions built in.
PRESENTED BY FUEL iX
The Rundown: 57% of employees enter sensitive data into public GenAI tools like ChatGPT, Claude, and Gemini at work. Fuel iX’s new report, Demystifying Shadow AI in the Workplace, uncovers how employees are using GenAI and the hidden risks this creates for enterprises.
In this report, you’ll discover how to:
Assess your organization’s exposure to Shadow AI
Identify the key security risks of employees using public GenAI tools
Apply expert strategies to manage AI adoption safely and responsibly
AI RESEARCH

Image source: Ideogram / The Rundown
The Rundown: MIT researchers created VaxSeer, an AI system that predicts which flu strains will dominate future seasons and identifies the most protective vaccine candidates months in advance.
The details:
The system uses deep learning trained on decades of viral sequences and lab test data to forecast strain dominance and vaccine effectiveness.
In testing against past flu seasons, VaxSeer beat the WHO's vaccine picks 15 out of 20 times across two major flu types.
The system also spotted a winning vaccine formula in 2016 that health officials didn't choose until the following year.
VaxSeer's predictions matched up strongly with how well vaccines actually worked when given to real patients.
Why it matters: Because vaccines have to be produced months ahead of flu season, choosing the right strains is largely a guessing game, which often results in hit-or-miss effectiveness. With VaxSeer's ability to read patterns humans miss, targeting the right strains could mean far fewer illnesses come flu season.
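VaxSeer's actual models aren't shown here, but the selection logic described above can be sketched in a few lines of Python: weight each candidate vaccine's predicted effectiveness against each strain by that strain's predicted dominance, then recommend the candidate with the highest expected coverage. All strain names and numbers below are invented for illustration.

```python
# Hypothetical inputs: one model forecasts how dominant each strain will be
# next season; another predicts each candidate vaccine's effectiveness per strain.
predicted_dominance = {"H3N2-cladeA": 0.55, "H3N2-cladeB": 0.30, "H1N1-x": 0.15}

predicted_effectiveness = {
    "candidate_1": {"H3N2-cladeA": 0.62, "H3N2-cladeB": 0.40, "H1N1-x": 0.55},
    "candidate_2": {"H3N2-cladeA": 0.48, "H3N2-cladeB": 0.70, "H1N1-x": 0.50},
}

def expected_coverage(effectiveness: dict) -> float:
    """Dominance-weighted effectiveness across the forecast strains."""
    return sum(predicted_dominance[s] * effectiveness[s] for s in predicted_dominance)

for candidate, eff in predicted_effectiveness.items():
    print(candidate, "expected coverage:", round(expected_coverage(eff), 3))

best = max(predicted_effectiveness, key=lambda c: expected_coverage(predicted_effectiveness[c]))
print("recommended candidate:", best)
```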
QUICK HITS
💬 Hunyuan-MT - Tiny, open-source SOTA translation model
🍥 USO - ByteDance’s creative model for style and subject generations
🗣️ Higgsfield Speak 2.0 - Make avatars speak with lip-sync and motion
🔊 MAI-Voice-1 - Microsoft’s new in-house voice generation model
Honeycomb Observability Day SF, Sep. 11 – Join Charity Majors & Liz Fong-Jones to explore the future of observability in the age of AI. Save your spot.*
OpenAI is reportedly in talks to build a datacenter of at least 1GW in India as part of its Stargate initiative, with CEO Sam Altman set to visit the country this month.
Tencent released Hunyuan-MT-7B and Hunyuan-MT-Chimera, an open-source joint AI translation system that outperforms rivals in its size category across 33 languages.
CEO Marc Benioff revealed that Salesforce has reduced its support headcount by 45% this year, using AI agents to handle lead response and customer conversations.
Chinese President Xi Jinping spoke on AI at the Shanghai Cooperation Organization summit, calling for global cooperation and rejecting the “Cold War mentality” around the tech.
*Sponsored Listing
COMMUNITY
Every newsletter, we showcase how a reader is using AI to work smarter, save time, or make life easier.
Today’s workflow comes from reader Hunain A. in Kuala Lumpur, Malaysia:
“I record videos for clients, tons of them. They need to be uploaded consistently every day, and it's all personal branding: scripting, lighting, filming, cutting, editing, then publishing. This took a huge amount of my time, plus approval cycles from the clients. I decided to automate it and built a full custom N8N automation that does all of it for me. It has already freed up at least 35 hours a week. The automation uses GPT for scripts, HeyGen for avatars, and custom code for auto-editing. All that's left now is a couple of approval cycles, which I hope to automate too, and publishing to platforms.”
How do you use AI? Tell us here.
Read our last AI newsletter: xAI sues ex-engineer for trade secret theft
Read our last Tech newsletter: Netflix goes full-on theme park
Read our last Robotics newsletter: Nvidia’s palm-sized ‘robot brain’
Today’s AI tool guide: Design and build a mobile app with Figma AI
Join our next live workshop: Building professional AI automation workflows
That's it for today! Before you go, we’d love to know what you thought of today's newsletter to help us improve The Rundown experience for you.
See you soon,
Rowan, Joey, Zach, Shubham, and Jennifer — the humans behind The Rundown
