Robotics

Hugging Face Launches $299 Robot That Could Disrupt Entire Robotics Industry (venturebeat.com)

An anonymous reader quotes a report from VentureBeat: Hugging Face, the $4.5 billion artificial intelligence platform that has become the GitHub of machine learning, announced Tuesday the launch of Reachy Mini, a $299 desktop robot designed to bring AI-powered robotics to millions of developers worldwide. The 11-inch humanoid companion represents the company's boldest move yet to democratize robotics development and challenge the industry's traditional closed-source, high-cost model.

The announcement comes as Hugging Face crosses a significant milestone of 10 million AI builders using its platform, with CEO Clement Delangue revealing in an exclusive interview that "more and more of them are building in relation to robotics." The compact robot, which can sit on any desk next to a laptop, addresses what Delangue calls a fundamental barrier in robotics development: accessibility. "One of the challenges with robotics is that you know you can't just build on your laptop. You need to have some sort of robotics partner to help in your building, and most people won't be able to buy $70,000 robots," Delangue explained, referring to traditional industrial robotics systems and even newer humanoid robots like Tesla's Optimus, which is expected to cost $20,000-$30,000.

Reachy Mini emerges from Hugging Face's April acquisition of French robotics startup Pollen Robotics, marking the company's most significant hardware expansion since its founding. The robot represents the first consumer product to integrate natively with the Hugging Face Hub, allowing developers to access thousands of pre-built AI models and share robotics applications through the platform's "Spaces" feature. [...] Reachy Mini packs sophisticated capabilities into its compact form factor. The robot features six degrees of freedom in its moving head, full body rotation, animated antennas, a wide-angle camera, multiple microphones, and a 5-watt speaker. The wireless version includes a Raspberry Pi 5 computer and battery, making it fully autonomous. The robot ships as a DIY kit and can be programmed in Python, with JavaScript and Scratch support planned. Pre-installed demonstration applications include face and hand tracking, smart companion features, and dancing moves. Developers can create and share new applications through Hugging Face's Spaces platform, potentially creating what Delangue envisions as "thousands, tens of thousands, millions of apps."
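The robot ships as a DIY kit programmable in Python, but the article doesn't quote the SDK itself. As a rough sketch only, the stub below invents a minimal `Reachy`-like class (every name here is hypothetical, not the real API) to show what a tiny face-tracking app loop could look like:

```python
# Illustrative sketch only: the Reachy class is an invented stand-in,
# NOT the real Reachy Mini SDK.
from dataclasses import dataclass

@dataclass
class Reachy:
    head_yaw: float = 0.0  # degrees; the real head has six degrees of freedom

    def look_at(self, x: float) -> None:
        """Turn the head toward a face detected at x in [-1, 1] (image space)."""
        self.head_yaw = max(-45.0, min(45.0, x * 45.0))  # clamp to a safe range

def track_faces(robot: Reachy, detections: list[float]) -> float:
    """Re-aim the head at each detection in turn; return the final yaw."""
    for x in detections:
        robot.look_at(x)
    return robot.head_yaw

print(track_faces(Reachy(), [0.2, -0.5, 1.5]))  # out-of-range value clamped to 45.0
```

Apps along these lines would then be shared through Spaces, per the article; the real SDK's names and units will differ.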
Reachy Mini's $299 price point could significantly transform robotics education and research. "Universities, coding bootcamps, and individual learners could use the platform to explore robotics concepts without requiring expensive laboratory equipment," reports VentureBeat. "The open-source nature enables educational institutions to modify hardware and software to suit specific curricula. Students could progress from basic programming exercises to sophisticated AI applications using the same platform, potentially accelerating robotics education and workforce development."

"... For the first time, a major AI platform is betting that the future of robotics belongs not in corporate research labs, but in the hands of millions of individual developers armed with affordable, open-source tools."
AI

People Are Using AI Chatbots To Guide Their Psychedelic Trips

An anonymous reader quotes a report from Wired: Trey had struggled with alcoholism for 15 years, eventually drinking heavily each night before quitting in December. But staying sober was a struggle for the 36-year-old first responder from Atlanta, who did not wish to use his real name due to professional concerns. Then he discovered Alterd, an AI-powered journaling app that invites users to "explore new dimensions" geared towards psychedelics and cannabis consumers, meditators, and alcohol drinkers. In April, using the app as a tripsitter -- a term for someone who soberly watches over another while they trip on psychedelics to provide reassurance and support -- he took a huge dose of 700 micrograms of LSD. (A typical recreational dose is considered to be 100 micrograms.) "I went from craving compulsions to feeling true freedom and not needing or wanting alcohol," he says.

He recently asked the app's "chat with your mind" function how he had become more wise through all his AI-assisted psychedelic trips. It responded: "I trust my own guidance now, not just external rules or what others think. I'm more creative, less trapped by fear, and I actually live by my values, not just talk about them. The way I see, reflect, and act in the world is clearer and more grounded every day." "It's almost like your own self that you're communicating with," says Trey, adding he's tripped with his AI chatbot about a dozen times since April. "It's like your best friend. It's kind of crazy."
The article mentions several different chatbot tools and AI systems that are being used for psychedelic therapy.

ChatGPT: "Already, many millions of people are using ChatGPT on a daily basis, and the developments may have helped democratize access to psychotherapy-style guidance, albeit in a dubious Silicon Valley style with advice that is often flush with untruths," reports Wired. The general-purpose AI chatbot is being used for emotional support, intention-setting, and even real-time guidance during psychedelic trips. While not designed for therapy, it has been used informally as a trip companion, offering customized music playlists, safety reminders, and existential reflections. Experts caution that its lack of emotional nuance and clinical oversight poses significant risks during altered states.

Alterd: Alterd is a personalized AI journal app that serves as a reflective tool by analyzing a user's entries, moods, and behavior patterns. Its "mind chat" function acts like a digital subconscious, offering supportive insights while gently confronting negative habits like substance use. Users credit it with deepening self-awareness and maintaining sobriety, particularly in the context of psychedelic-assisted growth.

Mindbloom's AI Copilot: Integrated into Mindbloom's at-home ketamine therapy program, the AI copilot helps clients set pretrip intentions, process post-trip emotions, and stay grounded between sessions. It generates custom reflections and visual art based on voice journals, aiming to enhance the therapeutic journey even outside of human-guided sessions. The company plans to evolve the tool into a real-time, intelligent assistant capable of interacting more dynamically with users.

Orb AI/Shaman Concepts (Speculative): Conceptual "orb" interfaces imagine an AI-powered, shaman-like robot facilitating various aspects of psychedelic therapy, from intake to trip navigation. While still speculative, such designs hint at a future where AI plays a central, embodied role in guiding altered states. These ideas raise provocative ethical and safety questions about replacing human presence with machines in deeply vulnerable psychological contexts.

AI in Virtual Reality and Brain Modulation Systems: Researchers are exploring how AI could coordinate immersive virtual reality environments and brain-modulating devices to enhance psychedelic therapy. These systems would respond to real-time emotional and physiological signals, using haptic suits and VR to deepen and personalize the psychedelic experience. Though still in the conceptual phase, this approach represents the fusion of biotech, immersive tech, and AI in pursuit of therapeutic transformation.
Businesses

Amazon Deploys Its One Millionth Robot, Releases Generative AI Model (techcrunch.com)

An anonymous reader quotes a report from TechCrunch: After 13 years of deploying robots into its warehouses, Amazon reached a new milestone. The tech behemoth now has 1 million robots in its warehouses, the company announced Monday. This one millionth robot was recently delivered to an Amazon fulfillment facility in Japan. That figure puts Amazon on track to reach another landmark: Its vast network of warehouses may soon have the same number of robots working as people, according to reporting from The Wall Street Journal. The WSJ also reported that 75% of Amazon's global deliveries are now assisted in some way by a robot. Amazon also unveiled a new generative AI model called DeepFleet, built using SageMaker and trained on its own warehouse data, which improves robotic fleet speed by 10% through more efficient route coordination.
AI

How Robotic Hives and AI Are Lowering the Risk of Bee Colony Collapse (phys.org)

alternative_right shares a report from Phys.Org: The unit -- dubbed a BeeHome -- is an industrial upgrade from the standard wooden beehives, all clad in white metal and solar panels. Inside sits a high-tech scanner and robotic arm powered by artificial intelligence. Roughly 300,000 of these units are in use across the U.S., scattered across fields of almond, canola, pistachios and other crops that require pollination to grow. [...] AI and robotics are able to replace "90% of what a beekeeper would do in the field," said Beewise Chief Executive Officer and co-founder Saar Safra. The question is whether beekeepers are willing to swap out equipment that has long been tried and true. [...]

While a new hive design alone isn't enough to save bees, Beewise's robotic hives help cut down on losses by providing a near-constant stream of information on colony health in real time -- and give beekeepers the ability to respond to issues. Equipped with a camera and a robotic arm, they're able to regularly snap images of the frames inside the BeeHome, which Safra likened to an MRI. The amount of data they capture is staggering. Each frame contains up to 6,000 cells where bees can, among other things, gestate larvae or store honey and pollen. A hive contains up to 15 frames and a BeeHome can hold up to 10 hives, providing thousands of data points for Beewise's AI to analyze.
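Those maximums multiply out quickly; a back-of-the-envelope tally in Python (using the article's "up to" figures, so an upper bound):

```python
# Upper-bound tally of observable cells per BeeHome, from the figures above
cells_per_frame = 6_000  # cells for gestating larvae or storing honey/pollen
frames_per_hive = 15     # "up to 15 frames" per hive
hives_per_home = 10      # "up to 10 hives" per BeeHome

cells_per_home = cells_per_frame * frames_per_hive * hives_per_home
print(cells_per_home)  # 900000 cells for Beewise's AI to analyze per unit
```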

While a trained beekeeper can quickly look at a frame and assess its health, AI can do it even faster, as well as take in information on individual bees in the photos. Should AI spot a warning sign, such as a dearth of new larvae or the presence of mites, beekeepers will get an update on an app that a colony requires attention. The company's technology earned it a BloombergNEF Pioneers award earlier this year. "There's other technologies that we've tried that can give us some of those metrics as well, but it's really a look in the rearview mirror," [said Zac Ellis, the senior director of agronomy at OFI, a global food and ingredient seller]. "What really attracted us to Beewise is their ability to not only understand what's happening in that hive, but to actually act on those different metrics."

AI

China Hosts First Fully Autonomous AI Robot Football Match (theguardian.com)

An anonymous reader quotes a report from The Guardian: Four teams of humanoid robots took each other on in Beijing [on Saturday], in games of three-a-side powered by artificial intelligence. While the modern game has faced accusations of becoming near-robotic in its obsession with tactical perfection, the games in China showed that AI won't be taking Kylian Mbappe's job just yet. Footage of the humanoid kickabout showed the robots struggling to kick the ball or stay upright, performing pratfalls that would have earned their flesh-and-blood counterparts a yellow card for diving. At least two robots were stretchered off after failing to regain their feet after going to ground.

[...] The competition was fought between university teams, which adapted the robots with their own algorithms. In the final match, Tsinghua University's THU Robotics defeated the China Agricultural University's Mountain Sea team with a score of 5-3 to win the championship. One Tsinghua supporter celebrated their victory while also praising the competition. "They [THU] did really well," he said. "But the Mountain Sea team was also impressive. They brought a lot of surprises."
Cheng Hao, CEO of Booster Robotics, said he envisions future matches between humans and robots, though he acknowledges current robots still lag behind in performance. He also said safety will need to be a top priority.

You can watch highlights of the match on YouTube.
Medicine

7 People Now Have Neuralink Brain Implant

Seven people have now received Neuralink's N1 brain implant, which enables individuals with ALS or spinal cord injuries to control a computer with their thoughts. PCMag reports: In a February 2025 update, Neuralink confirmed that three people had received its brain-computer interface (BCI). That increased to five by June, when it also reported a $650 million funding round. We're now at seven, Barrow tweeted today; Neuralink retweeted that message.

Six of the seven are participating in the PRIME study, conducted by Barrow, which handles the implantations from its Phoenix, Arizona, office. It aims to prove that the N1 implant, the R1 surgical robot, and the N1 User App on the computer are safe and effective, according to the program brochure. (No BCIs have been approved by the US Food and Drug Administration.)

Participants in the study get the implant through a surgery in which a custom-built robotic arm drills a hole in their skull and implants the device. The implant connects to a computer via Bluetooth, allowing patients to move the cursor, select words to type, browse the web, and even play video games -- a favorite activity of Neuralink's first human patient, Noland Arbaugh, who can do this all without moving any limbs or fingers. [...] Arbaugh, now 31, became paralyzed during a diving accident. Other Neuralink patients include Alex, a former machine parts builder who lost function of his arms and uses his N1 Implant to design 3D machine parts with computer-aided design (CAD). The third patient is Brad, the first person with ALS to receive the N1 implant, according to Barrow.

Mike is the fourth patient, and "the first person with a full-time job to use the N1 Implant," Barrow says. "He worked as a survey technician for city government and spent the majority of his time in the field until his ALS made the work too difficult. Like Alex, Mike has used CAD software with his Neuralink device to continue doing survey work from home and provide for his family." The fifth publicly named patient is RJ, a veteran who became paralyzed after a motorcycle accident, according to the University of Miami. The other two patients remain anonymous, but we can expect Neuralink to continue recruiting more people (here's how to apply).
Medicine

Doctors Perform First Robotic Heart Transplant In US Without Opening a Chest

An anonymous reader quotes a report from Neuroscience News: Surgeons have performed the first fully robotic heart transplant in the U.S., using advanced robotic tools to avoid opening the chest. [...] Using a surgical robot, lead surgeon Dr. Kenneth Liao and his team made small, precise incisions, eliminating the need to open the chest and break the breastbone. Liao removed the diseased heart, and the new heart was implanted through the preperitoneal space, avoiding a chest incision.

"Opening the chest and spreading the breastbone can affect wound healing and delay rehabilitation and prolong the patient's recovery, especially in heart transplant patients who take immunosuppressants," said Liao, professor and chief of cardiothoracic transplantation and circulatory support at Baylor College of Medicine and chief of cardiothoracic transplantation and mechanical circulatory support at Baylor St. Luke's Medical Center. "With the robotic approach, we preserve the integrity of the chest wall, which reduces the risk of infection and helps with early mobility, respiratory function and overall recovery."

In addition to less surgical trauma, the clinical benefits of robotic heart transplant surgery include avoiding excessive bleeding from cutting the bone and reducing the need for blood transfusions, which minimizes the risk of developing antibodies against the transplanted heart. Before the transplant surgery, the 45-year-old patient had been hospitalized with advanced heart failure since November 2024 and required multiple mechanical devices to support his heart function. He received a heart transplant in early March 2025 and after heart transplant surgery, he spent a month in the hospital before being discharged home, without complications.
Businesses

Uber In Talks With Founder Travis Kalanick To Fund Self-Driving Car Deal (nytimes.com)

Facing mounting competition from autonomous taxi services like Waymo, Uber is in early talks to help fund Travis Kalanick's potential acquisition of Pony.ai's U.S. subsidiary (source paywalled; alternative source). If completed, the deal would reunite Kalanick with Uber (now under CEO Dara Khosrowshahi) and position Pony.ai to operate independently of its Chinese parent amid rising U.S. regulatory pressures. The New York Times reports: The company, Pony.ai, was founded in Silicon Valley in 2016 but has its main presence in China, and has permits to operate robot taxis and trucks in the United States and China. The talks are preliminary, said the people, who were not authorized to speak about the confidential conversations. Mr. Kalanick will run Pony if the deal is completed, they said. It is unclear what role, if any, Uber would take in Pony as an investor. Financial details of the potential transaction could not be determined. Pony went public last year in the United States, raising $260 million in a share sale. Its market capitalization stands around $4.5 billion.

If the deal goes through, Mr. Kalanick, 48, will remain in his day job running CloudKitchens, a virtual restaurant start-up that he founded after leaving Uber in 2017. He would also work more closely with Dara Khosrowshahi, who took over as Uber's chief executive after Mr. Kalanick's ouster. The discussions are the starkest sign yet that Uber is under pressure from Waymo, the driverless car unit spun out of Google, and other autonomous car services. When Mr. Kalanick was Uber's chief executive, the company tried developing autonomous vehicle technology. It then bought Otto, a self-driving trucking start-up run by Anthony Levandowski, a former Google engineer. Google later sued Mr. Levandowski for theft of trade secrets and sued Uber to bar it from using its self-driving technology.

Under Mr. Khosrowshahi, Uber has taken a different tack to self-driving cars. The company has struck roughly 18 partnerships with autonomous vehicle companies like Wayve, May Mobility and WeRide to bring pilot programs for driverless car services into Europe, the Middle East and Asia. The goal, Mr. Khosrowshahi has said in podcast interviews, has been to put "as many cars on Uber's network as possible." He has maintained that while autonomous vehicles are growing steadily, ride-hailing networks will have both human and robot drivers for years.

Robotics

Swarms of Tiny Nose Robots Could Clear Infected Sinuses, Researchers Say (theguardian.com)

An anonymous reader quotes a report from The Guardian: Swarms of tiny robots, each no larger than a speck of dust, could be deployed to cure stubborn infected sinuses before being blown out through the nose into a tissue, researchers have claimed. The micro-robots are a fraction of the width of a human hair and have been inserted successfully into animal sinuses in pre-clinical trials by researchers at universities in China and Hong Kong. Swarms are injected into the sinus cavity via a duct threaded through the nostril and guided to their target by electromagnetism, where they can be made to heat up and catalyze chemical reactions to wipe out bacterial infections. There are hopes the precisely targeted technology could eventually reduce reliance on antibiotics and other generalized medicines.

[...] The latest breakthrough, based on animal rather than human trials, involves magnetic particles "doped" with copper atoms which clinicians insert with a catheter before guiding to their target under a magnetic field. The swarms can be heated up by reacting to light from an optical fibre that is also inserted into the body as part of the therapy. This allows the micro-robots to loosen up and penetrate viscous pus that forms a barrier to the infection site. The light source also prompts the micro-robots to disrupt bacterial cell walls and release reactive oxygen species that kill the bacteria.

The study, published in Science Robotics, showed the robots were capable of eradicating bacteria from pig sinuses and could clear infections in live rabbits with "no obvious tissue damage." The researchers have produced a model of how the technology could work on a human being, with the robot swarms being deployed in operating theatre conditions, allowing doctors to see their progress by using X-rays. Future applications could include tackling bacterial infections of the respiratory tract, stomach, intestine, bladder and urethra, they suggested. "Our proposed micro-robotic therapeutic platform offers the advantages of non-invasiveness, minimal resistance, and drug-free intervention," they said.

Robotics

Google Rolls Out New Gemini Model That Can Run On Robots Locally

Google DeepMind has launched Gemini Robotics On-Device, a new language model that enables robots to perform complex tasks locally without internet connectivity. TechCrunch reports: Building on the company's previous Gemini Robotics model that was released in March, Gemini Robotics On-Device can control a robot's movements. Developers can control and fine-tune the model to suit various needs using natural language prompts. In benchmarks, Google claims the model performs at a level close to the cloud-based Gemini Robotics model. The company says it outperforms other on-device models in general benchmarks, though it didn't name those models.

In a demo, the company showed robots running this local model doing things like unzipping bags and folding clothes. Google says that while the model was trained for ALOHA robots, it later adapted it to work on a bi-arm Franka FR3 robot and the Apollo humanoid robot by Apptronik. Google claims the bi-arm Franka FR3 was successful in tackling scenarios and objects it hadn't "seen" before, like doing assembly on an industrial belt. Google DeepMind is also releasing a Gemini Robotics SDK. The company says developers can train robots on new tasks by showing them 50 to 100 demonstrations, using these models in the MuJoCo physics simulator.
The Military

Denmark Tests Unmanned Robotic Sailboat Fleet (apnews.com)

Denmark has deployed four uncrewed robotic sailboats (known as "Voyagers") for a three-month trial to boost maritime surveillance amid rising tensions in the Baltic region. The Associated Press reports: Built by Alameda, California-based company Saildrone, the vessels will patrol Danish and NATO waters in the Baltic and North Seas, where maritime tensions and suspected sabotage have escalated sharply since Russia's full-scale invasion of Ukraine on Feb. 24, 2022. Two of the Voyagers launched Monday from Koge Marina, about 40 kilometers (25 miles) south of the Danish capital, Copenhagen. Powered by wind and solar energy, these sea drones can operate autonomously for months at sea. Saildrone says the vessels carry advanced sensor suites -- radar, infrared and optical cameras, sonar and acoustic monitoring. Their launch comes after two others already joined a NATO patrol on June 6.

Saildrone founder and CEO Richard Jenkins compared the vessels to a "truck" that carries sensors and uses machine learning and artificial intelligence to give a "full picture of what's above and below the surface" to about 20 to 30 miles (30 to 50 kilometers) in the open ocean. He said that maritime threats like damage to undersea cables, illegal fishing and the smuggling of people, weapons and drugs are going undetected simply because "no one's observing it." Saildrone, he said, is "going to places ... where we previously didn't have eyes and ears." The Danish Defense Ministry says the trial is aimed at boosting surveillance capacity in under-monitored waters, especially around critical undersea infrastructure such as fiber-optic cables and power lines.

Robotics

Scientists Built a Badminton-Playing Robot With AI-Powered Skills (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: The robot built by [Yuntao Ma and his team at ETH Zurich] was called ANYmal and resembled a miniature giraffe that plays badminton by holding a racket in its teeth. It was a quadruped platform developed by ANYbotics, an ETH Zurich spinoff company that mainly builds robots for the oil and gas industries. "It was an industry-grade robot," Ma said. The robot had elastic actuators in its legs, weighed roughly 50 kilograms, and was half a meter wide and under a meter long. On top of the robot, Ma's team fitted an arm with several degrees of freedom, produced by another ETH Zurich spinoff called Duatic. This is what would hold and swing a badminton racket. Shuttlecock tracking and sensing the environment were done with a stereoscopic camera. "We've been working to integrate the hardware for five years," Ma said.

Along with the hardware, his team was also working on the robot's brain. State-of-the-art robots usually use model-based control optimization, a time-consuming, sophisticated approach that relies on a mathematical model of the robot's dynamics and environment. "In recent years, though, the approach based on reinforcement learning algorithms became more popular," Ma told Ars. "Instead of building advanced models, we simulated the robot in a simulated world and let it learn to move on its own." In ANYmal's case, this simulated world was a badminton court where its digital alter ego was chasing after shuttlecocks with a racket. The training was divided into repeatable units, each of which required that the robot predict the shuttlecock's trajectory and hit it with a racket six times in a row. During this training, like a true sportsman, the robot also got to know its physical limits and to work around them.
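The article doesn't describe the trajectory predictor itself; as a minimal stand-in for that step, here is a drag-free ballistic solve for where a shuttlecock would land (a real shuttlecock decelerates heavily in flight, so this is a simplification, not the team's model):

```python
import math

def landing_x(x0: float, z0: float, vx: float, vz: float, g: float = 9.81) -> float:
    """Horizontal landing point of a projectile launched from height z0,
    ignoring air drag: solve z0 + vz*t - 0.5*g*t^2 = 0 for the positive root."""
    t = (vz + math.sqrt(vz * vz + 2 * g * z0)) / g  # time of flight
    return x0 + vx * t

# Launched from the ground at 10 m/s upward and 10 m/s forward:
print(round(landing_x(0.0, 0.0, 10.0, 10.0), 2))  # ~20.39 m downrange
```

A learned predictor replaces this closed form precisely because drag, spin, and perception noise make the real trajectory far messier.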

The idea behind training the control algorithms was to develop visuo-motor skills similar to human badminton players. The robot was supposed to move around the court, anticipating where the shuttlecock might go next and position its whole body, using all available degrees of freedom, for a swing that would mean a good return. This is why balancing perception and movement played such an important role. The training procedure included a perception model based on real camera data, which taught the robot to keep the shuttlecock in its field of view while accounting for the noise and resulting object-tracking errors.

Once the training was done, the robot learned to position itself on the court. It figured out that the best strategy after a successful return is to move back to the center and toward the backline, which is something human players do. It even came up with a trick: standing on its hind legs to see the incoming shuttlecock better. It also learned fall avoidance and determined how much risk was reasonable to take given its limited speed. The robot did not attempt impossible plays that would create the potential for serious damage -- it was committed, but not suicidal. But when it finally played humans, it turned out ANYmal, as a badminton player, was amateur at best.
The findings have been published in the journal Science Robotics.

You can watch a video of the four-legged robot playing badminton on YouTube.
NASA

NASA Pulls the Plug on Jupiter-Moon Lander, So Scientists Propose Landing It on Saturn (gizmodo.com)

"NASA engineers have spent the past decade developing a rugged, partially autonomous lander designed to explore Europa, one of Jupiter's most intriguing moons," reports Gizmodo.

But though NASA "got cold feet over the project," the engineers behind it are now suggesting the probe could instead explore Enceladus, the sixth-largest moon of Saturn: Europa has long been a prime target in the search for extraterrestrial biology because scientists suspect it harbors a subsurface ocean beneath its icy crust, potentially teeming with microbial life. But the robot — packed with radiation shielding, cutting-edge software, and ice-drilling appendages — won't be going anywhere anytime soon.

In a recent paper in Science Robotics, engineers at NASA's Jet Propulsion Laboratory (JPL) outlined the design and testing of what was once the Europa Lander prototype, a four-legged robotic explorer built to survive the brutal surface conditions of the Jovian moon. The robot was designed to walk — as opposed to roll — analyze terrain, collect samples, and drill into Europa's icy crust — all with minimal guidance from Earth, due to the major communication lag between our planet and the moon 568 million miles (914 million kilometers) away. Designed to operate autonomously for hours at a time, the bot came equipped with stereoscopic cameras, a robotic arm, LED lights, and a suite of specialized materials tough enough to endure harsh radiation and bone-chilling cold....

According to the team, the challenges of getting to Europa — its radiation exposure, immense distance, and short observation windows — proved too daunting for NASA's higher-ups. And that's before you take into consideration the devastating budget cuts planned by the Trump administration, which would see the agency's funding fall from $7.3 billion to $3.9 billion. The lander, once the centerpiece of a bold astrobiology initiative, is now essentially mothballed.

But the engineers aren't giving up. They're now lobbying for the robot to get a second shot — on Enceladus, Saturn's ice-covered moon, which also boasts a subsurface ocean and has proven more favorable for robotic exploration. Enceladus is still frigid, but has lower radiation and better access windows than Europa.

Google

Waymo Set To Double To 20 Million Rides As Self-Driving Reaches Tipping Point (msn.com)

Google's self-driving taxi service Waymo has surpassed 10 million total paid rides, marking a significant milestone in the transition of autonomous vehicles from novelty to mainstream transportation option. The company's growth trajectory, WSJ argues, shows clear signs of exponential scaling, with weekly rides jumping from 10,000 in August 2023 to over 250,000 currently. Waymo is on track to hit 20 million rides by the end of 2025. The story adds: This is not just because Waymo is expanding into new markets. It's because of the way existing markets have come to embrace self-driving cars.

In California, the most recent batch of quarterly data reported by the company was the most encouraging yet. It showed that Waymo's number of paid rides inched higher by roughly 2% in both January and February -- and then increased 27% in March. In the nearly two years that people in San Francisco have been paying for robot chauffeurs, it was the first time that Waymo's growth slowed down for several months only to dramatically speed up again.
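Chaining those monthly percentages gives a sense of how sharply the quarter ended; a quick sketch (assuming the quoted rates compound month over month):

```python
# Compound the quoted month-over-month growth in Waymo's California paid rides
monthly_growth = [0.02, 0.02, 0.27]  # Jan, Feb, Mar, as reported

q1_factor = 1.0
for g in monthly_growth:
    q1_factor *= 1 + g

print(f"{q1_factor - 1:.1%}")  # roughly 32.1% more rides than at the start of Q1
```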
Waymo currently operates in Phoenix, Los Angeles, and San Francisco, with expansion planned for Austin, Atlanta, Miami, and Washington D.C. The service faces incoming competition from Tesla, which plans to launch its own robotaxi service in Austin this month. Waymo remains unprofitable despite raising $5.6 billion in funding last year.
China

China Just Held the First-Ever Humanoid Robot Fight Night (vice.com)

"We've officially entered the age of watching robots clobber each other in fighting rings," writes Vice.com.

A kick-boxing competition was staged Sunday in Hangzhou, China, using four robots from Unitree Robotics, reports Futurism. (The robots were named "AI Strategist", "Silk Artisan", "Armored Mulan", and "Energy Guardian".) "However, the robots weren't acting autonomously just yet, as they were being remotely controlled by human operator teams."

Those ringside human controllers used quick voice commands, but there's more to it than that, according to the South China Morning Post: Unlike typical remote-controlled toys, handling Unitree's G1 robots entails "a whole set of motion-control algorithms powered by large [artificial intelligence] models", said Liu Tai, deputy chief engineer at China Telecommunication Technology Labs, which is under the research institute China Academy of Information and Communications Technology.
More from Vice: The G1 robots are just over 4 feet tall [130 cm] and weigh around 77 pounds [35 kg]. They wear gloves. They have headgear. They throw jabs, uppercuts, and surprisingly sharp kicks... One match even ended in a proper knockout when a robot stayed down for more than eight seconds. The fights ran three rounds and were scored based on clean hits to the head and torso, just like standard kickboxing...
Thanks to long-time Slashdot reader AmiMoJo for sharing the news.
Robotics

Hugging Face Introduces Two Open-Source Robot Designs (siliconangle.com) 8

An anonymous reader quotes a report from SiliconANGLE: Hugging Face has open-sourced the blueprints of two internally developed robots called HopeJR and Reachy Mini. The company debuted the machines on Thursday. Hugging Face is backed by more than $390 million in funding from Nvidia Corp., IBM Corp. and other investors. It operates a GitHub-like platform for sharing open-source artificial intelligence projects. It says its platform hosts more than 1 million AI models, hundreds of thousands of datasets and various other technical assets.

The company started prioritizing robotics last year after launching LeRobot, a section of its platform dedicated to autonomous machines. The portal provides access to AI models for powering robots and datasets that can be used to train those models. Hugging Face released its first hardware blueprint, a robotic arm design called the SO-100, late last year. The SO-100 was developed in partnership with a startup called The Robot Studio. Hugging Face also collaborated with the company on the HopeJR, the first new robot that debuted this week. According to TechCrunch, it's a humanoid robot that can perform 66 movements including walking.

HopeJR is equipped with a pair of robotic arms that can be remotely controlled by a human using a pair of specialized, chip-equipped gloves. HopeJR's arms replicate the movements made by the wearer of the gloves. A demo video shared by Hugging Face showed that the robot can shake hands, point to a specific text snippet on a piece of paper and perform other tasks. Hugging Face's other new robot, the Reachy Mini, likewise features an open-source design. It's based on technology that the company obtained through the acquisition of a venture-backed startup called Pollen Robotics earlier this year. Reachy Mini is a turtle-like robot that comes in a rectangular case. Its main mechanical feature is a retractable neck that allows it to follow the user with its head or withdraw into the case. This case, which is stationary, is compact and lightweight enough to be placed on a desk.
Hugging Face will offer pre-assembled versions of its open-source Reachy Mini and HopeJR robots for $250 and $3,000, respectively, with the first units starting to ship by the end of the year.
Robotics

Robot Industry Split Over That Humanoid Look (axios.com) 65

An anonymous reader quotes a report from Axios: Advanced robots don't necessarily need to look like C-3PO from "Star Wars" or George Jetson's maid Rosie, despite all the hype over humanoids from Wall Street and Big Tech. In fact, some of the biggest skeptics about human-shaped robots come from within the robotics industry itself. [...] The most productive -- and profitable -- bots are the ones that can do single tasks cheaply and efficiently. "If you look at where robots are really bringing value in a manufacturing environment, it is combining industrial or collaborative robots with mobility," ABB managing director Ali Raja tells Axios. "I don't see that there are any real practical applications where humanoids are bringing in a lot of value."

"The reason we have two legs is because whether Darwin or God or whoever made us, we have to figure out how to traverse an infinite number of things," like climbing a mountain or riding a bike, explains Michael Cicco, CEO of Fanuc America Corp. "When you get into the factory, even if it's a million things, it's still a finite number of things that you need to do." Human-shaped robots are over-engineered solutions to most factory chores that could be better solved by putting a robot arm on a wheeled base, he said.

"The thing about humanoids is not that it's a human factor. It's that it's more dynamically stable," counters Melonee Wise, chief product officer at Agility Robotics, which is developing a humanoid robot called Digit. When humans grab something heavy, they can shift their weight for better balance. The same is true for a humanoid, she said. Using a robotic arm on a mobile base to pick up something heavy, "it's like I'm a little teapot and you become very unstable," she said, bending at the waist.

China

China's 7-Year Tech Independence Push Yields Major Gains in AI, Robotics and Semiconductors (msn.com) 84

China has achieved substantial technological advances across robotics, AI, and semiconductor manufacturing as part of a seven-year self-reliance campaign that has tripled the country's research and development spending to $500 billion annually.

Chinese robot manufacturers captured nearly half of their domestic market by 2023, up from a quarter of installations just years earlier, while AI startups now rival OpenAI and Google in capabilities. The progress extends to semiconductors, where Huawei released a high-end smartphone powered by what industry analysts believe was a locally-produced advanced processor, despite U.S. export controls targeting China's chip access.

Morgan Stanley projects China's self-sufficiency in graphics processing units will jump from 11% in 2021 to 82% by 2027. Chinese companies have been purchasing as many industrial robots as the rest of the world combined, enabling highly automated factories that can operate in darkness. In space technology, Chinese firms won five of 11 gold medals when U.S. think tanks ranked the world's best commercial satellite systems last year, compared to four for American companies.
Businesses

Why Two Amazon Drones Crashed at a Test Facility in December (msn.com) 39

While Amazon won FAA approval to fly beyond an operator's visual line of sight, "the program remains a work in progress," reports Bloomberg: A pair of Amazon.com Inc. package delivery drones were flying through a light rain in mid-December when, within minutes of one another, they both committed robot suicide... [S]ome 217 feet (66 meters) in the air [at a drone testing facility], the aircraft cut power to its six propellers, fell to the ground and was destroyed. Four minutes later and 183 feet over the taxiway, a second Prime Air drone did the same thing.

Not long after the incidents, Amazon paused its experimental drone flights to tweak the aircraft software but said the crashes weren't the "primary reason" for halting the program. Now, five months after the twin crashes, a more detailed explanation of what happened is starting to emerge. Faulty readings from lidar sensors made the drones think they had landed, prompting the software to shut down the propellers, according to National Transportation Safety Board documents reviewed by Bloomberg. The sensors failed after a software update made them more susceptible to being confused by rain, the NTSB said.

Amazon also removed a backup sensor that had been present on earlier iterations, according to the article, though an Amazon spokesperson said the company had found ways to replicate the removed sensor's function.
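The failure mode the NTSB describes, and why a redundant sensor matters, can be illustrated with a minimal sketch. This is purely hypothetical code; none of the names or thresholds come from Amazon's actual flight software:

```python
# Hypothetical sketch of a landing detector that trusts a single lidar
# altitude reading. Rain-scattered lidar returns can report near-zero
# range mid-flight, which this naive check misreads as a touchdown.

def should_cut_power(lidar_altitude_m: float,
                     landed_threshold_m: float = 0.05) -> bool:
    """Naive: cut propeller power whenever lidar says we're on the ground."""
    return lidar_altitude_m <= landed_threshold_m

def should_cut_power_redundant(lidar_altitude_m: float,
                               baro_altitude_m: float,
                               landed_threshold_m: float = 0.05) -> bool:
    """Cross-check against an independent altitude source (e.g. a
    barometric altimeter) so one confused instrument can't act alone."""
    return (lidar_altitude_m <= landed_threshold_m
            and baro_altitude_m <= landed_threshold_m)

# Rain spoofs the lidar into reading ~0 m while the craft is at 66 m:
print(should_cut_power(0.0))                  # True: power cut mid-air
print(should_cut_power_redundant(0.0, 66.0))  # False: barometer disagrees
```

The second function is the design intuition behind keeping a backup sensor: a single faulty reading should never be sufficient to trigger an irreversible action like cutting motor power.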

But Bloomberg notes Amazon's drone efforts have faced "technical challenges and crashes, including one in 2021 that set a field ablaze at the company's testing facility in Pendleton, Oregon." Deliveries are currently limited to College Station, Texas, and greater Phoenix, with plans to expand to Kansas City, Missouri, the Dallas area and San Antonio, as well as the UK and Italy. Starting with a craft that looked like a hobbyist drone -- and was vulnerable to even modest gusts of wind -- Amazon went through dozens of designs to toughen the vehicle and ultimately make it capable of carting about 5 pounds, giving it the capability to transport items typically ordered from its warehouses. Engineers settled on a six-propeller design that takes off vertically before cruising like a plane. The first model to make regular customer deliveries, the MK27, was succeeded last year by the MK30, which flies at about 67 miles an hour and can deliver packages up to 7.5 miles from its launch point. The craft takes off, flies and lands autonomously.
AI

When a Company Does Job Interviews with a Malfunctioning AI - and Then Rejects You (slate.com) 51

IBM laid off "a couple hundred" HR workers and replaced them with AI agents. "It's becoming a huge thing," says Mike Peditto, a Chicago-area consultant with 15 years of experience advising companies on hiring practices. He tells Slate "I do think we're heading to where this will be pretty commonplace." Although A.I. job interviews have been happening since at least 2023, the trend has received a surge of attention in recent weeks thanks to several viral TikTok videos in which users share videos of their A.I. bots glitching. Although some of the videos were fakes posted by a creator whose bio warns that his content is "all satire," some are authentic — like that of Kendiana Colin, a 20-year-old student at Ohio State University who had to interact with an A.I. bot after she applied for a summer job at a stretching studio outside Columbus. In a clip she posted online earlier this month, Colin can be seen conducting a video interview with a smiling white brunette named Alex, who can't seem to stop saying the phrase "vertical-bar Pilates" in an endless loop...

Representatives at Apriora, the startup company founded in 2023 whose software Colin was forced to engage with, did not respond to a request for comment. But founder Aaron Wang told Forbes last year that the software allowed companies to screen more talent for less money... (Apriora's website claims that the technology can help companies "hire 87 percent faster" and "interview 93 percent cheaper," but it's not clear where those stats come from or what they actually mean.)

Colin (first interviewed by 404 Media) calls the experience dehumanizing -- wondering why they were told to dress professionally, since "They had me going the extra mile just to talk to a robot." And after the interview, the robot -- and the company -- then ghosted them with no further contact. "It was very disrespectful and a waste of time."

Houston resident Leo Humphries also "donned a suit and tie in anticipation for an interview" in which the virtual recruiter immediately got stuck repeating the same phrase. Although Humphries tried in vain to alert the bot that it was broken, the interview ended only when the A.I. program thanked him for "answering the questions" and offering "great information" — despite his not being able to provide a single response. In a subsequent video, Humphries said that within an hour he had received an email, addressed to someone else, that thanked him for sharing his "wonderful energy and personality" but let him know that the company would be moving forward with other candidates.
