News

Local Governments Overwhelmed By Tennis-Pickleball Turf Wars, Documents Show 120

An anonymous reader shares a report: In late September, an arsonist set fire to a storage shed at Memorial Park used by the Santa Monica Pickleball Club, torching thousands of dollars worth of nets, rackets, balls, and other pickleball equipment. "Unknown suspect(s) caused a fire that damaged city property (Tennis Court Gate)," a police report I obtained using a public records request says. The report adds that there is body camera footage of the incident and police-shot photos, but the city refused to release them to me because there is an ongoing investigation. The arsonist is still at large.

We still don't know the motive behind the arson, but the news caught my attention because it happened while I was in the midst of trying to understand what I've been calling the pickleball wars. For the last few months I've been trying to understand what's been happening behind the scenes in cities large and small by filing public records requests aimed at learning how common beefs about pickleball are, and what's causing them.

If you don't already know about "the fastest growing sport," pickleball is kind of like tennis, but played on a court a quarter of the size using a plastic ball similar to a wiffle ball and a hard racket. The smaller court, hard ball, and hard racket mean that pickleball is louder than tennis, a fact that is brought up very often by homeowners and homeowner associations who claim, somewhat dubiously, that the noise from pickleball drives down their home values. My hypothesis going into researching this article was that people who live in cities are mad at the noise created during the act of playing pickleball and they have probably complained to the government about it. What I found was surprisingly more complex: Thousands of pages of documents I've reviewed show that pickleball's surging popularity is overwhelming under-resourced parks departments in city governments all over the country.
Security

Hackers Spent 2+ Years Looting Secrets of Chipmaker NXP Before Being Detected (arstechnica.com) 19

An anonymous reader quotes a report from Ars Technica: A prolific espionage hacking group with ties to China spent over two years looting the corporate network of NXP, the Netherlands-based chipmaker whose silicon powers security-sensitive components found in smartphones, smartcards, and electric vehicles, a news outlet has reported. The intrusion, by a group tracked under names including "Chimera" and "G0114," lasted from late 2017 to the beginning of 2020, according to Netherlands national news outlet NRC Handelsblad, which cited "several sources" familiar with the incident. During that time, the threat actors periodically accessed employee mailboxes and network drives in search of chip designs and other NXP intellectual property. The breach wasn't uncovered until Chimera intruders were detected in a separate company network that connected to compromised NXP systems on several occasions. Details of the breach remained a closely guarded secret until now.

NRC cited a report published (and later deleted) by security firm Fox-IT, titled Abusing Cloud Services to Fly Under the Radar. It documented Chimera using cloud services from companies including Microsoft and Dropbox to receive data stolen from the networks of semiconductor makers, including one in Europe that was hit in "early Q4 2017." Some of the intrusions lasted as long as three years before coming to light. NRC said the unidentified victim was NXP. "Once nested on a first computer -- patient zero -- the spies gradually expand their access rights, erase their tracks in between and secretly sneak to the protected parts of the network," NRC reporters wrote in an English translation. "They try to secrete the sensitive data they find there in encrypted archive files via cloud storage services such as Microsoft OneDrive. According to the log files that Fox-IT finds, the hackers come every few weeks to see whether interesting new data can be found at NXP and whether more user accounts and parts of the network can be hacked."

NXP did not alert customers or shareholders to the intrusion, other than a brief reference in a 2019 annual report. It read: "We have, from time to time, experienced cyber-attacks attempting to obtain access to our computer systems and networks. Such incidents, whether or not successful, could result in the misappropriation of our proprietary information and technology, the compromise of personal and confidential information of our employees, customers, or suppliers, or interrupt our business. For instance, in January 2020, we became aware of a compromise of certain of our systems. We are taking steps to identify the malicious activity and are implementing remedial measures to increase the security of our systems and networks to respond to evolving threats and new information. As of the date of this filing, we do not believe that this IT system compromise has resulted in a material adverse effect on our business or any material damage to us. However, the investigation is ongoing, and we are continuing to evaluate the amount and type of data compromised. There can be no assurance that this or any other breach or incident will not have a material impact on our operations and financial results in the future."

Google

Google's New Geothermal Energy Project is Up and Running (theverge.com) 28

A first-of-its-kind geothermal project is now up and running in Nevada, where it will help power Google's data centers with clean energy. From a report: Google is partnering with startup Fervo, which has developed new technology for harnessing geothermal power. Since they're using different tactics than traditional geothermal plants, it is a relatively small project with the capacity to generate 3.5 MW. For context, one megawatt is enough to meet the demand of roughly 750 homes. The project will feed electricity into the local grid that serves two of Google's data centers outside of Las Vegas and Reno.
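
For a rough sense of scale, the numbers above work out as follows (a back-of-the-envelope sketch in Python; both figures come straight from the report, and the homes-served conversion is only a rule of thumb):

# Back-of-the-envelope check using the figures quoted above.
project_capacity_mw = 3.5   # Fervo/Google project capacity
homes_per_mw = 750          # "one megawatt is enough to meet the demand of roughly 750 homes"

homes_served = project_capacity_mw * homes_per_mw
print(f"Roughly {homes_served:,.0f} homes' worth of demand")  # ~2,625 homes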

It's part of Google's plan to run on carbon pollution-free electricity around the clock by 2030. To reach that goal, it'll have to get more sources of clean energy online. And it sees geothermal as a key part of the future electricity mix that can fill in whenever wind and solar energy wane. "If you think about how much we advanced wind and solar and lithium ion storage, here we are -- this is kind of the next set of stuff and we feel like companies have a huge role to play in advancing these technologies," says Michael Terrell, senior director of energy and climate at Google.

Security

Researchers Figure Out How To Bypass Fingerprint Readers In Most Windows PCs (arstechnica.com) 25

An anonymous reader quotes a report from Ars Technica: [L]ast week, researchers at Blackwing Intelligence published an extensive document showing how they had managed to work around some of the most popular fingerprint sensors used in Windows PCs. Security researchers Jesse D'Aguanno and Timo Teras write that, with varying degrees of reverse-engineering and using some external hardware, they were able to fool the Goodix fingerprint sensor in a Dell Inspiron 15, the Synaptics sensor in a Lenovo ThinkPad T14, and the ELAN sensor in one of Microsoft's own Surface Pro Type Covers. These are just three laptop models from the wide universe of PCs, but one of these three companies usually does make the fingerprint sensor in every laptop we've reviewed in the last few years. It's likely that most Windows PCs with fingerprint readers will be vulnerable to similar exploits.

Blackwing's post on the vulnerability is also a good overview of exactly how fingerprint sensors in a modern PC work. Most Windows Hello-compatible fingerprint readers use "match on chip" sensors, meaning that the sensor has its own processors and storage that perform all fingerprint scanning and matching independently without relying on the host PC's hardware. This ensures that fingerprint data can't be accessed or extracted if the host PC is compromised. If you're familiar with Apple's terminology, this is basically the way its Secure Enclave is set up. Communication between the fingerprint sensor and the rest of the system is supposed to be handled by the Secure Device Connection Protocol (SDCP). This is a Microsoft-developed protocol that is meant to verify that fingerprint sensors are trustworthy and uncompromised, and to encrypt traffic between the fingerprint sensor and the rest of the PC.
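
The details of SDCP are beyond the scope of this summary, but the general idea, a sensor proving it holds a device-specific secret before the host trusts what it reports, can be illustrated with a toy challenge-response. This sketch is not SDCP; the HMAC construction and key handling are stand-ins purely for illustration:

# Toy challenge-response illustrating device attestation (NOT Microsoft's actual SDCP).
# Assumption: host and sensor share a per-device secret provisioned at manufacture.
import hmac, hashlib, os

DEVICE_KEY = os.urandom(32)  # stand-in for the sensor's provisioned secret

def sensor_respond(challenge: bytes) -> bytes:
    # The sensor proves knowledge of its key without revealing it.
    return hmac.new(DEVICE_KEY, challenge, hashlib.sha256).digest()

def host_verify(challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(DEVICE_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)  # fresh nonce each time, so captured replies can't be replayed
assert host_verify(challenge, sensor_respond(challenge))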

Each fingerprint sensor was ultimately defeated by a different weakness. The Dell laptop's Goodix fingerprint sensor implemented SDCP properly in Windows but used no such protections in Linux. Connecting the fingerprint sensor to a Raspberry Pi 4, the team was able to exploit the Linux support plus "poor code quality" to enroll a new fingerprint that would allow entry into a Windows account. As for the Synaptics and ELAN fingerprint readers used by Lenovo and Microsoft (respectively), the main issue is that both sensors supported SDCP but that it wasn't actually enabled. The Synaptics sensor used a custom TLS implementation for communication that the Blackwing team was able to exploit, while the Surface fingerprint reader used cleartext communication over USB. "In fact, any USB device can claim to be the ELAN sensor (by spoofing its VID/PID) and simply claim that an authorized user is logging in," wrote D'Aguanno and Teras.
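
The VID/PID point is worth spelling out: a USB vendor/product ID is just a pair of numbers a device reports about itself, so matching on it proves nothing about what is actually plugged in. A minimal pyusb sketch (the IDs below are placeholders, not ELAN's real ones):

# Why trusting a USB device by VID/PID alone is unsafe (pyusb sketch).
# The IDs are placeholders; any device can report whatever VID/PID it likes.
import usb.core

SENSOR_VID, SENSOR_PID = 0x1234, 0x5678  # hypothetical "trusted sensor" IDs

dev = usb.core.find(idVendor=SENSOR_VID, idProduct=SENSOR_PID)
if dev is not None:
    # All we know is that *some* device claims these IDs -- it could just as well be
    # an attacker's microcontroller replaying "authorized user logged in" messages.
    print("Found a device claiming to be the fingerprint sensor")
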
"Though all of these exploits ultimately require physical access to a device and an attacker who is determined to break into your specific laptop, the wide variety of possible exploits means that there's no single fix that can address all of these issues, even if laptop manufacturers are motivated to implement them," concludes Ars.

Blackwing recommends all Windows Hello fingerprint sensors enable SDCP, the protocol Microsoft developed to try to prevent this exploit. PC makers should also "have a qualified expert third party audit [their] implementation" to improve code quality and security.
Data Storage

Google Drive Misplaces Months' Worth of Customer Files (theregister.com) 82

Google Drive users are reporting files mysteriously disappearing from the service, with some posters on the company's support forums claiming six or more months of work have unceremoniously vanished. From a report: The issue has been rumbling for a few days, with one user logging into Google Drive and finding things as they were in May 2023. According to the poster, almost everything saved since then has gone, and attempts at recovery failed. Others chimed in with similar experiences, and one claimed that six months of business data had gone AWOL. There is little information regarding what has happened; some users reported that synchronization had simply stopped working, so the cloud storage was out of date.

Others could get some of their information back by fiddling with cached files, although the limited advice on offer for the affected was to leave things well alone until engineers come up with a solution. A message purporting to be from Google support also advised not to make changes to the root/data folder while engineers investigate the issue. Some users speculated that it might be related to accounts being spontaneously dropped. We've asked Google for its thoughts and will update should the search giant respond.

Security

Why Do So Many Sites Have Bad Password Policies? (gatech.edu) 242

"Three out of four of the world's most popular websites are failing to meet minimum requirement standards" for password security, reports Georgia Tech's College of Computing. Which means three out of four of the world's most popular web sites are "allowing tens of millions of users to create weak passwords."

Using a first-of-its-kind automated tool that can assess a website's password creation policies, researchers also discovered that 12% of websites completely lacked password length requirements. Assistant Professor Frank Li and Ph.D. student Suood Al Roomi in Georgia Tech's School of Cybersecurity and Privacy created the automated assessment tool to explore all sites in the Google Chrome User Experience Report (CrUX), a database of one million websites and pages.

Li and Al Roomi's method of inferring password policies succeeded on over 20,000 sites in the database and showed that many sites:

- Permit very short passwords
- Do not block common passwords
- Use outdated requirements like complex characters

The researchers also discovered that only a few sites fully follow standard guidelines, while most stick to outdated guidelines from 2004... More than half of the websites in the study accepted passwords with six characters or less, with 75% failing to require the recommended eight-character minimum. Around 12% of websites had no length requirements, and 30% did not support spaces or special characters. Only 28% of the websites studied enforced a password block list, which means thousands of sites are vulnerable to cyber criminals who might try to use common passwords to break into a user's account, also known as a password spraying attack.
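
For contrast, a minimal check that points in the direction the researchers recommend — a length floor, a generous ceiling, a block list, and no composition rules — might look like this sketch (the block-list filename is a placeholder):

# Minimal password-policy sketch in line with the guidance discussed above:
# length floor, generous ceiling, block list, no character-composition rules.
# "common-passwords.txt" is a placeholder for a real breached-password list.
MIN_LEN, MAX_LEN = 8, 64

with open("common-passwords.txt", encoding="utf-8") as f:
    BLOCK_LIST = {line.strip() for line in f}

def check_password(pw: str) -> str | None:
    """Return an error message, or None if the password is acceptable."""
    if len(pw) < MIN_LEN:
        return f"Use at least {MIN_LEN} characters."
    if len(pw) > MAX_LEN:
        return f"Use no more than {MAX_LEN} characters."
    if pw.lower() in BLOCK_LIST:
        return "That password is too common; please choose another."
    return None  # no composition rules: spaces, punctuation, and Unicode are all fine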

Georgia Tech describes the new research as "the largest study of its kind." ("The project was 135 times larger than previous works that relied on manual methods and smaller sample sizes.")

"As a security community, we've identified and developed various solutions and best practices for improving internet and web security," said assistant professor Li. "It's crucial that we investigate whether those solutions or guidelines are actually adopted in practice to understand whether security is improving in reality."

The Slashdot community has already noticed the problem, judging by a recent post from eggegick. "Every site I visit has its own idea of the minimum and maximum number of characters, the number of digits, the number of upper/lowercase characters, the number of punctuation characters allowed and even what punctuation characters are allowed and which are not." The limit of password size really torques me, as that suggests they are storing the password (they need to limit storage size), rather than its hash value (fixed size), which is a real security blunder. Also, the stupid dots drive me bonkers, especially when there is no "unhide" button. For crying out loud, nobody is looking over my shoulder! Make the "unhide" default.
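
The storage point is easy to demonstrate: a salted, slow hash is a fixed-size value no matter how long the password is, so a tight length cap is never a storage necessity. A quick standard-library sketch (the iteration count is illustrative):

# A salted, slow hash is fixed-size regardless of password length, so storage
# is never a reason to cap password size. The iteration count is illustrative.
import hashlib, os

def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest  # 16-byte salt + 32-byte digest, whether the input is 8 or 800 chars

for pw in ("hunter2", "a much longer passphrase with spaces and punctuation!" * 4):
    salt, digest = hash_password(pw)
    print(len(pw), "->", len(digest))  # digest length is always 32
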
"The 'dots' are bad security," agrees long-time Slashdot reader Spazmania. "If you're going to obscure the password you should also obscure the length of the password." But in their comment on the original submission, they also point out that there is a standard for passwords, from the National Institute of Standards and Technology: Briefly:

* Minimum 8 characters
* Must allow at least 64 characters.
* No constraints on what printing characters can be used (including high unicode)
* No requirements on what characters must be used or in what order or proportion

This is expected to be paired with a system which does some additional and critical things:

* Maintain a database of known compromised passwords (e.g. from public password dictionaries) and reject any passwords found in the database.
* Pair the password with a second authentication factor such as a security token or cell phone sms. Require both to log in.
* Limit the number of passwords which can be attempted per time period. At one attempt per second, even the smallest password dictionaries would take hundreds of years to try...

Someone attempting to brute force a password from outside on a rate-limited system is limited to the rate, regardless of how computing power advances. If the system enforces a rate limit of 1 try per second, the time to crack an 8-character password containing only lower case letters is still more than 6,000 years.
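
The arithmetic behind that last figure is easy to reproduce (one guess per second against an 8-character, lowercase-only password, as in the example above):

# Worked version of the rate-limiting arithmetic above:
# 26 lowercase letters, 8 characters, one guess per second.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

keyspace = 26 ** 8                        # ~2.1e11 possible passwords
years_to_exhaust = keyspace / SECONDS_PER_YEAR
print(f"{years_to_exhaust:,.0f} years")   # roughly 6,600 years to try them all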

Power

In Just 15 Months, America Made $37B In Clean Energy Investments In Fossil Fuel-Reliant Regions (msn.com) 52

America passed a climate bill in August of 2022 with incentives to build wind and solar energy in regions that historically relied on fossil fuels. And sure enough, since then "a disproportionate amount of wind, solar, battery and manufacturing investment is going to areas that used to host fossil fuel plants," reports the Washington Post.

They cite a new analysis of investment trends from independent research firm Rhodium Group and MIT's Center for Energy and Environmental Policy Research: In Carbon County, Wyo. — a county named for its coal deposits — a power company is building hundreds of wind turbines. In Mingo County, W.Va., where many small towns were once coal towns, the Adams Fork Energy plant will sit on a former coal mining site and produce low-carbon ammonia... While communities that once hosted coal, oil or gas infrastructure make up only 18.6 percent of the population, they received 36.8 percent of the clean energy investment in the year after the Inflation Reduction Act's passage. "We're talking about in total $100 billion in investment in these categories," said Trevor Houser, a partner at Rhodium Group. "So $37 billion investment in a year for energy communities — that's a lot of money...."

Most significantly, 56.6 percent of investment in U.S. wind power in the past year has gone to energy communities, as well as 45.5 percent of the storage and battery investment... The analysis also found that significant amounts of clean energy investment were going to disadvantaged communities, defined as communities with environmental or climate burdens, and low-income communities. Many of the states benefiting are solidly Republican...

Josh Freed, senior vice president for climate and energy at the center-left think tank Third Way, is not sure whether the clean energy investments will make a difference for next year's election. But in the long term, he argues, rural Republican areas will become more dependent on clean energy — potentially shifting party alliances and shifting the position of the Republican Party itself. "It's going to change these fossil fuel communities," he said.

Piracy

File-Sharing Giant Uloz Bans File-Sharing Citing EU's Digital Services Act 12

TorrentFreak: File-sharing and hosting giant Uloz has announced a radical change to its business model. The Czech site has been under fire for some time and was recently branded a 'notorious market' by the MPA. However, Uloz says that its imminent ban on file-sharing, in favor of a private, cloud-based storage model, is due to the strict conditions imposed by the EU's Digital Services Act.
Power

Giant Batteries Drain Economics of Gas Power Plants (reuters.com) 188

Batteries used to store power produced by renewables are becoming cheap enough to make developers abandon scores of projects for gas-fired generation worldwide. Reuters reports: The long-term economics of gas-fired plants, used in Europe and some parts of the United States primarily to compensate for the intermittent nature of wind and solar power, are changing quickly, according to Reuters' interviews with more than a dozen power plant developers, project finance bankers, analysts and consultants. They said some battery operators are already supplying back-up power to grids at a price competitive with gas power plants, meaning gas will be used less. The shift challenges assumptions about long-term gas demand and could mean natural gas has a smaller role in the energy transition than posited by the biggest, listed energy majors.

In the first half of the year, 68 gas power plant projects were put on hold or cancelled globally, according to data provided exclusively to Reuters by U.S.-based non-profit Global Energy Monitor. [...] "In the early 1990s, we were running gas plants baseload, now they are shifting to probably 40% of the time and that's going to drop off to 11%-15% in the next eight to 10 years," Keith Clarke, chief executive at Carlton Power, told Reuters. Developers can no longer use financial modelling that assumes gas power plants are used constantly throughout their 20-year-plus lifetime, analysts said. Instead, modellers need to predict how much gas generation is needed during times of peak demand and to compensate for the intermittency of renewable sources that are hard to anticipate.
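
To see why those utilization figures matter for project finance, here is a rough sketch of how annual output (and therefore revenue) scales with how often a plant actually runs. The capacity factors echo the quote above; the plant size and power price are purely illustrative assumptions:

# Rough sketch: a gas plant's annual energy sales at different utilization levels.
# Plant size and price are illustrative; the capacity factors echo the quote above.
HOURS_PER_YEAR = 8760
plant_mw = 500           # hypothetical plant size
price_per_mwh = 60.0     # hypothetical average captured price

for capacity_factor in (0.90, 0.40, 0.13):   # roughly: baseload, "40% of the time", "11%-15%"
    mwh = plant_mw * HOURS_PER_YEAR * capacity_factor
    print(f"{capacity_factor:.0%}: {mwh:,.0f} MWh -> revenue {mwh * price_per_mwh:,.0f}")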

The cost of lithium-ion batteries has more than halved from 2016 to 2022 to $151 per kilowatt hour of battery storage, according to BloombergNEF. At the same time, renewable generation has reached record levels. Wind and solar powered 22% of the EU's electricity last year, almost doubling their share from 2016, and surpassing the share of gas generation for the first time, according to think tank Ember's European Electricity Review. "In the early years, capacity markets were dominated by fossil fuel power stations providing the flexible electricity supply," said Simon Virley, head of energy at KPMG. Now batteries, interconnectors and consumers shifting their electricity use are also providing that flexibility, Virley added.

Earth

Forest Service Plans Carbon Dioxide Storage on Federal Lands 108

An anonymous reader shares a report: In recent years, lots of American companies have gotten behind a potential climate solution called carbon capture and storage, and the Biden administration has backed it with billions of dollars in tax incentives and direct investments. The idea is to trap planet-heating carbon dioxide from the smokestacks of factories and power plants and transport it to sites where it is injected underground and stored. But the idea is controversial, in large part because the captured carbon dioxide would be shipped to storage sites via thousands of miles of new pipelines. Communities nationwide are pushing back against these pipeline projects and underground sites, arguing they don't want the pollution running through their land.

Now the U.S. Forest Service is proposing to change a rule to allow storing this carbon dioxide pollution under the country's national forests and grasslands. "Authorizing carbon capture and storage on NFS lands would support the Administration's goal to reduce greenhouse gas emissions by 50 percent below the 2005 levels by 2030," the proposed rule change says. But environmental groups and researchers have concerns. Carbon dioxide pollution will still need to be transported to the forests via industrial pipeline for storage, says June Sekera, a research fellow with Boston University. "To get the CO2 to the injection site in the midst of our national forest, they've got to build huge pipelines," Sekera says. "All this huge industrial infrastructure that's going to go right through." Sekera says building those CO2 pipelines may require clearing a lot of trees.
Linux

Canonical Intros Microcloud: Simple, Free, On-prem Linux Clustering (theregister.com) 16

Canonical hosted an amusingly failure-filled demo of its new easy-to-install, Ubuntu-powered tool for building small-to-medium scale, on-premises high-availability clusters, Microcloud, at an event in London yesterday. From a report: The intro to the talk leaned heavily on Canonical's looming 20th anniversary, and with good reason. Ubuntu has carved out a substantial slice of the Linux market for itself on the basis of being easier to use than most of its rivals, at no cost -- something that many Linux players still seem not to fully comprehend. The presentation was as buzzword-heavy as one might expect, and it's also extensively based on Canonical's in-house tech, such as the LXD containervisor, Snap packaging, and, optionally, the Ubuntu Core snap-based immutable distro. (The only missing buzzword didn't crop up until the Q&A session, and we were pleased by its absence: it's not built on and doesn't use Kubernetes, but you can run Kubernetes on it if you wish.)

We're certain this is going to turn off or alienate a lot of the more fundamentalist Penguinistas, but we are equally sure that Canonical won't care. In the immortal words of Kevin Smith, it's not for critics. Microcloud combines several existing bits of off-the-shelf FOSS tech in order to make it easy to link from three to 50 Ubuntu machines into an in-house, private high-availability cluster, with live migration and automatic failover. It uses its own LXD containervisor to manage nodes and workloads, Ceph for distributed storage, OpenZFS for local storage, and OVN to virtualize the cluster interconnect. All the tools are packaged as snaps. It supports both x86-64 and Arm64 nodes, including Raspberry Pi kit, and clusters can mix both architectures. The event included several demonstrations using an on-stage cluster of three ODROID machines with "Intel N6005" processors, so we reckon they were ODROID H3+ units -- which we suspect the company chose because of their dual Ethernet connections.

Data Storage

Scientists Use Raspberry Pi Tech To Protect NASA Telescope Data (theregister.com) 38

Richard Speed reports via The Register: Scientists have revealed how data from a NASA telescope was secured thanks to creative thinking and a batch of Raspberry Pi computers. The telescope was the Super Pressure Balloon Imaging Telescope (SuperBIT), launched on April 16, 2023, from Wanaka Airport in New Zealand. The telescope was raised to approximately 33km in altitude by NASA's 532,000-cubic-meter (18.8-million-cubic-foot) balloon and, above circa 99.5 percent of the Earth's atmosphere, it spent over a month circumnavigating the globe and acquiring observations of astronomical objects. The plan had been for the payload to transmit its data to the ground using SpaceX's Starlink constellation and the US Tracking and Data Relay Satellite System (TDRSS). However, the Starlink connection went down soon after launch, on May 1, and the TDRSS connection became unstable on May 24. The boffins decided to attempt a landing on May 25 due to poor communications and concerns the balloon might be pulled away from further land crossings by weather.

The telescope itself was destroyed during the landing; it was dragged along the ground for 3km by a parachute that failed to detach, leaving a trail of debris in its wake. Miraculously, though, SuperBIT's solid-state drive was recovered intact. However, other than as a reference, its data was not needed thanks to the inclusion of Raspberry Pi-powered hardware in the form of four Data Recovery System (DRS) capsules. Each capsule included a Raspberry Pi 3B and 5TB of solid-state storage. A parachute, a Global Navigation Satellite System (GNSS) receiver, and an Iridium short-burst data transceiver were also included so the hardware could report its location to the recovery team. The capsules were connected to the main payload via Ethernet, and 24V DC was also available.
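
The Register piece doesn't include the capsules' flight software, but the beacon role it describes — read a position from the GNSS receiver and relay it over the Iridium short-burst link so the recovery team can find the capsule — can be sketched roughly like this. The serial ports, baud rates, and modem command are assumptions for illustration, not the SuperBIT team's actual code:

# Rough sketch of a DRS-style recovery beacon: read a GNSS fix, relay it over an
# Iridium short-burst-data modem. Ports, baud rates, and the modem command are
# illustrative assumptions; this is not the SuperBIT team's code.
import time
import serial  # pyserial

gnss = serial.Serial("/dev/ttyUSB0", 9600, timeout=2)    # hypothetical GNSS receiver port
modem = serial.Serial("/dev/ttyUSB1", 19200, timeout=2)  # hypothetical Iridium SBD modem port

def latest_fix():
    line = gnss.readline().decode(errors="ignore").strip()
    if line.startswith(("$GPGGA", "$GNGGA")):  # standard NMEA fix sentences
        f = line.split(",")
        return f"{f[2]}{f[3]} {f[4]}{f[5]}"    # latitude/longitude with hemisphere letters
    return None

while True:
    fix = latest_fix()
    if fix:
        modem.write(f"AT+SBDWT={fix}\r".encode())  # queue a short-burst message (command varies by modem)
    time.sleep(600)  # report every ten minutes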

The plan had been to release the first DRS capsule on day 40, and then another every 20 days after that, whenever SuperBIT passed over land. However, when it became clear that SuperBIT would have to come down on May 25, it was decided to drop two DRS capsules over Argentina's Santa Cruz Province. Both of the DRS capsules released were recovered from their reported locations -- a curious cougar apparently nosed around one of them without causing damage -- and the data was fully intact. Of the unreleased DRS capsules, one failed for unknown reasons at launch -- the team speculated that perhaps a cable came loose -- but the other also contained an intact data set.

Encryption

Signal Reveals Its Operation Costs, Estimates $50 Million a Year In 2024 (wired.com) 29

gaiageek writes: Of note, given the recent Slashdot article about Signal opening up to trying out usernames, is the $6 million annual cost of sending SMS messages for account verification, which certainly suggests that getting rid of phone number verification would be a significant cost-saving solution.

Signal pays $14 million a year in infrastructure costs, for instance, including the price of servers, bandwidth, and storage. It uses about 20 petabytes per year of bandwidth, or 20 million gigabytes, to enable voice and video calling alone, which comes to $1.7 million a year. The biggest chunk of those infrastructure costs, fully $6 million annually, goes to telecom firms to pay for the SMS text messages Signal uses to send registration codes to verify new Signal accounts' phone numbers.
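
Those figures imply a per-gigabyte rate for calling bandwidth that is easy to back out (a quick sanity check using only the numbers quoted above):

# Sanity check on the calling-bandwidth figures quoted above.
calling_bandwidth_gb = 20_000_000   # "20 petabytes per year ... or 20 million gigabytes"
calling_cost_per_year = 1_700_000   # "$1.7 million a year"

print(f"${calling_cost_per_year / calling_bandwidth_gb:.3f} per GB")  # about $0.085/GB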


Windows

Windows is Now an App for iPhones, iPads, Macs, and PCs (theverge.com) 57

Microsoft has created a Windows App for iOS, iPadOS, macOS, Windows, and web browsers. From a report: The app essentially takes the previous Windows 365 app and turns it into a central hub for streaming a copy of Windows from a remote PC, Azure Virtual Desktop, Windows 365, Microsoft Dev Box, and Microsoft's Remote Desktop Services.

Microsoft supports multiple monitors through its Windows App, custom display resolutions and scaling, and device redirection for peripherals like webcams, storage devices, and printers. The preview version of the Windows App isn't currently available for Android, though. The Windows App is also limited to Microsoft's range of business accounts, but there are signs it will be available to consumers, too. The sign-in prompt on the Windows App on Windows (yes that's a mouthful) suggests you can access the app using a personal Microsoft Account, but this functionality doesn't work right now.

Earth

The Lego-Like Way To Get CO2 Out of the Atmosphere 200

An anonymous reader quotes a report from the Washington Post: For decades, scientists have tried to figure out ways to reverse climate change by pulling carbon dioxide out of the atmosphere and storing it underground. They've tried using trees, giant machines that suck CO2 out of the sky, complicated ocean methods that involve growing and burying huge quantities of kelp. Companies, researchers and the U.S. government have spent billions of dollars on the research and development of these approaches and yet they remain too expensive to make a substantial dent in carbon emissions. Now, a start-up says it has discovered a deceptively simple way to take CO2 from the atmosphere and store it for thousands of years. It involves making bricks out of smushed pieces of plants. And it could be a game changer for the growing industry working to pull carbon from the air.

Graphyte, a new company incubated by Bill Gates's investment group Breakthrough Energy Ventures, announced Monday that it has created a method for turning bits of wood chips and rice hulls into low-cost, dehydrated chunks of plant matter. Those blocks of carbon-laden plant matter -- which look a bit like shoe-box sized Lego blocks -- can then be buried deep underground for hundreds of years. The approach, the company claims, could store a ton of CO2 for around $100 a ton, a number long considered a milestone for affordably removing carbon dioxide from the air. [...] Graphyte's approach uses the power of plants and trees to photosynthesize and pull carbon dioxide from the air. While trees and plants are excellent at carbon capture, they don't store that carbon for very long -- when a plant burns or decays, its stored carbon comes spilling back out into the air and soil.

Graphyte plans to avoid that decomposition by taking plant waste from timber harvesters and farmers and drying it thoroughly, removing all the microbes that could cause it to decompose and release greenhouse gases. Then, in a process that they call "carbon casting," it will compress the waste and wrap it into Lego-like bricks, for easier storage about 10 feet underground. The company says that with the right monitoring systems, the blocks can stay there for a thousand years. [...] Graphyte is planning to build its first project in Pine Bluff, Ark., and the company hopes to sequester its first carbon for a customer in 2024. It remains to be seen whether Graphyte will be able to scale up its operation to removing millions of tons of CO2 from the atmosphere. The company will need to secure many sources of plant waste and build many small processing centers around the country to be successful.
"The simplicity of the Graphyte approach is so exciting," said Daniel Sanchez, who runs the Carbon Removal Lab at the University of California at Berkeley, and serves as a science adviser for Graphyte. "You don't need very expensive equipment or processes. And it locks up a lot of the carbon in the wood -- nearly all of it."

"People that are academics probably thought about this before and were like, 'That's way too simple,'" Sanchez said, laughing. "'No one's ever going to do that.'"
Data Storage

SanDisk Extreme Pro Failures Result From Design and Manufacturing Flaws, Says Data Recovery Firm (tomshardware.com) 38

Anton Shilov reports via Tom's Hardware: A new report from a data recovery company now points the finger at design and manufacturing flaws as the underlying issue with the recent flood of SanDisk Extreme Pro failures that eventually spurred a class action lawsuit. It became clear in May that some of Western Digital's SanDisk Extreme Pro 4TB SSDs suffered from sudden data loss; at this point, the company promised a firmware update to owners of the 4TB models. However, the 2TB and 3TB models also suffer from the same issue, and Western Digital did not promise any firmware updates for these drives.

Markus Häfele, Managing Director of Attingo, a data recovery company, told FutureZone that the problem lies in hardware, not firmware, which could explain the lack of corrective firmware updates for those models and SanDisk's continued silence about the source of the issues. Attingo, which has been in the data recovery business for over 25 years, normally sees these failed SanDisk Extreme Pro SSDs at least once a week. The problem appears to be rather complex. According to Häfele, the components used in these SSDs are too big for the circuit board, causing weak connections (i.e., high impedance and high temperatures) and making them prone to breaking. He also says that the soldering material used to attach these components is prone to forming bubbles and breaking easily.

It remains unknown whether the cause is cheap solder, the componentry, or both contribute to the issues observed. However, newer revisions of these SanDisk Extreme Pro SSDs seem to have been modified with extra epoxy resin to secure the oversized components. This suggests that Western Digital might know about the hardware problems. Nevertheless, these newer models are still failing, thus sending data recovery service customers to firms like Attingo. According to the head of Attingo, the issue seems to be affecting multiple product lineups, including both SanDisk Extreme Portable SSD as well as the SanDisk Extreme Pro Portable SSD.

Earth

America's First Commercial Carbon-Sucking Facility Opens in California (yahoo.com) 206

"In an open-air warehouse in California's Central Valley, 40-foot-tall racks hold hundreds of trays filled with a white powder that turns crusty as it absorbs carbon dioxide from the sky," reports the New York Times.

"The start-up that built the facility, Heirloom Carbon Technologies, calls it the first commercial plant in the United States to use direct air capture, which involves vacuuming greenhouse gases from the atmosphere." Another plant is operating in Iceland, and some scientists say the technique could be crucial for fighting climate change. Heirloom will take the carbon dioxide it pulls from the air and have the gas sealed permanently in concrete, where it can't heat the planet. To earn revenue, the company is selling carbon removal credits to companies paying a premium to offset their own emissions. Microsoft has already signed a deal with Heirloom to remove 315,000 tons of carbon dioxide from the atmosphere.

The company's first facility in Tracy, California, which opens Thursday, is fairly small. The plant can absorb a maximum of 1,000 tons of carbon dioxide per year, equal to the exhaust from about 200 cars. But Heirloom hopes to expand quickly. "We want to get to millions of tons per year," said Shashank Samala, the company's chief executive. "That means copying and pasting this basic design over and over."

Heirloom's technology hinges on a simple bit of chemistry: Limestone, one of the most abundant rocks on the planet, forms when calcium oxide binds with carbon dioxide. In nature, that process takes years. Heirloom speeds it up. At the California plant, workers heat limestone to 1,650 degrees Fahrenheit in a kiln powered by renewable electricity. Carbon dioxide is released from the limestone and pumped into a storage tank. The leftover calcium oxide, which looks like flour, is then doused with water and spread onto large trays, which are carried by robots onto tower-high racks and exposed to open air. Over three days, the white powder absorbs carbon dioxide and turns into limestone again. Then it's back to the kiln and the cycle repeats. "That's the beauty of this, it's just rocks on trays," Mr. Samala, who co-founded Heirloom in 2020, said.
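
The loop Samala describes is the standard calcination/carbonation cycle; written out (these are textbook reactions, not anything proprietary to Heirloom; 1,650 degrees Fahrenheit is roughly 900 degrees Celsius):

\[ \mathrm{CaCO_3} \;\xrightarrow{\;\approx 900\,^{\circ}\mathrm{C}\;}\; \mathrm{CaO} + \mathrm{CO_2} \quad \text{(calcination, in the kiln)} \]
\[ \mathrm{CaO} + \mathrm{CO_2} \;\longrightarrow\; \mathrm{CaCO_3} \quad \text{(carbonation, over about three days on the trays)} \]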

The hard part, he added, was years of tweaking variables like particle size, tray spacing and moisture to speed up absorption... In future projects, Heirloom also plans to pump carbon dioxide into underground storage wells, burying it.

The company received funding from Microsoft's Climate Innovation Fund and Bill Gates' Breakthrough Energy Ventures, according to Bloomberg, which adds that Heirloom's technology will later "be deployed at a major hub in Louisiana the government expects will remove 1 million tons of CO2 a year by the end of the decade."

The New York Times notes there was also federal funding, something that's been fueling the ambitions of hundreds of carbon-capture startups. "The science is clear," says America's Energy Secretary. "Cutting back carbon emissions through renewable energy alone won't stop the damage from climate change. Direct air capture technology is a game-changing tool that gives us a shot at removing the carbon pollution that has been building in the atmosphere since the Industrial Revolution."
Power

Will Sodium Batteries Become an Alternative To Lithium? (economist.com) 129

Smartphones and electric cars are both powered by lithium-ion batteries, notes the Economist. These "Li-ion" batteries "form the guts of a growing number of grid-storage systems that smooth the flow of electricity from wind and solar power stations. Without them, the electrification needed to avoid the worst effects of global warming would be unimaginable." But unfortunately, building them requires scarce metals.

"A clutch of companies, though, think they have an alternative: making batteries with sodium instead..." And the idea of building "Na-ion" batteries at scale is "gaining traction." Engineers are tweaking designs. Factories, particularly in China, are springing up. For the first time since the Li-ion revolution began, lithium's place on the electrochemical pedestal is being challenged... [A]ccording to Rory McNulty, a research analyst at Benchmark, Chinese firms have 34 Na-ion-battery factories built, being built or announced inside the country, and one planned in Malaysia. Established battery-makers in other places, by contrast, are not yet showing much interest. Even without a five-year plan to guide them, though, some non-Chinese startups are seeking to steal a march by developing alternatives to layered oxides, in the hope of improving the technology, reducing its cost, or both.

One of the most intriguing of these neophytes is Natron Energy, of Santa Clara, California... Natron claims that its cells can endure 50,000 cycles of charging and discharging — between ten and 100 times more than commercial Li-ion batteries can manage. The firm has built a factory in Michigan, which it says will begin production later this year. Other non-Chinese firms are less far advanced, but full of hope. Altris, in Sweden, which is also building a factory, employs a material called Prussian white that substitutes some of the iron in Prussian blue with sodium. Tiamat, in France, uses a polyanionic design involving vanadium. And Faradion, in Britain (now owned by Reliance, an Indian firm), intends to stick with a layered-metal-oxide system.

Thanks to Slashdot reader echo123 for sharing the article.
Android

Google Promises a Rescue Patch For Android 14's 'Ransomware' Bug (arstechnica.com) 33

Google says it'll issue a system update to fix a major storage bug in Android 14 that has caused some users to be locked out of their devices. Ars Technica reports: Apparently one more round of news reports was enough to get the gears moving at Google. Over the weekend the Issue tracker bug has been kicked up from a mid-level "P2" priority to "P0," the highest priority on the issue tracker. The bug has been assigned to someone now, and Googlers have jumped into the thread to make official statements that Google is looking into the matter. Here's the big post from Google on the bug tracker [...]. The highlights here are that Google says the bug affects devices with multiple Android users, not multiple Google accounts or (something we thought originally) users with work profiles. Setting up multiple users means going to the system settings, then "Multiple users," then "Allow multiple users," and you can add a user other than the default one. If you do this, you'll have a user switcher at the bottom of the quick settings. Multiple users all have separate data, separate apps, and separate Google accounts. Child users are probably the most popular reason to use this feature since you can lock kids out of things, like purchasing apps.

Shipping a Google Play system update as a quick Band-Aid is an interesting solution, but as Google's post suggests, this doesn't mean the problem is fixed. Play system updates (these are alternatively called Project Mainline or APEX modules) allow Google to update core system components via the Play Store, but they are really not meant for critical fixes. The big problem is that the Play system updates don't aggressively apply themselves or even let you know they have been downloaded. They just passively, silently wait for a reboot to happen so they can apply. For Pixel users, it feels like the horse has already left the barn anyway -- like most Pixel phones have automatically applied the nearly 13-day-old update by now. Users can force Play system updates to happen themselves by going to the system settings, then "Security & Privacy," then "System & updates," then "Google Play system update." If you have an update, you'll be prompted to reboot the phone. Also note that this differs from the usual OS update checker location, which is in system settings, then "System," then "System update." The system update screen will happily tell you "Your system is up to date" even if you have a pending Google Play system update. It would be great to have a single location for OS updates, Google Play System/Mainline updates, and app updates, but they are scattered everywhere and give conflicting "up to date" messages.

Businesses

Western Digital To Split Flash Memory Business (reuters.com) 10

Western Digital said on Monday it would spin off its flash memory business that has been grappling with a supply glut after talks of merging the unit with Japan's Kioxia stalled. From a report: The split will leave the data storage products maker with its traditional hard-disk drive business and create two publicly traded firms, giving in to demands from activist investor Elliott. The move clears years of uncertainty over Western Digital's flash memory unit, which was built through its $19 billion purchase of SanDisk in 2016 and caters to the smartphone and computer industries. Demand for flash chips has slumped after the pandemic, leaving the market awash in supply and increasing the pressure on chipmakers to consolidate. Since 2021, Western Digital and its manufacturing partner Kioxia have been in talks for a merger that would create a company that controls a third of the global NAND flash market.
