Data Storage

The World's Largest Single-Phase Battery Is Now Up and Running (electrek.co) 64

Meet Crimson Storage, the world's largest single-phase battery, which is now live in the California desert. Electrek reports: Crimson Storage is also the second-largest energy storage project currently in operation of any configuration. The 350 megawatt (MW)/1400 megawatt-hour (MWh) battery storage project, which sits on 2,000 acres west of Blythe in Riverside County, broke ground in 2021. Canadian Solar oversaw construction and provided the battery energy storage systems, and Axium Infrastructure and solar and storage developer Recurrent Energy will be Crimson Storage's long-term owners.

Residential homes are usually served by a single-phase power supply, and this project, on average, is expected to store and dispatch enough electricity to power more than 47,000 homes each year. Crimson Storage holds two long-term contracts with local utilities: a 200 MW/800 MWh 14-year and 10-month contract with Southern California Edison, and a 150 MW/600 MWh 15-year contract with Pacific Gas and Electric.
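A back-of-envelope check makes the "47,000 homes" figure plausible. Assuming one full charge/discharge cycle per day and an average US household consumption of roughly 10.6 MWh per year (both assumptions, not project data):

```python
# Sanity-check the "more than 47,000 homes" claim from the rated capacity.
# Assumptions (not from the project): one full cycle per day, and average
# US household use of ~10.6 MWh per year.
CAPACITY_MWH = 1400          # rated energy capacity of Crimson Storage
CYCLES_PER_DAY = 1           # assumed daily cycling
HOME_MWH_PER_YEAR = 10.6     # assumed average annual household consumption

annual_dispatch_mwh = CAPACITY_MWH * CYCLES_PER_DAY * 365
homes_served = annual_dispatch_mwh / HOME_MWH_PER_YEAR
print(f"{homes_served:,.0f} homes")  # roughly 48,000 under these assumptions
```

Under these assumptions the battery dispatches about 511,000 MWh a year, enough for roughly 48,000 average homes, consistent with the figure quoted above.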

Data Storage

Lost Something? Search Through 91.7 Million Files From the 80s, 90s, and 2000s (arstechnica.com) 57

An anonymous reader quotes a report from Ars Technica: Today, tech archivist Jason Scott announced a new website called Discmaster that lets anyone search through 91.7 million vintage computer files pulled from CD-ROM releases and floppy disks. The files include images, text documents, music, games, shareware, videos, and much more. The files on Discmaster come from the Internet Archive, uploaded by thousands of people over the years. The new site pulls them together behind a search engine with the ability to perform detailed searches by file type, format, source, file size, file date, and many other options.

Discmaster is the work of a group of anonymous history-loving programmers who approached Scott to host it for them. Scott says that Discmaster is "99.999 percent" the work of that anonymous group, right down to the vintage gray theme that is compatible with web browsers for older machines. Scott says he slapped a name on it and volunteered to host it on his site. And while Scott is an employee of the Internet Archive, he says that Discmaster is "100 percent unaffiliated" with that organization.

One of the highlights of Discmaster is that it has already done a lot of file format conversion on the back end, making the vintage files more accessible. For example, you can search for vintage music files -- such as MIDI or even digitized Amiga sounds -- and listen to them directly in your browser without any extra tools necessary. The same thing goes for early-90s low-resolution video files, images in obscure formats, and various types of documents. "It's got all the conversion to enable you to preview things immediately," says Scott. "So there's no additional external installation. That, to me, is the fundamental power of what we're dealing with here."
"The value proposition is the value proposition of any freely accessible research database," Scott told Ars Technica. "People are enabled to do deep dives into more history, reference their findings, and encourage others to look in the same place."

"[Discmaster] is probably, to me, one of the most important computer history research project opportunities that we've had in 10 years," says Scott. "It's not done. They've analyzed 7,000 and some-odd CD-ROMs. And they're about to do another 8,000."
Data Storage

Datacenter Fire Takes Out South Korea's Top Two Web Giants (theregister.com) 10

South Korea's two largest domestic internet companies, Naver and Kakao, have experienced significant service interruptions after the datacenter that hosts much of their infrastructure was shut down by a weekend fire. The Register reports: The datacenter in question is operated by SK C&C, one of the many arms of South Korean conglomerate SK. SK C&C offers a range of cloud and tech infrastructure services, bills itself as a "total digital transformation partner" and operates three datacenters, in which it happily houses client systems. The one in Pangyo, just south of South Korea's capital Seoul, was built in 2014, covers 66,942 square meters, and boasts what SK C&C describes as "Latest/eco-friendly technology". And it caught fire on the weekend. The company has not said what caused the facility to catch fire, nor the extent of the blaze.

But many services from Kakao and Naver were unavailable for many hours at a time, starting from Saturday afternoon. The impact of the outages was wide. Kakao has acknowledged the outage in a blog post that apologizes for the service interruption and slow restoration, and admits that disaster recovery efforts were delayed. The company has created an Emergency Response Committee and three sub-committees -- one to probe the cause of the incident, another to develop disaster countermeasures, and a third to arrange compensation for stakeholders. Naver's announcement admits that "some functions such as search, news, shopping, cafe, blog, open talk, and smart store center had errors." The company says all services have now been restored.

Data Storage

Can DNA Help Us Store Data for 1,000 Years? (bbc.com) 50

"You know you're a nerd when you store DNA in your fridge," Dina Zielinski, a senior scientist in human genomics at the French National Institute of Health and Medical Research, tells the BBC, holding up a tiny vial with a light film at the bottom. But this DNA is special. It does not store the code from a human genome, nor does it come from any animal or virus. Instead, it stores a digital representation of a museum. "That will last easily tens of years, maybe hundreds," says Zielinski.

Research into how we could store digital data inside strands of DNA has exploded over the past decade, in the wake of efforts to sequence the human genome, synthesise DNA and develop gene therapies. Scientists have already encoded films, books and computer operating systems into DNA. Netflix has even used it to store an episode of its 2020 thriller series Biohackers.
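At its simplest, encoding digital data in DNA means mapping bits onto the four bases. The toy codec below packs two bits per nucleotide; it is purely illustrative, since real schemes (such as the Goldman group's rotation-based encoding) add error correction and avoid long runs of the same base, which are hard to synthesize and sequence accurately:

```python
# Toy 2-bits-per-base DNA codec (illustrative only; real DNA storage
# schemes add error correction and avoid homopolymer runs).
B2N = {"00": "A", "01": "C", "10": "G", "11": "T"}
N2B = {v: k for k, v in B2N.items()}

def encode(data: bytes) -> str:
    """Map each byte to four nucleotides (2 bits per base)."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(B2N[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Invert the mapping: four bases back to one byte."""
    bits = "".join(N2B[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"museum")
assert decode(strand) == b"museum"
print(len(strand), "bases")  # 4 bases per byte -> 24 bases for 6 bytes
```

Two bits per base is the theoretical ceiling; practical densities are lower once redundancy and addressing overhead are included.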

The information stored in DNA defines what it is to be human (or any other species for that matter). But many experts argue it offers an incredibly compact, durable and long-lasting form of storage that could replace the many forms of unreliable digital media available, which regularly become defunct and require huge amounts of energy to store. Meanwhile, some researchers are exploring other ways we could store data effectively forever, such as etching information onto incredibly durable glass beads, a modern take on cave drawings.

Even setting aside the energy required to power (and cool) data centers, Zielinski points out that data stored on hard drives "lasts on average maybe 10 to 20 years, maybe 50 if you're lucky and the conditions are perfect." And yet we've already been able to recover DNA from million-year-old woolly mammoths...

Olgica Milenkovic, a professor of electrical and computer engineering at the University of Illinois at Urbana-Champaign, acknowledges that DNA can be damaged by things like humidity, acids, and radiation — "But if it's kept cold and dry, it's good for hundreds of years." And if it's stored in an ice vault, "it can last forever, pretty much." (And unlike floppy disks — DNA-formatted data will never become obsolete.)

It's not the only option. Peter Kazansky, a professor in optoelectronics at the University of Southampton, has created an optical storage technology that etches nano-structures onto glass disks. But Latchesar Ionkov, a computer scientist working on DNA storage at Los Alamos National Laboratory, believes we're just decades away from being able to store the estimated 33 zettabytes of data that humans will have produced by 2025 in a space the size of a ping-pong ball.
Open Source

Pine64 Announces 'Sub-$10, Linux-Capable' SBC - the Ox64 (liliputing.com) 90

Pine64 has announced a new "sub $10 Linux capable single board computer" called the Ox64.

Liliputing says the tiny SBC "looks a lot like a Raspberry Pi Pico. But while Raspberry Pi's tiny board is powered by an RP2040 microcontroller, the Ox64 has a dual-core RISC-V processor, 64MB of embedded RAM, and support for up to 128Mb of flash storage plus a microSD card for additional storage." It's expected to support RTOS and Linux and blurs the lines between a microcontroller and a (very low power) single-board PC. It's expected to go on sale in November with prices starting at $6 for an RTOS-ready version of the board and $8 for a Linux-compatible model.

As spotted by CNX Software earlier this month, the board is designed to be a small, inexpensive single-board computer with a RISC-V processor that's aimed at developers.

Pine64's October update also reveals that their Star64 and QuartzPro64 single-board computers "now boot Linux (and run it well too already!)"
Crime

Prison Inmate Accused of Orchestrating $11 Million Fraud Using Cellphone (theregister.com) 75

An anonymous reader quotes a report from The Register: On June 8, 2020, an individual claiming to be billionaire film producer and philanthropist Sidney Kimmel contacted brokerage Charles Schwab by phone and stated that he had uploaded a wire disbursement form using the service's secure email service. The only problem was the call apparently came from prison. Still, the caller made reference to a transfer verification inquiry earlier that day by his wife -- a role said to have been played by a female co-conspirator. The individual allegedly posing as Kimmel had contacted a Schwab customer service representative three days earlier -- on June 5, 2020 -- about opening a checking account, and was told that a form of identification and a utility bill would be required. On June 6, a co-conspirator is alleged to have provided a picture of Kimmel's driver's license and a Los Angeles Water and Power utility bill. According to court documents [PDF] filed by the US Attorney's Office in the Northern District of Georgia, the uploaded documents consisted of a request for funds to be wired to an external bank and a forged letter of authorization -- both of which appeared to be signed by Kimmel.

On June 9, satisfied that Kimmel had been adequately authenticated, the brokerage sent $11 million from Kimmel's Schwab account to a Zions Bank account for Money Metals Exchange, LLC, an Eagle, Idaho-based seller of gold coins and other precious metals. The real Kimmel had no knowledge of the transaction, which resulted in the purchase of 6,106 American Eagle gold coins. The individual who orchestrated the fraudulent purchase of the coins is alleged to have hired a private security firm on June 13, 2020 to transport the coins from Boise, Idaho to Atlanta, Georgia on a chartered plane. An associate of the fraudster allegedly took possession of the coins three days later. All the while the alleged mastermind, Arthur Lee Cofield Jr, was incarcerated in a maximum security prison in Butts County, Georgia, according to the government. Cofield is serving a 14-year sentence for armed robbery and is also under indictment in Fulton County, Georgia for attempted murder.

The day after the coins were purchased, prison staff are said to have searched Cofield's cell and recovered a blue Samsung cellphone hidden under his arm. The prison forensic unit apparently determined that Cofield had been using an account on free voice and messaging service TextNow and matched the phone number with calls made to Money Metals Exchange. On December 8, 2020, a federal grand jury indicted Cofield and two co-conspirators for conspiracy to commit bank fraud and money laundering. Cofield's attorney, Steven Sadow, subsequently sought to suppress the cellphone evidence on Fourth Amendment grounds, arguing that the warrantless search of the device by prison officials was unrelated to the legitimate function of prison security and maintenance. The government said otherwise, insisting that Cofield does not have standing to contest the search, having no "legitimate expectation of privacy in the contents of a contraband cell phone." The judge overseeing the case sided with the government [PDF] and certified the case to proceed to trial.

Power

GM Created a New Energy Business To Sell Batteries, Solar Panels (theverge.com) 17

General Motors is creating a new energy business to sell batteries, charging equipment, solar panels, and software to residential and commercial customers in a broad-based effort to create a range of accessories that can help sell its lineup of electric vehicles. The Verge reports: The new division, GM Energy, is also a direct shot at Tesla as a major player in renewable energy generation and storage. GM has said it intends to eventually overtake Elon Musk's company in vehicle sales -- and now it wants to challenge it on the energy front as well. Travis Hester, GM's chief EV officer, said the company is making a serious grab for a piece of what is potentially a $120-150 billion market for energy generation and storage products. The aim is to make GM's brand synonymous with not just electric vehicles, but a whole host of products and services in orbit around EVs and their rechargeable lithium-ion batteries.

GM Energy will comprise three units: Ultium Home, Ultium Commercial, and Ultium Charge 360, which is the company's EV charging program. The division will sell a range of products to residential and commercial customers, including bi-directional charging equipment, vehicle-to-home (V2H) and vehicle-to-grid (V2G) equipment, stationary storage, solar products, software applications, cloud management tools, microgrid solutions, and hydrogen fuel cells. GM Energy will also be in the virtual power plant business. Many EVs with high-capacity batteries are being marketed for their ability to serve as backup power in the event of a blackout. (Hester notes that the Chevy Silverado EV, with its 200kWh battery pack, can power an average-sized home for 21 days.) EVs can also feed power back into the grid during times of peak demand. GM Energy will be the entity that sells that power back to the utilities during times of high-energy consumption.

For solar energy, GM is teaming up with San Jose-based SunPower to sell solar panels and home energy storage products to residential customers. SunPower and other partners will supply the solar panels and perform the installations, with GM developing the complementary software. Over time, as GM's battery factories come online and production of its Ultium-branded battery systems ramps up, the company intends to swap in its own battery cells and storage units, Hester said. The automaker also plans to manufacture its own line of backup power generators using its Hydrotec-branded hydrogen fuel cells. (Ultium is the name of GM's electric vehicle battery and powertrain technology. Last year, the company said the Ultium Charge 360 network would be the name given to GM's own vehicle apps and software with a variety of third-party charging services, such as Blink, ChargePoint, EVgo, Flo, Greenlots, and SemaConnect.)
"But much like its approach to EVs, the dates for the launch of these new products are still a ways off in the future," adds The Verge. "GM is still testing its V2H service in partnership with PG&E with a small sample of residential customers in California, and plans on expanding it to more homes in early 2023. And its solar products won't be available until 2024."
Google

The Pixel Watch Is Official: $349, Good Looks, and a Four-Year-Old SoC 78

An anonymous reader quotes a report from Ars Technica: Google is clawing its way back into wearable relevance. Today the company took the wraps off what is officially its first self-branded smartwatch: the Pixel Watch. Google started revamping its wearable platform, Wear OS, in partnership with Samsung. While Wear OS 3, the new version of Google's wearable platform, technically launched with the Galaxy Watch 4 last year, this is the first time we'll be seeing an unskinned version on a real device. First up: prices. Google is asking a lot here, with the Wi-Fi model going for $349 and the LTE version clocking in at $399. The Galaxy Watch 4, which has a better SoC, and the Apple Watch SE, which has a way, way better SoC, both start at $250. Google is creating an uphill battle for itself with this pricing.

Google and Samsung's partnership means the Pixel Watch is running a Samsung Exynos 9110 SoC, with a cheap Cortex M33 co-processor tacked on for low-power watch face updates and 24/7 stat tracking. This SoC is a 10 nm chip with two Cortex A53 cores and an Arm Mali T720 MP1 GPU. If you can't tell from those specs, this is a chip from 2018 that was first used in the original Samsung Galaxy Watch. For whatever reason, Google couldn't get Samsung's new chip from the Galaxy Watch 4, an Exynos W920 (a big upgrade at 5 nm, dual Cortex A55s, and a Mali-G68 MP2 GPU). It's hard to understand why this is so expensive.

The display is a fully circular 1.6-inch OLED with a density of 320 ppi (that should mean around 360 pixels across). The only size available is 41 mm, the cover is Gorilla Glass 5, and the body is stainless steel in silver, black, or gold. It has 2GB of RAM, 32GB of eMMC storage, NFC, GPS, only 2.4 GHz Wi-Fi 802.11n support (Wi-Fi 4), and a 294 mAh battery. For sensors, you get SPO2 blood oxygen, heart rate, and an ECG sensor. It's water-resistant to 5 ATM, which means you're good for submersion, hand washing, and most normal water exposure. Usually 10 ATM is preferred for serious sports swimming, but the Apple Watch is 5 ATM, and Apple does all sorts of swimming promos. Google's black UI background does a good job of hiding exactly how large the display is in relation to the body, but a few screenshots reveal just how big the bezels are around this thing. They are big. Real big. Like, hard-to-imagine-we're-still-doing-this-in-2022 big.
Other things to note: the watch bands are proprietary, it'll be able to charge to 50 percent in 30 minutes, will work with any Android phone running version 8.0 and newer, and features Fitbit integration.

"Unlike the Pixel 7, which is expanding to 17 markets, the Pixel Watch is only for sale in eight countries: the US, Canada, UK, Germany, France, Australia, Japan, and Taiwan," adds Ars. "The watch is up for preorder today and ships October 13."

Further reading: Google Unveils Pixel 7 and Pixel 7 Pro Smartphones
Data Storage

Big Tech, Banks, Government Departments Shred Millions of Storage Devices They Could Reuse (ft.com) 80

Companies such as Amazon and Microsoft, as well as banks, police services and government departments, shred millions of data-storing devices each year, the Financial Times has learnt through interviews with more than 30 people who work in and around the decommissioning industry and via dozens of freedom of information requests. From the report: This is despite a growing chorus of industry insiders who say there is another, better option to safely dispose of data: using computer software to securely wipe the devices before selling them on the secondary market. "From a data security perspective, you do not need to shred," says Felice Alfieri, a European Commission official who co-authored a report about how to make data centres more sustainable and is promoting "data deletion" over device destruction. Underpinning the reluctance to move away from shredding is the fear that data could leak, triggering fury from customers and huge fines from regulators.

Last month, the US Securities and Exchange Commission fined Morgan Stanley $35mn for an "astonishing" failure to protect customer data, after the bank's decommissioned servers and hard drives were sold on without being properly wiped by an inexperienced company it had contracted. This was on top of a $60mn fine in 2020 and a $60mn class action settlement reached earlier this year. Some of the hardware containing bank data ended up being auctioned online. While the incident stemmed from a failure to wipe the devices before selling them on, the bank now mandates that every one of its data-storing devices is destroyed -- the vast majority on site. This approach is widespread. One employee at Amazon Web Services, who spoke on condition of anonymity, explained that the company shreds every single data-storing device once it is deemed obsolete, usually after three to five years of use: "If we let one [piece of data] slip through, we lose the trust of our customers." A person with knowledge of Microsoft's data disposal operations says the company shreds everything at its 200-plus Azure data centres.
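The "data deletion" alternative the Commission official describes amounts to overwriting a device's contents in software before resale. The sketch below shows the file-level version of the idea; it is a minimal illustration, not a certified sanitization procedure, and on SSDs wear leveling means file-level overwrites may never touch the original cells, so real sanitization relies on device-level secure-erase commands or encryption with key destruction:

```python
import os
import secrets

def overwrite_file(path: str, passes: int = 3) -> None:
    """Overwrite a file in place with random data, then remove it.

    Illustrative only: on SSDs, wear leveling can leave the original
    cells untouched, so real-world sanitization uses device-level
    secure-erase commands or full-disk encryption with key destruction
    (see NIST SP 800-88 for the standard guidance).
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # random pass over every byte
            f.flush()
            os.fsync(f.fileno())                # force the pass to disk
    os.remove(path)
```

The debate in the article is precisely whether a verified software wipe like this (done properly, at the device level) is as trustworthy as physical shredding.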

Cloud

Google Picks South Africa For Its First Cloud Region In Africa (techcrunch.com) 16

An anonymous reader quotes a report from TechCrunch: Tech giant Google has today announced the launch of a cloud region in South Africa, its first in the continent, playing catch-up to other top providers like Amazon Web Services (AWS) and Microsoft Azure, which made inroads into the continent a few years ago. Google said it is also building Dedicated Cloud Interconnect sites, which link users' on-premises networks with Google's grid, in Nairobi (Kenya), Lagos (Nigeria) and South Africa (Cape Town and Johannesburg), in its quest to provide full-scale cloud capabilities for its customers and partners in Africa.

Google plans to tap its private subsea cable, Equiano, which connects Africa and Europe, to power the sites. Equiano has been under development since 2019 and has so far made four landings -- in Togo, Namibia, Nigeria and South Africa. South Africa now joins Google's global network of 35 cloud regions and 106 zones worldwide, and the announcement follows the recent preview launch of regions in Malaysia, Thailand and New Zealand. Google Cloud regions allow users to deploy cloud resources from specific geographic locations, and access several services including cloud storage, compute engine and key management systems.

The decision to set up a region in South Africa was informed by the demand for cloud services and the market's potential. Still, the company is looking to launch in more markets within the continent as demand for its products soars. Its early adopters include large enterprise companies, and e-commerce firms like South Africa's TakeAlot and Kenya's Twiga. According to research by AlphaBeta Economics, commissioned by Google Cloud, the South Africa cloud region will contribute over $2.1 billion to South Africa's GDP and support the creation of more than 40,000 jobs by 2030. Google Cloud, Azure by Microsoft and AWS are the three biggest public cloud storage players in the world, according to data from Gartner, but it's unclear why, until now, Google has been absent in Africa.
"We are excited to announce the first Google Cloud region in Africa. The new region will allow for the localization of applications and services. It will make it really easier for our customers and partners to quickly deploy solutions for their businesses, whereby they're able to leverage our computer artificial intelligence or machine learning capabilities, and data analytics to make smarter business decisions as they go forward," said Google Cloud Africa director, Niral Patel.

"What we're doing here is giving customers and partners a choice on where they'd like to store their data and where they'd like to consume cloud services, especially in the context of data sovereignty. This allows customers to then store the data in the country should they choose to do so... I guess for me the most important element is that it gives customers the element of choice."
Google

Universities Adapt To Google's New Storage Fees, Or Migrate Away Entirely 91

united_notions writes: Back in February, Slashdot reported that Google would be phasing out free unlimited storage within Google Apps for Education. Google had a related blog post dressing it up in the exciting language of "empowering institutions" and so forth. Well, now universities all over are waking up to the consequences.

Universities in Korea are scrambling to reduce storage use, or migrating to competitors like Naver, while also collectively petitioning Google on the matter. California State University, Chico has a plan to shoe-horn its storage (and restrict its users) to fit under Google's new limits. UC San Diego is coughing up the fees, apparently under a "favorable" deal, and still with some limits. The University of Cambridge will impose a 20GB per user limit in December 2022. And so on.

If you're at a university, what is your IT crowd telling you? Have they said anything? If not, you may want to ask.
Technology

Magic Leap's Smaller, Lighter Second-Gen AR Glasses Are Now Available (engadget.com) 14

Magic Leap's second take on augmented reality eyewear is available. "The glasses are still aimed at developers and pros, but they include a number of design upgrades that make them considerably more practical -- and point to where AR might be headed," reports Engadget. From the report: The design is 50 percent smaller and 20 percent lighter than the original. It should be more comfortable to wear over long periods, then. Magic Leap also promises better visibility for AR in bright light (think a well-lit office) thanks to "dynamic dimming" that makes virtual content appear more solid. Lens optics supposedly deliver higher quality imagery with easier-to-read text, and the company touts a wider field of view (70 degrees diagonal) than comparable wearables.

You can expect decent power that includes a quad-core AMD Zen 2-based processor in the "compute pack," a 12.6MP camera (plus a host of cameras for depth, eye tracking and field-of-view) and 60FPS hand tracking for gestures. You'll only get 3.5 hours of non-stop use, but the 256GB of storage (the most in any dedicated AR device, Magic Leap claims) provides room for more sophisticated apps.
The base model of the glasses costs $3,299, with the Enterprise model amounting to about $5,000.
Earth

Climate Change Is Turning Trees Into Gluttons (phys.org) 107

Hmmmmmm shares a report from Phys.Org: Trees have long been known to buffer humans from the worst effects of climate change by pulling carbon dioxide from the atmosphere. Now new research shows just how much forests have been bulking up on that excess carbon. The study, recently published in the journal Nature Communications, finds that elevated carbon dioxide levels in the atmosphere have increased wood volume -- or the biomass -- of forests in the United States. Although other factors like climate and pests can somewhat affect a tree's volume, the study found that elevated carbon levels consistently led to an increase of wood volume in 10 different temperate forest groups across the country. This suggests that trees are helping to shield Earth's ecosystem from the impacts of global warming through their rapid growth.

Over the last two decades, forests in the United States have sequestered about 700-800 million tons of carbon dioxide per year, which, according to the study, accounts for roughly 10% to 11% of the country's total carbon dioxide emissions. While exposure to high levels of carbon dioxide can have ill effects on natural systems and infrastructure, trees have no issue gluttoning themselves on Earth's extra supply of the greenhouse gas. To put it in perspective, if you imagine a tree as just a huge cylinder, the added volume the study finds essentially amounts to an extra tree ring. Although such growth may not be noticeable to the average person, compared to the trees of 30 years ago, modern vegetation is about 20% to 30% bigger than it used to be. If applied to the Coast Redwood forests -- home to some of the largest trees in the world -- even a modest percentage increase means a lot of additional carbon storage in forests. Researchers also found that even older large trees continue adding biomass as they age due to elevated carbon dioxide levels.
"Forests are taking carbon out of the atmosphere at a rate of about 13% of our gross emissions," said Brent Sohngen, co-author of the study and professor of environmental and resource economics at The Ohio State University. "While we're putting billions of tons of carbon dioxide into the atmosphere, we're actually taking much of it out just by letting our forests grow."
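The "extra tree ring" analogy above is easy to quantify: treating the trunk as a cylinder of fixed height, volume scales with the radius squared, so even a thin additional ring adds a few percent of volume. The numbers below are hypothetical, chosen only to illustrate the geometry:

```python
# Treating a trunk as a cylinder of fixed height, volume scales with
# radius squared, so one extra ring of width w on a trunk of radius r
# adds ((r + w)^2 / r^2 - 1) of its volume.
def extra_ring_volume_pct(radius_cm: float, ring_cm: float) -> float:
    return ((radius_cm + ring_cm) ** 2 / radius_cm ** 2 - 1) * 100

# Hypothetical trunk: 30 cm radius, 5 mm annual ring.
pct = extra_ring_volume_pct(30.0, 0.5)
print(f"one extra ring adds ~{pct:.1f}% volume")  # ~3.4%
```

Compounded over three decades of rings, single-digit annual gains are consistent with the study's claim that modern trees carry 20% to 30% more biomass than their counterparts of 30 years ago.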
Media

Better Than JPEG? Researcher Discovers That Stable Diffusion Can Compress Images (arstechnica.com) 93

An anonymous reader quotes a report from Ars Technica: Last week, Swiss software engineer Matthias Buhlmann discovered that the popular image synthesis model Stable Diffusion could compress existing bitmapped images with fewer visual artifacts than JPEG or WebP at high compression ratios, though there are significant caveats. Stable Diffusion is an AI image synthesis model that typically generates images based on text descriptions (called "prompts"). The AI model learned this ability by studying millions of images pulled from the Internet. During the training process, the model makes statistical associations between images and related words, making a much smaller representation of key information about each image and storing them as "weights," which are mathematical values that represent what the AI image model knows, so to speak.

When Stable Diffusion analyzes and "compresses" images into weight form, they reside in what researchers call "latent space," which is a way of saying that they exist as a sort of fuzzy potential that can be realized into images once they're decoded. With Stable Diffusion 1.4, the weights file is roughly 4GB, but it represents knowledge about hundreds of millions of images. While most people use Stable Diffusion with text prompts, Buhlmann cut out the text encoder and instead forced his images through Stable Diffusion's image encoder process, which takes a low-precision 512x512 image and turns it into a higher-precision 64x64 latent space representation. At this point, the image exists at a much smaller data size than the original, but it can still be expanded (decoded) back into a 512x512 image with fairly good results.
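The raw size arithmetic shows why the latent representation is so much smaller. This is illustrative only; the file sizes Buhlmann reports also reflect how aggressively the latent values are quantized and entropy-coded:

```python
# Rough size arithmetic for Stable Diffusion's latent "compression"
# (illustrative; actual reported file sizes also depend on quantization
# and entropy coding of the latent values).
raw_bytes = 512 * 512 * 3     # source image: 8-bit RGB bitmap
latent_bytes = 64 * 64 * 4    # 4-channel latent, assuming ~1 byte per value
print(raw_bytes, latent_bytes, raw_bytes / latent_bytes)
# 786432 16384 48.0 -> roughly a 48x reduction before any further coding
```

The catch, as the article notes, is that decoding that 16KB latent back into a viewable image requires the full 4GB weights file.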

While running tests, Buhlmann found that images compressed with Stable Diffusion looked subjectively better at higher compression ratios (smaller file size) than JPEG or WebP. In one example, he shows a photo of a candy shop that is compressed down to 5.68KB using JPEG, 5.71KB using WebP, and 4.98KB using Stable Diffusion. The Stable Diffusion image appears to have more resolved details and fewer obvious compression artifacts than those compressed in the other formats. Buhlmann's method currently comes with significant limitations, however: It's not good with faces or text, and in some cases, it can actually hallucinate detailed features in the decoded image that were not present in the source image. (You probably don't want your image compressor inventing details in an image that don't exist.) Also, decoding requires the 4GB Stable Diffusion weights file and extra decoding time.
Buhlmann's code and technical details about his findings can be found on Google Colab and Towards AI.
Data Storage

Small Dongle Brings the HDD Clicking Back To SSDs In Retro PCs (hackaday.com) 117

Longtime Slashdot reader root_42 writes: Remember the clicking sounds of spinning hard disks? One "problem" with retro computing is that we replace those disks with compact flash, SD cards or even SSDs. Those do not make any noises that you can hear under usual circumstances, which is partly nice because the computer becomes quieter, but also irritating because sometimes you can't tell if the computer has crashed or is still working. This little device fixes that issue! It's called the HDD Clicker and it's a unique little gadget. "An ATtiny and a few support components ride on a small PCB along with a piezoelectric speaker," describes Hackaday. "The dongle connects to the hard drive activity light, which triggers a series of clicks from the speaker that sound remarkably like a hard drive head seeking tracks."

A demo of the device can be viewed at 7:09, with a full defragmentation at 13:11.
Power

When's the Best Time To Charge Your EV? Not at Night, Stanford Study Finds (stanford.edu) 190

The vast majority of electric vehicle owners charge their cars at home in the evening or overnight. We're doing it wrong, according to a new Stanford study. From the report: In March, the research team published a paper on a model they created for charging demand that can be applied to an array of populations and other factors. In the new study, published Sept. 22 in Nature Energy, they applied their model to the whole of the Western United States and examined the stress the region's electric grid will come under by 2035 from growing EV ownership. In a little over a decade, they found, rapid EV growth alone could increase peak electricity demand by up to 25%, assuming a continued dominance of residential, nighttime charging. To limit the high costs of all that new capacity for generating and storing electricity, the researchers say, drivers should move to daytime charging at work or public charging stations, which would also reduce greenhouse gas emissions. This finding has policy and investment implications for the region and its utilities, especially since California moved in late August to ban sales of gasoline-powered cars and light trucks starting in 2035. [...]

Current time-of-use rates encourage consumers to switch electricity use to nighttime whenever possible, like running the dishwasher and charging EVs. This rate structure reflects the time before significant solar and wind power supplies when demand threatened to exceed supply during the day, especially late afternoons in the summer. Today, California has excess electricity during late mornings and early afternoons, thanks mainly to its solar capacity. If most EVs were to charge during these times, then the cheap power would be used instead of wasted. Alternatively, if most EVs continue to charge at night, then the state will need to build more generators -- likely powered by natural gas -- or expensive energy storage on a large scale. Electricity going first to a huge battery and then to an EV battery loses power from the extra stop. At the local level, if a third of homes in a neighborhood have EVs and most of the owners continue to set charging to start at 11 p.m. or whenever electricity rates drop, the local grid could become unstable.
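The midday-surplus argument can be illustrated with a toy model. All numbers below are made up for illustration (the study models the real Western grid); the point is only that a fleet charging during solar-surplus hours needs no new generation, while the same fleet charging at night must be covered entirely by new generators or storage.

```python
# Illustrative-only hourly solar surplus (MW) for a toy grid. Hours not
# listed have zero surplus (e.g. overnight). An EV fleet draws 60 MW
# during each of its five charging hours, i.e. 300 MWh/day total.
solar_surplus = {10: 120, 11: 150, 12: 160, 13: 150, 14: 120}

def extra_generation_needed(charge_hours, mw_per_hour):
    """MWh of charging demand not covered by surplus solar, which would
    have to come from new generators or grid-scale storage."""
    shortfall = 0
    for h in charge_hours:
        surplus = solar_surplus.get(h, 0)
        shortfall += max(0, mw_per_hour - surplus)
    return shortfall

daytime = extra_generation_needed([10, 11, 12, 13, 14], 60)  # fully covered
night = extra_generation_needed([23, 0, 1, 2, 3], 60)        # nothing covered
print(daytime, night)  # -> 0 300
```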

Another issue with electricity pricing design is charging commercial and industrial customers big fees based on their peak electricity use. This can disincentivize employers from installing chargers, especially once half or more of their employees have EVs. The research team compared several scenarios of charging infrastructure availability, along with several different residential time-of-use rates and commercial demand charges. Some rate changes made the situation at the grid level worse, while others improved it. Nevertheless, a scenario of having charging infrastructure that encourages more daytime charging and less home charging provided the biggest benefits, the study found.
"The findings from this paper have two profound implications: the first is that the price signals are not aligned with what would be best for the grid -- and for ratepayers. The second is that it calls for considering investments in a charging infrastructure for where people work," said Ines Azevedo, one of the co-senior authors of the study.

"We need to move quickly toward decarbonizing the transportation sector, which accounts for the bulk of emissions in California," Azevedo continued. "This work provides insight on how to get there. Let's ensure that we pursue policies and investment strategies that allow us to do so in a way that is sustainable."
The Internet

Neal Stephenson's Lamina1 Drops White Paper On Building the Open Metaverse (venturebeat.com) 49

An anonymous reader quotes a report from VentureBeat: Neal Stephenson's Lamina1 blockchain technology startup dropped a white paper today on building the open metaverse. It's quite the manifesto. In the document, the company said its mission is to deliver a Layer 1 blockchain, interoperating tools and decentralized services optimized for the open metaverse -- providing communities with infrastructure, not gatekeepers, to build a more immersive internet. The effort includes some new original content: Under active early-stage development, Neal Stephenson's THEEE METAVERSE promises a richly imagined interactive virtual world with an unforgettable origin story, the paper said. Built on the Lamina1 chain, creators will come to experience Neal's vision and stay to develop their own. Stay tuned for more details, the paper said. [...]

In the paper, Stephenson said, "Inexorable economic forces drive investors to pay artists as little as possible while steering their creative output in the directions that involve the least financial risk." The aim is to correct the sins of the past. The paper said that Web2 introduced a period of rapid innovation and unprecedented access to entertainment, information and goods on a global scale. Streamlined tools and usability brought creators and innovators to the web en masse to build digital storefronts, engage and transact with their customers. Owning and controlling that growing ecosystem of content and personal data became a primary, lucrative initiative for major corporations. Consumer behavior, recorded on centralized company servers, offered constant, privileged insight into how to monetize human emotion and attention, Lamina1 said. At its best, Web3 envisions a better world through the thoughtful redesigning of our online lives, instituting stronger advocacy for our interests, our freedom and our rights, the company said. Much as Web2 flourished with the maturity of tools and services that offered creators and consumers ease of use, the open metaverse will benefit from open protocols for payments and data, and a set of interoperating decentralized services to support virtual worlds. Lamina1 will be the rallying point for an ecosystem of open source tools, open standards and enabling technologies conceived and co-developed with a vibrant community of creators. [...]

Lamina1 said it approaches the open metaverse with a multi-pronged approach: Layer 1 blockchain, metaverse-as-a-Service (MaaS), community economic participation and incentives, and original content. Lamina1 said it uses a high-speed Proof-of-Stake (PoS) consensus algorithm, customized to support the needs of content creators -- providing provenance for creatorship and enabling attributive and behavioral characteristics of an object to be minted, customized and composed on-chain. "We chose to start with Avalanche, a robust generalized blockchain that delivers the industry's most scalable and environmentally-efficient chain for managing digital assets to date. This starting point provides Lamina1 with a flexible architecture and an extendable platform to support our goals in data storage, interoperability, integration incentives, carbon-negative operation, messaging, privacy, high-scale payments and identity," the white paper said. Lamina1 said its metaverse services work will explore creating a metaverse browser and it will align itself with the Metaverse Standards Forum.
To enlist community support, the company isn't aligning with Big Tech. "We march waving the pirate flag at the front of the cultural movement, asking both creators and consumers to join the fight for greater agency and ownership -- the fight for an economy that is imagined, produced and owned by its creators," Lamina1 said. "It's going to be hard, and it's going to take heart, but the upside of providing a maker direct access to their market is staggering."

The paper added, "At Lamina1, we believe two things will power expansion and growth in the metaverse -- a straightforward and principled approach to serving a diverse, open and self-sustaining community of makers, and a powerful ecosystem of content and experiences that will drive fans and funding directly to the platform."
Earth

The World's Largest Carbon Removal Project Yet Is Headed For Wyoming (theverge.com) 76

A couple of climate tech startups plan to suck a hell of a lot of carbon dioxide out of the air and trap it underground in Wyoming. The Verge reports: The goal of the new endeavor, called Project Bison, is to build a new facility capable of drawing down 5 million metric tons of carbon dioxide annually by 2030. The CO2 can then be stored deep within the Earth, keeping it out of the atmosphere, where it would have continued to heat up the planet. A Los Angeles-based company called CarbonCapture is building the facility, called a direct air capture (DAC) plant, that is expected to start operations as early as next year. It'll start small and work up to 5 million metric tons a year. If all goes smoothly, by 2030 the operation will be orders of magnitude larger than existing direct air capture projects.

CarbonCapture's equipment is modular, which is what the company says makes the technology easy to scale up. The plant itself will be made of modules that look like stacks of shipping containers with vents that air passes through. At first, the modules used for Project Bison will be made at CarbonCapture's headquarters in Los Angeles. In the first phase of the project, expected to be completed next year, around 25 modules will be deployed in Wyoming. Those modules will collectively have the capacity to remove about 12,000 tons of CO2 a year from the air. The plan is to deploy more modules in Wyoming over time and potentially manufacture the modules there one day, too.

Inside each of the 40-foot modules are about 16 "reactors" with "sorbent cartridges" that essentially act as filters that attract CO2. The filters capture about 75 percent of the CO2 from the air that passes over them. Within about 30 to 40 minutes, the filters have absorbed all the CO2 they can. Once the filters are fully saturated, the reactor goes offline so that the filters can be heated up to separate out the CO2. There are many reactors within one module, each running at its own pace so that they're constantly collecting CO2. Together, they generate concentrated streams of CO2 that can then be compressed and sent straight to underground wells for storage. DAC is still very expensive -- it can cost upwards of $600 to capture a ton of carbon dioxide. That figure is expected to come down with time as the technology advances. But for now, it takes a lot of energy to run DAC plants, which contributes to the big price tag. The filters need to reach around 85 degrees Celsius (185 degrees Fahrenheit) for a few minutes, and getting to those kinds of high temperatures for DAC plants can get pretty energy-intensive. Eventually, [...] Bison plans to get enough power from new wind and solar installations. When the project is running at its full capacity in 2030, it's expected to use the equivalent of about 2 GW of solar generating capacity. For comparison, about 3 million photovoltaic panels together generate a gigawatt of solar energy, according to the Department of Energy. But initially, the energy used by Project Bison might have to come from natural gas, according to Corless. So Bison would first need to capture enough CO2 to cancel out the amount of emissions it generates by burning through that gas before it can go on to reduce the amount of CO2 in the atmosphere.
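The scale-up implied by the figures above is worth spelling out. Taking the article's numbers at face value (25 first-phase modules capturing about 12,000 tons/year, a 5-million-ton target, and the quoted upper-bound cost of $600/ton), simple division gives the per-module capacity and the module count the full target would require, all else being equal:

```python
# Scaling arithmetic from the figures quoted in the article. These are
# rough extrapolations, not Project Bison's actual deployment plan.
tons_per_module = 12_000 / 25                     # ~480 tons/module/year
modules_for_target = 5_000_000 / tons_per_module  # modules for 5M tons/year
print(f"{tons_per_module:.0f} tons per module per year")
print(f"~{modules_for_target:,.0f} modules for the 5M-ton target")

# At the quoted $600/ton upper bound, capturing 5M tons/year would cost:
annual_cost = 5_000_000 * 600
print(f"${annual_cost / 1e9:.1f}B per year at $600/ton")
```

That works out to roughly 10,000 modules and about $3 billion a year at today's upper-bound cost, which is why the per-ton price coming down matters so much to the project's economics.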
"The geology in Wyoming allows Project Bison to store the captured CO2 on-site near the modules," adds The Verge. "Project Bison plans to permanently store the CO2 it captures underground. Specifically, project leaders are looking at stowing it 12,000 feet underground in 'saline aquifers' -- areas of rock that are saturated with salt water."
Google

Google Partners With Framework To Launch Upgradable and Customizable Chromebook (theverge.com) 14

Framework and Google have announced the new Framework Laptop Chromebook Edition. As the name implies, this is an upgradable, customizable Chromebook from the same company that put out the Framework laptop last year. From a report: User-upgradable laptops are rare enough already, but user-upgradable Chromebooks are nigh unheard of. While the size of the audience for such a device may remain to be seen, it's certainly a step in the right direction for repairability in the laptop space as a whole. Multiple parts of the Framework are user-customizable, though it's not clear whether every part that's adjustable on the Windows Framework can be adjusted on the Chromebook as well. Each part has a QR code on it which, if scanned, brings up the purchase page for the part's replacement. Most excitingly (to me), the Chromebook Edition includes the same expansion card system as the Windows edition, meaning you can choose the ports you want and where to put them. I don't know of any other laptop, Windows or Chrome OS, where you can do this, and it's easily my personal favorite part of Framework's model. You can choose between USB-C, USB-A, microSD, HDMI, DisplayPort, Ethernet, high-speed storage, "and more," per the press release. HDMI, in particular, is a convenient option to have on a Chromebook.
Data Storage

Morgan Stanley Hard Drives With Client Data Turn Up On Auction Site (nytimes.com) 70

An anonymous reader quotes a report from the New York Times: Morgan Stanley Smith Barney has agreed to pay a $35 million fine to settle claims that it failed to protect the personal information of about 15 million customers, the Securities and Exchange Commission said on Tuesday. In a statement announcing the settlement, the S.E.C. described what it called Morgan Stanley's "extensive failures," over a five-year period beginning in 2015, to safeguard customer information, in part by not properly disposing of hard drives and servers that ended up for sale on an internet auction site.

On several occasions, the commission said, Morgan Stanley hired a moving and storage company with no experience or expertise in data destruction services to decommission thousands of hard drives and servers containing the personal information of millions of its customers. The moving company then sold thousands of the devices to a third party, and the devices were then resold on an unnamed internet auction site, the commission said. An information technology consultant in Oklahoma who bought some of the hard drives on the internet chastised Morgan Stanley after he found that he could still access the firm's data on those devices.

Morgan Stanley is "a major financial institution and should be following some very stringent guidelines on how to deal with retiring hardware," the consultant wrote in an email to Morgan Stanley in October 2017, according to the S.E.C. The firm should, at a minimum, get "some kind of verification of data destruction from the vendors you sell equipment to," the consultant wrote, according to the S.E.C. Morgan Stanley eventually bought the hard drives back from the consultant. Morgan Stanley also recovered some of the other devices that it had improperly discarded, but has not recovered the "vast majority" of them, the commission said.
The settlement also notes that Morgan Stanley "had not properly disposed of consumer report information when it decommissioned servers from local offices and branches as part of a 'hardware refresh program' in 2019," reports the Times. "Morgan Stanley later learned that the devices had been equipped with encryption capability, but that it had failed to activate the encryption software for years, the commission said."
