Science

The Coronavirus in a Tiny Drop (nytimes.com) 63

To better understand the coronavirus's journey from one person to another, a team of 50 scientists has for the first time created an atomic simulation of the coronavirus nestled in a tiny airborne drop of water. From a report: To create the model, the researchers needed one of the world's biggest supercomputers to assemble 1.3 billion atoms and track all their movements down to less than a millionth of a second. This computational tour de force is offering an unprecedented glimpse at how the virus survives in the open air as it spreads to a new host. "Putting a virus in a drop of water has never been done before," said Rommie Amaro, a biologist at the University of California San Diego who led the effort, which was unveiled at the International Conference for High Performance Computing, Networking, Storage and Analysis last month. "People have literally never seen what this looks like."

How the coronavirus spreads through the air became the subject of fierce debate early in the pandemic. Many scientists championed the traditional view that most of the virus's transmission was made possible by larger drops, often produced in coughs and sneezes. Those droplets can travel only a few feet before falling to the floor. But epidemiological studies showed that people with Covid-19 could infect others at a much greater distance. Even just talking without masks in a poorly ventilated indoor space like a bar, church or classroom was enough to spread the virus. Those findings pointed to much smaller drops, called aerosols, as important vehicles of infection. Scientists define droplets as having a diameter greater than 100 micrometers, or about 4 thousandths of an inch. Aerosols are smaller -- in some cases so small that only a single virus can fit inside them. And thanks to their minuscule size, aerosols can drift in the air for hours.

Space

The Largest Comet We've Ever Seen Just Delivered a Curious Surprise (sciencealert.com) 18

schwit1 shares a report from ScienceAlert: The comet Bernardinelli-Bernstein (BB) -- the largest our telescopes have ever spotted -- is on a journey from the outer reaches of our Solar System that will see it flying relatively close to Saturn's orbit. Now, a new analysis of the data we've collected on BB has revealed something rather surprising. Digging into readings logged by the Transiting Exoplanet Survey Satellite (TESS) between 2018 and 2020, researchers have discovered that BB became active much earlier, and much farther out from the Sun, than was previously thought.

A comet becomes active when light from the Sun heats its icy surface, turning ice to vapor and releasing trapped dust and grit. The resulting haze, called a coma, can be useful for astronomers in working out exactly what a particular comet is made out of. In the case of BB, it's still too far out for water to sublimate. Based on studies of comets at similar distances, it's likely that the emerging fog is driven instead by a slow release of carbon monoxide. Only one active comet has previously been directly observed at a greater distance from the Sun, and it was much smaller than BB.
"These observations are pushing the distances for active comets dramatically farther than we have previously known," says astronomer Tony Farnham, from the University of Maryland (UMD). "We make the assumption that comet BB was probably active even farther out, but we just didn't see it before this. What we don't know yet is if there's some cut-off point where we can start to see these things in cold storage before they become active."

The research has been published in the Planetary Science Journal.

Data Storage

Microsoft Makes Breakthrough In the Quest To Use DNA As Data Storage (gizmodo.com) 43

An anonymous reader quotes a report from Gizmodo: Microsoft, one of the pioneers of DNA storage, is making some headway, working with the University of Washington's Molecular Information Systems Laboratory, or MISL. The company announced in a new research paper the first nanoscale DNA storage writer, which the research group expects to scale to a DNA write density of 25 x 10^6 sequences per square centimeter, or "three orders of magnitude" (1,000x) denser than before. What makes this particularly significant is that it's the first indication of achieving the minimum write speeds required for DNA storage.

Microsoft is one of the biggest players in cloud storage and is looking at DNA data storage to gain an advantage over the competition by using its unparalleled density, sustainability, and shelf life. DNA is said to have a density capable of storing one exabyte, or 1 billion gigabytes, per square inch -- an amount many orders of magnitude larger than what our current best storage method, Linear Tape-Open (LTO) magnetic tape, can provide. What do these advantages mean in real-world terms? Well, the International Data Corporation predicts data storage demands will reach nine zettabytes by 2024. As Microsoft notes, only one zettabyte of storage would be used if Windows 11 were downloaded on 15 billion devices. Using current methods, that data would need to be stored on millions of tape cartridges. Cut the tape and use DNA, and nine zettabytes of information can be stored in an area as small as a refrigerator (some scientists say every movie ever released could fit in the footprint of a sugar cube). But perhaps a freezer would be a better analogy, because data stored on DNA can last for thousands of years, whereas data loss occurs on tape within 30 years and even sooner on SSDs and HDDs.
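
As a rough sanity check of the figures quoted above, the arithmetic below (a hedged sketch assuming decimal units, not taken from the report) works out what they imply per device and per unit of area:

```python
# Back-of-envelope check of the figures quoted above (illustrative only;
# decimal units assumed: 1 EB = 10**18 bytes, 1 ZB = 10**21 bytes).
EXABYTE = 10**18
ZETTABYTE = 10**21

# "only one zettabyte ... if Windows 11 were downloaded on 15 billion devices"
per_device_gb = ZETTABYTE / 15e9 / 1e9
print(f"Implied Windows 11 footprint: ~{per_device_gb:.0f} GB per device")

# "one exabyte ... per square inch" -> flat area needed for nine zettabytes
area_sq_in = 9 * ZETTABYTE / EXABYTE
print(f"9 ZB at 1 EB per square inch: {area_sq_in:,.0f} square inches "
      f"(~{area_sq_in * 6.4516 / 10_000:.1f} square meters if laid flat)")
```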

Finding ways to increase write speeds addresses one of the two main problems with DNA storage (the other being cost). With the minimum write speed threshold within grasp, Microsoft is already pushing ahead with the next phase. "A natural next step is to embed digital logic in the chip to allow individual control of millions of electrode spots to write kilobytes per second of data in DNA, and we foresee the technology reaching arrays containing billions of electrodes capable of storing megabytes per second of data in DNA. This will bring DNA data storage performance and cost significantly closer to tape," Microsoft told TechRadar.

United States

Wanted: A Town Willing to Host a Dump for U.S. Nuclear Waste (bloomberg.com) 335

The Biden administration is looking for communities willing to serve as temporary homes for tens of thousands of metric tons of nuclear waste currently stranded at power plants around the country. Bloomberg reports: The Energy Department filed (PDF) a public notice Tuesday that it is restarting the process for finding a voluntary host for spent nuclear fuel until a permanent location is identified. "Hearing from and then working with communities interested in hosting one of these facilities is the best way to finally solve the nation's spent nuclear fuel management issues," Energy Secretary Jennifer Granholm said in a statement. The agency, in its notice, requested input on how to proceed with a "consent-based" process for a federal nuclear storage facility, including what benefits could entice local and state governments and how to address potential impediments. Federal funding is also possible, the notice said. Approximately 89,000 metric tons of nuclear waste is being stored at dozens of nuclear power plants and other sites around the country.
[...]
One such interim storage site could be in Andrews, Texas. The Nuclear Regulatory Commission in September approved a license for a proposal by Orano CIS LLC and its joint venture partner, J.F. Lehman & Co.'s Waste Control Specialists LLC, to establish a repository in the heart of Texas' Permian Basin oil fields for as many as 40,000 metric tons of radioactive waste. The joint venture envisioned having nuclear waste shipped by rail from around the country and sealed in concrete casks where it would be stored above ground at a site about 30 miles (48.28 kilometers) from Andrews. But the plan has drawn opposition from Texas authorities and local officials who once embraced it as an economic benefit but have since had a change of heart. A similar nuclear waste storage project, proposed in New Mexico by Holtec International Corp., is awaiting approval by the Nuclear Regulatory Commission. The agency said it expects to make a decision on that proposal in January 2022.

Earth

The World Needs To Crack Battery Recycling, Fast (wired.co.uk) 97

As batteries start to pile up, carmakers, battery companies and researchers are trying to save them from ending up in landfills. From a report: Recyclers are primarily interested in extracting the valuable metals and minerals in the cells. Getting to these materials is complex and dangerous: After removing the steel casing, the battery pack needs to be carefully unbundled into cells, to avoid puncturing them and releasing hazardous materials. The electrolyte, a liquid whose job it is to move lithium ions between the cathode and anode, can catch fire or even explode if heated. Only once the pack has been dismantled can recyclers safely extract the conductive lithium, nickel, copper, and cobalt.

Used in the cathode, cobalt is the most sought-after material in batteries. In its raw form, the rare, bluish-grey metal is predominantly sourced from the Democratic Republic of Congo, where miners work in perilous conditions. The world's major electric car manufacturers are already moving away from cobalt, deterred by the human rights abuses and shortages in the supply chain. That raises the question of whether recyclers will still find it worthwhile to dismantle newer battery types lacking the most valuable ingredients. "When you move to more sustainable materials, and lower cost materials, the incentive to recycle and recover them diminishes," says Jenny Baker, an energy storage expert at Swansea University. She likens this to a dilemma in consumer electronics: It is often cheaper to buy a new mobile phone than to get it fixed or recycled.

[...] In a first step, recyclers typically shred the cathode and anode materials of spent batteries into a powdery mixture, the so-called black mass. In the board game analogy (snakes and ladders), this would be the first slide down a snake, explains Gavin Harper, a research fellow at the University of Birmingham. The black mass can then be processed in one of two ways to extract its valuable components. One method, called pyrometallurgy, involves smelting the black mass in a furnace powered by fossil fuels. It's a relatively cheap method, but a lot of lithium, aluminium, graphite and manganese is lost in the process. Another method, hydrometallurgy, leaches the metals out of the black mass by dissolving it in acids and other solvents. This method, Harper says, would correspond to a shorter snake in the board game, because more material can be recovered: you fall back, but not by as many squares as when using pyrometallurgy. The process, however, consumes a lot of energy and produces toxic gases and wastewater.

Piracy

Is 'The NFT Bay' Just a Giant Hoax? (clubnft.com) 74

Recently Australian developer Geoffrey Huntley announced they'd created a 20-terabyte archive of all NFTs on the Ethereum and Solana blockchains.

But one NFT startup company now says they tried downloading the archive — and discovered most of it was zeroes. Many of the articles are careful to point out "we have not verified the contents of the torrent," because of course they couldn't. A 20TB torrent would take several days to download, necessitating a pretty beefy internet connection and more disk space to store than most people have at their disposal. We at ClubNFT fired up a massive AWS instance with 40TB of EBS disk space to attempt to download this, with a cost estimate of $10k-20k over the next month, as we saw this torrent as potentially an easy way to pre-seed our NFT storage efforts — not many people have these resources to devote to a single news story.

Fortunately, we can save you the trouble of downloading the entire torrent — all you need is about 10GB. Download the first 10GB of the torrent, plus the last block, and you can fill in all the rest with zeroes. In other words, it's empty; and no, Geoff did not actually download all the NFTs. Ironically, Geoff has archived all of the media articles about this and linked them on TheNFTBay's site, presumably to preserve an immutable record of the spread and success of his campaign — kinda like an NFT...
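
The kind of spot check described above (download the first stretch and the last block, then confirm the remainder is zero-filled) can be approximated locally with a scan like the hedged sketch below; this is not ClubNFT's actual tooling, and the file name and block size are hypothetical placeholders.

```python
# Minimal sketch: report whether a byte range of a local file is entirely zeroes.
# Not ClubNFT's tooling; "payload.bin" and the 1 MiB block size are placeholders.
def region_is_all_zero(path: str, start: int, end: int, block: int = 1 << 20) -> bool:
    with open(path, "rb") as f:
        f.seek(start)
        remaining = end - start
        while remaining > 0:
            chunk = f.read(min(block, remaining))
            if not chunk:
                break                      # reached end of file early
            if any(chunk):                 # any nonzero byte means real data
                return False
            remaining -= len(chunk)
    return True

# e.g. check everything after the first 10 GB of a partially downloaded payload
print(region_is_all_zero("payload.bin", start=10 * 10**9, end=20 * 10**12))
```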

We were hoping this was real... [I]t is actually rather complicated to correctly download and secure the media for even a single NFT, nevermind trying to do it for every NFT ever made. This is why we were initially skeptical of Geoff's statements. But even if he had actually downloaded all the NFT media and made it available as a torrent, this would not have solved the problem... a torrent containing all the NFTs does nothing to actually make those NFTs available via IPFS, which is the network they must be present on in order for the NFTs to be visible on marketplaces and galleries....

[A]nd this is a bit in the weeds: in order to reupload an NFT's media to IPFS, you need more than just the media itself. In order to restore a file to IPFS so it can continue to be located by the original link embedded in the NFT, you must know exactly the settings used when that file was originally uploaded, and potentially even the exact version of the IPFS software used for the upload.
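
To illustrate why those settings matter, here is a deliberately simplified sketch; it is not the real IPFS CID computation (which involves multihashes, CID versions, and a DAG layout), but it shows that a content address derived from chunked data changes when the chunk size changes, even though the underlying bytes are identical.

```python
# Toy illustration only; NOT how IPFS actually computes CIDs.
# A chunking-dependent content address changes with the upload parameters
# even when the file's bytes stay exactly the same.
import hashlib

def toy_root_digest(data: bytes, chunk_size: int) -> str:
    # Hash each chunk, then hash the concatenation of the chunk hashes.
    chunk_hashes = [
        hashlib.sha256(data[i:i + chunk_size]).digest()
        for i in range(0, len(data), chunk_size)
    ]
    return hashlib.sha256(b"".join(chunk_hashes)).hexdigest()

media = b"\x89PNG fake image bytes " * 100_000   # stand-in for an NFT's media

print(toy_root_digest(media, chunk_size=256 * 1024))    # one set of settings
print(toy_root_digest(media, chunk_size=1024 * 1024))   # different settings,
                                                        # different address
```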

For these reasons and more, ClubNFT is working hard on an actual solution to ensure that everybody's NFTs can be safely secured by the collectors themselves. We look forward to providing more educational resources on these and other topics, and welcome the attention that others, like Geoff, bring to these important issues.

Their article was shared by a Slashdot reader (who is one of ClubNFT's three founders). I'd wondered suspiciously if ClubNFT was a hoax, but if this PR Newswire press release is legit, they've raised $3 million in seed funding. (And that does include an investment from Draper Dragon, co-founded by Tim Draper, which shows up on Crunchbase). The International Business Times has also covered ClubNFT, identifying it as a startup whose mission statement is "to build the next generation of NFT solutions to help collectors discover, protect, and share digital assets." Co-founder and CEO Jason Bailey said these next-generation tools are in their "discovery" phase, and one of the first set of tools that is designed to provide a backup solution for NFTs will roll out early next year. Speaking to International Business Times, Bailey said, "We are looking at early 2022 to roll out the backup solution. But between now and then we should be feeding (1,500 beta testers) valuable information about their wallets." Bailey says while doing the beta testing, he realized that there are loopholes in the NFT storage systems and only 40% of the NFTs were actually pointing to IPFS, while 40% of them were at risk — pointing to private servers.

Here is the problem explained: NFTs are basically a collection of metadata, that define the underlying property that is owned. Just like in the world of internet documents, links point to the art and any details about it that are being stored. But links can break, or die. Many NFTs use a system called InterPlanetary File System, or IPFS, which let you find a piece of content as long as it is hosted somewhere on the IPFS network. Unlike in the world of internet domains, you don't need to own the domain to really make sure the data is safe. Explaining the problem which the backup tool will address, Bailey said, "When you upload an image to IPFS, it creates a cryptographic hash. And if someone ever stops paying to store that image on IPFS, as long as you have the original image, you can always restore it. That's why we're giving people the right to download the image.... [W]e're going to start with this protection tool solution that will allow people to click a button and download all the assets associated with their NFT collection and their wallet in the exact format that they would need it in to restore it back up to IPFS, should it ever disappear. And we're not going to charge any money for that."
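
A minimal sketch of the property Bailey is describing, under the simplifying assumption that the address is just a hash of the file's bytes (real IPFS identifiers carry more structure): identical bytes always produce the identical address, which is why a local copy of the original file is enough to restore the content behind an NFT's link.

```python
# Simplified view of content addressing (real IPFS CIDs encode more detail).
import hashlib

def content_address(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original_backup = b"...the collector's locally saved image bytes..."
reuploaded_copy = bytes(original_backup)      # identical bytes, uploaded later

# Same bytes, same address: the NFT's original link can resolve again once
# anyone re-hosts an exact copy of the file.
assert content_address(original_backup) == content_address(reuploaded_copy)
print(content_address(original_backup))
```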

The idea, he said, is that collectors should not have to trust any company; rather they can use ClubNFT's tool, whenever it becomes available, to download the files locally... "One of the things that we're doing early around that discovery process, we're building out a tool that looks in your wallet and can see who you collect, and then go a level deeper and see who they collect," Bailey said. Bailey said that the rest of the tools will follow once the company has gathered lessons from user feedback on the first set of solutions. He seemed confident, however, that discussion of the next set of tools will begin in the spring of next year, as the company has laid out a "general roadmap."

Power

Could Fusion Energy Provide a Safer Alternative to Nuclear Power? (thebulletin.org) 239

"One way to help eliminate carbon emissions and thereby fight global warming may be to exploit fusion, the energy source of the sun and stars..." argues a new article in Bulletin of the Atomic Scientists (shared by Slashdot reader DanDrollette).

Though fusion energy would involve controlling a "plasma" gas of positively charged nuclei and negatively charged electrons heated to 150 million degrees Celsius, progress is being made — and the upside could be tremendous: One major advantage of using fusion as an energy source is that its underlying physics precludes either a fuel meltdown — such as what happened at Three Mile Island and Fukushima Daiichi — or a runaway reaction, such as at Chernobyl. Furthermore, the amount of radioactive material that could be released in an accident in a fusion power plant system is much less than in a fission reactor. Consequently, a fusion system has much less capability to damage itself, and any damage would have much less dangerous consequences. As a result, current concepts for fusion systems may not necessitate an evacuation plan beyond the site boundary. Another advantage of fusion is that neither the fuel nor its products create the very long-lived radioactive waste that fission does, which means that fusion does not require long-term, geological storage...

When and how can fusion contribute to mitigating climate change? Private companies are in a hurry to develop fusion, and many say that they will be able to put commercial fusion power on the US electric grid in the early 2030s. The total private financing in this sector is impressive, at about $2 billion... After looking over the state of publicly and privately funded fusion research, the National Academies recommended that the United States embark upon a program to develop multiple preliminary designs for a fusion pilot plant by 2028, with the goals of putting a modest amount of net electricity on the U.S. electrical grid from a pilot plant sometime between 2035 and 2040, using the pilot plant to study and develop technologies for fusion, and having a first-of-a-kind commercial fusion power plant operational by 2050. The United Kingdom has recently announced a plan to build a prototype fusion power plant by 2040. China has a plan to begin operation of a fusion engineering test reactor in the 2030s, while the European Union foresees operation of a demonstration fusion power plant in the 2050s...

We must look beyond the 2035 timeframe to see how fusion can make a major contribution, and how it can complement renewables... [P]roviding low-carbon electricity in the world market, including later in the century, is of great importance for holding climate change at bay.

Piracy

'The NFT Bay' Shares Multi-Terabyte Archive of 'Pirated' NFTs (torrentfreak.com) 88

NFTs are unique blockchain entries through which people can prove that they own something. However, the underlying images can be copied with a single click. This point is illustrated by The NFT Bay, which links to a 19.5-terabyte collection of 'all NFTs' on the Ethereum and Solana blockchains. (UPDATE: One NFT startup is claiming that the collection is mostly just zeroes, and does not in fact contain all of the NFTs.)

But the archive also delivered an important warning message too. TorrentFreak reports: "The Billion Dollar Torrent," as it's called, reportedly includes all the NFTs on the Ethereum and Solana blockchains. These files are bundled in a massive torrent that points to roughly 15 terabytes of data. Unpacked, this adds up to almost 20 terabytes. Australian developer Geoff is the brains behind the platform, which he describes as an art project. Speaking with TorrentFreak, he says that The Pirate Bay was used as inspiration for nostalgic reasons, which needs further explanation.

The NFT Bay is not just any random art project. It does come with a message, perhaps a wake-up call, for people who jump on the NFT bandwagon without fully realizing what they're spending their crypto profits on. "Purchasing NFT art right now is nothing more than directions on how to access or download an image. The image is not stored on the blockchain and the majority of images I've seen are hosted on Web 2.0 storage which is likely to end up as 404 meaning the NFT has even less value." The same warning is more sharply articulated in the torrent's release notes which are styled in true pirate fashion. "[T]his handy torrent contains all of the NFT's so that future generations can study this generation's tulip mania and collectively go..." it reads.

EU

Advisor To EU's Top Court Suggests German Bulk Data Retention Law Isn't Legal (techcrunch.com) 15

The battle between the appetites of European Union Member States' governments to retain their citizens' data -- for fuzzy, catch-all 'security' purposes -- and the region's top court, the CJEU, which continues to defend fundamental rights by reiterating that indiscriminate mass surveillance is incompatible with general principles of EU law (such as proportionality and respect for privacy) -- has led to another pointed legal critique of national law on bulk data retention. From a report: This time it's a German data retention law that's earned the slap-down -- via a CJEU referral which joins a couple of cases involving ISPs SpaceNet and Telekom Deutschland, which are challenging the obligation to store their customers' telecommunications traffic data. The court's judgement is still pending, but an influential opinion put out today by an advisor to the CJEU takes the view that general and indiscriminate retention of traffic and location data can only be permitted exceptionally -- in relation to a threat to national security -- and that even then, data cannot be retained permanently. In a press release announcing the opinion of advocate general Manuel Campos Sanchez-Bordona, the court writes that the AG "considers that the answers to all the questions referred are already in the Court's case-law or can be inferred from them without difficulty"; going on to set out his view that the German law's "general and indiscriminate storage obligation" -- which covers "a very wide range of traffic and location data" -- cannot be reconciled with EU law by a time limit imposed on storage, as data is being sucked up in bulk, not in a targeted fashion (i.e. for a specific national security purpose).

Power

Bill Gates' TerraPower Will Set Up a $4 Billion Nuclear Plant In Wyoming (interestingengineering.com) 243

Hmmmmmm shares a report from Interesting Engineering: Founded by Bill Gates, TerraPower, a company that plans to use nuclear energy to deliver power in a sustainable manner, has selected Kemmerer, Wyoming, as a suitable site to demonstrate its advanced nuclear reactor, Natrium. The decision was made after extensive evaluation of the site and consultations with the local community, the company said in a press release.

Last year, the Department of Energy (DOE) awarded TerraPower an $80 million grant to demonstrate its technology. The advanced nuclear reactor being developed by the company in association with General Electric-Hitachi is a sodium-cooled fast reactor that works with a molten salt-based energy storage system. In June, the company decided to set up its demonstration plant in Wyoming, and it has recently sealed the decision by selecting the site of a coal-fired power plant that is scheduled to shut down by 2025, the press release said.

The demonstration plant, where the company plans to set up a 345 MW reactor, will be used to validate the design, construction, and operation of TerraPower's technology. Natrium technology uses uranium enriched to up to 20 percent, far higher than what is used by other nuclear reactors. However, nuclear energy supporters say that the technology creates less nuclear waste, Reuters reported. The energy storage system to be used in the plant is also designed to work with renewable sources of energy. TerraPower plans to utilize this capability and boost its output to up to 500 MW, enough to power 400,000 homes, the company said.
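
As a rough, hedged sanity check of the "power 400,000 homes" claim (a sketch; the typical-household consumption figure is an assumption, not from the report):

```python
# Back-of-envelope check, not TerraPower's own arithmetic.
boosted_output_mw = 500
homes = 400_000

avg_kw_per_home = boosted_output_mw * 1_000 / homes        # average draw
annual_mwh_per_home = avg_kw_per_home * 8_760 / 1_000      # over a full year

print(f"{avg_kw_per_home:.2f} kW average per home")         # ~1.25 kW
print(f"{annual_mwh_per_home:.1f} MWh per home per year")   # ~11 MWh, roughly a
# typical US household's annual electricity use (assumed to be ~10-11 MWh)
```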

Open Source

Penpot, the Vector Design Web-app Taking On Figma and Canva With FOSS, Hits Beta (penpot.app) 55

"It's Open Source. It's free," says a web page at Penpot.app.

Slashdot reader kxra writes: Penpot is a free-software, web-based vector design platform that uses .svg as a first-class file type and as the underlying storage format for all designs.

As more design teams around the world move to the convenience of multi-device synchronized and collaborative web apps, this is a welcome respite from proprietary vendor lock-in by the likes of Figma and Canva. Penpot has finally launched as Beta, with competitive features such as a template library that all creators can pull from.

It's created by Kaleidos Open Source, the same team behind Taiga, a project management tool for Agile teams that is taking on the likes of JIRA and Confluence with FLOSS.

"Not having a free & open source UX/UI tool that would make devs participate in the design process and bridge the gap between UX/UI and code was a terrible itch for us..." explains the FAQ at Penpot.app. But it also answers the question: why Open Source? Software Technology has the unique advantage, compared to other industries and intellectual property, of having almost zero cost to replicate itself, thus providing a wonderful chance to massively distribute the tools for a more digitally sovereign society. Besides the pure license aspect of it and its legal framework, Open Source fosters more engaging communities where the lines between user and contributor are often blurred...

Penpot requires a browser, that's it. If you want to host your own Penpot instance, that's fine too. We plan to release a native app bundle later this year.

There is a theme here. Universal access. That's why we love to call our product Penpot, there's nothing more personal and yet more universal than a pot full of pens. It's all about choice.

Its GitHub repository already has 5,200 stars and 41 contributors.

Data Storage

Seagate Unveils First Ever PCIe NVMe HDD (techradar.com) 70

An anonymous reader quotes a report from TechRadar: Seagate has unveiled the first ever hard disk drive (HDD) that utilizes both the NVMe protocol and a PCIe interface, which have historically been used for solid state drives (SSDs) exclusively. As explained in a company blog post, the proof-of-concept HDD is based on a proprietary controller that plays nice with all major protocols (SAS, SATA and NVMe), without requiring a bridge. The NVMe HDD was demoed at the Open Compute Project Summit in a custom JBOD enclosure, with twelve 3.5-inch drives hooked up via a PCIe interface. Although the capacity of the drive is unconfirmed, Seagate used images of the Exos X18 for the presentation, which has a maximum capacity of 18TB.

According to Seagate, there are a number of benefits to bringing the NVMe protocol to HDDs, such as reduced total cost of ownership (TCO), performance improvements, and energy savings. Further, by creating consistency across different types of storage device, NVMe HDDs could drastically simplify datacenter configurations. While current HDDs are nowhere near fast enough to make full use of the latest PCIe standards, technical advances could mean SATA and SAS interfaces are no longer sufficient in the future. At that point, PCIe NVMe HDDs may become the default. That said, it will take a number of years for these hard drives to enter the mainstream. Seagate says it expects the first samples to be made available to a small selection of customers in Autumn next year, while full commercial rollout is slated for 2024 at the earliest.

Privacy

Infrastructure Bill's Drunk Driving Tech Mandate Leaves Some Privacy Advocates Nervous (gizmodo.com) 138

An anonymous reader quotes a report from Gizmodo: The recently passed $1 trillion infrastructure package is jam-packed with initiatives, but sprinkled in there alongside $17 billion in funding for road safety programs is a mandate requiring carmakers to implement monitoring systems to identify and stop drunk drivers. The mandate, first noted by the Associated Press, could apply to new vehicles sold as early as 2026. For years, courts have ordered some drunk drivers to use breathalyzers attached to ignition interlocks to start their vehicles, but the technology noted in this bill would take that concept much further and would need to be capable of "passively monitor[ing] the performance of a driver of a motor vehicle to accurately identify whether that driver may be impaired."

Though the Department of Transportation has yet to put its foot down on the exact type of technology it will use for this program, the National Highway Traffic Safety Administration (NHTSA) and 17 automakers have been working on something called the Driver Alcohol Detection System for Safety (DADSS) since 2008. DADSS is exploring both a breath and touch-based system to detect whether or not a driver has a blood alcohol concentration (BAC) at or above 0.08%. The breath-based system aims to measure alcohol readings based on a driver's breath with the goal of distinguishing between the driver and passengers. The touch-based system meanwhile would shine an infrared light through a driver's fingertip to measure blood alcohol levels under the skin's surface. [...]

The new mandate struck a positive note with some car safety groups, including Mothers Against Drunk Driving which has advocated for more detection tech in the past. "It's monumental," Alex Otte, national president of Mothers Against Drunk Driving told the AP. Otte went on to describe the package as the "single most important legislation" in the group's history. At the same time though, the mandate has drawn concerns from safety experts and digital rights groups that warn driver monitoring technology could have knock-on privacy implications. In a letter sent last year by the American Highway Users Alliance, the organization urged support of the NHTSA's DADSS Research Program but expressed concerns that the technology could potentially infringe on driver's civil liberties.
"The group also expressed concerns over how the collection and storage of driver data would work and who would have the rights to that data," adds Gizmodo. Others have also expressed concerns over the accuracy of driving monitoring technology and potential risks of bias.

Earth

India Holds Back on Climate Pledge Until Rich Nations Pay $1 Trillion (bloomberg.com) 161

India has declined to update its official climate goal at the United Nations climate negotiations, holding out for rich countries to first offer $1 trillion in climate finance by the end of the decade. From a report: The resistance from India stands in contrast to its surprise announcement on Nov. 1, just as COP26 negotiations got underway, that it would set an ambitious new goal to reach net-zero emissions by 2070. Prime Minister Narendra Modi opened the talks in Glasgow, Scotland, with a decision to increase his nation's share of renewable electricity generation capacity alongside the long-term target to zero out carbon. At the same time, Modi demanded rich countries provide as much as $1 trillion in climate finance just for India -- far more than the $100 billion a year for all poor countries sought under previous deals. Until now, however, it wasn't clear whether India's demand came with a fixed timeline. Officials on Wednesday confirmed that India is seeking that sum by 2030 to fund the build-out of renewables and energy storage, decarbonization of the industrial sector, and the defense of infrastructure against a warming planet.

Microsoft

Microsoft's New $249 Surface Laptop SE is Its First True Chromebook Competitor (theverge.com) 26

Microsoft is going head to head with Chromebooks with a new $249 Surface Laptop SE, its most affordable Surface yet. While the software giant has attempted to compete with the popularity of Chrome OS in US schools for years, the Surface Laptop SE is the company's first true Chromebook competitor. From a report: Surface Laptop SE will be sold exclusively to schools and students, starting at $249. It's part of a much broader effort with Windows 11 SE, a new student edition designed to compete with Chrome OS that will ship on a range of low-cost laptops in the coming months. Surface Laptop SE is every bit the low-cost Windows device you'd expect to see for $249.

While it retains the same keyboard and trackpad found on Microsoft's Surface Laptop Go, the all-plastic body houses an 11.6-inch display running at just a 1366 x 768 resolution. This is the first 16:9 Surface device in more than seven years, after Microsoft switched to 3:2 for its Surface line with the Surface Pro 3 launch in 2014. The screen looks like the biggest drawback on this device, particularly as we weren't fans of the low-resolution screen (1536 x 1024) found on the $549 Surface Laptop Go. Lenovo's Chromebook Duet ships with a better 10.1-inch (1920 x 1200) display for the same $249 price as the Surface Laptop SE. Intel's Celeron N4020 or N4120 processors power the Surface Laptop SE, combined with 4GB or 8GB of RAM and 64GB or 128GB of eMMC storage.
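
A quick, hedged comparison of the displays mentioned above (straightforward geometry; the Surface Laptop Go's 12.4-inch panel size is an assumption, as the excerpt doesn't state it):

```python
# Pixel counts and approximate pixel densities for the displays discussed above.
import math

def pixels_and_ppi(w: int, h: int, diagonal_in: float) -> tuple[int, float]:
    return w * h, math.hypot(w, h) / diagonal_in

displays = [
    ("Surface Laptop SE", 1366, 768, 11.6),
    ("Surface Laptop Go", 1536, 1024, 12.4),   # 12.4-inch size is an assumption
    ("Lenovo Chromebook Duet", 1920, 1200, 10.1),
]
for name, w, h, diag in displays:
    px, ppi = pixels_and_ppi(w, h, diag)
    print(f"{name}: {px:,} pixels, ~{ppi:.0f} PPI")
```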

Data Storage

5D Optical Disc Could Store 500TB For Billions of Years (extremetech.com) 133

Researchers from the University of Southampton "have developed a fast and energy-efficient laser-writing method for producing high-density nanostructures in silica glass," reports Optica. "These tiny structures can be used for long-term five-dimensional (5D) optical data storage that is more than 10,000 times denser than Blu-ray optical disc storage technology." ExtremeTech reports: This type of data storage uses three layers of nanoscale dots in a glass disc. The size, orientation, and position (in three dimensions) of the dots give you the five "dimensions" used to encode data. Researchers say that a 5D disc could remain readable after 13.8 billion years, but it would be surprising if anyone was even around to read it at that point. In the shorter term, 5D optical media could also survive after being heated to 1,000 degrees Celsius.

The technique devised by doctoral researcher Yuhao Lei uses a femtosecond laser with a high repetition rate. The process starts with a seeding pulse that creates a nanovoid, but the fast pulse doesn't need to actually write any data. The repeated weak pulses leverage a phenomenon known as near-field enhancement to sculpt the nanostructures in a more gentle way. The researchers evaluated laser pulses at a variety of power levels, finding a level that sped up writing without damaging the silica glass disc. The study reported a maximum data rate of one million voxels per second, but each bit requires several voxels in 5D optical systems. That works out to a data rate of about 230 kilobytes per second. At that point, it becomes feasible to fill one of the discs, which have an estimated capacity of 500TB. It would take about two months to write this much data, after which it cannot be changed.
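
For context on those rates, a hedged back-of-envelope (a sketch; the suggestion that the two-month figure assumes heavily parallelized writing is an assumption, not something stated in the excerpt):

```python
# Back-of-envelope on the quoted numbers (decimal units assumed).
capacity_bytes = 500e12        # 500 TB disc
single_rate = 230e3            # ~230 kB/s sustained write rate quoted above

years_at_single_rate = capacity_bytes / single_rate / 86_400 / 365.25
print(f"~{years_at_single_rate:.0f} years to fill the disc at 230 kB/s")

# Filling it in roughly two months instead implies far higher aggregate
# throughput, presumably via parallelized writing (an assumption, not stated).
two_months_s = 60 * 86_400
print(f"~{capacity_bytes / two_months_s / 1e6:.0f} MB/s needed for a 2-month fill")
```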

This work is still in the early stages, but the team managed to write and retrieve 5GB of text data using a 5D optical medium. All you need to read the stored data is a microscope and polarizer, and it should be readable for eons.

The findings appear in Optica, Optica Publishing Group's journal for high-impact research.

Businesses

Dell Spins off $64 Billion VMware as it Battles Debt Hangover (arstechnica.com) 29

PC pioneer Michael Dell is set to cap his climb back to the top of the computing world on Monday with one of the largest corporate spin-offs. Dell Technologies will shed its 81 percent stake in publicly traded VMware, creating an independent software company with a stock market value of nearly $64 billion. Dell's remaining hardware operations have an implied value of $33 billion, based on its latest share price. From a report: The transaction, first disclosed in April, completes an eight-year saga in which the Texan entrepreneur turned his $3.8 billion interest in an out-of-favor PC maker into a personal stake in a broader data center hardware and software empire worth $40 billion. Beginning with the buyout of his PC company, Dell went on to devour server and storage company EMC for $67 billion before taking the group public again in 2018. Along the way, he fought heated battles with dissident shareholders over claims that he bought Dell on the cheap and used complex financial engineering in the EMC deal to short-change investors. Silver Lake, the Silicon Valley private equity group that helped mastermind the dealmaking, will be left with stakes in Dell and VMware worth $11 billion.

The Military

The US Is Installing New Power- and Accuracy-Increasing Sensors on Its Nuclear Weapons 147

"A sophisticated electronic sensor buried in hardened metal shells at the tip of a growing number of America's ballistic missiles reflects a significant achievement in weapons engineering that experts say could help pave the way for reductions in the size of the country's nuclear arsenal," reports the Washington Post, "but also might create new security perils." The wires, sensors, batteries and computing gear now being installed on hundreds of the most powerful U.S. warheads give them an enhanced ability to detonate with what the military considers exquisite timing over some of the world's most challenging targets, substantially increasing the probability that in the event of a major conflict, those targets would be destroyed in a radioactive rain of fire, heat and unearthly explosive pressures.

The new components — which determine and set the best height for a nuclear blast — are now being paired with other engineering enhancements that collectively increase what military planners refer to as the individual nuclear warheads' "hard-target kill capability." This gives them an improved ability to destroy Russian and Chinese nuclear-tipped missiles and command posts in hardened silos or mountain sanctuaries, or to obliterate military command and storage bunkers in North Korea, also considered a potential U.S. nuclear target.

The increased destructiveness of the warheads means that in some cases fewer weapons could be needed to ensure that all the objectives in the nation's nuclear targeting plans are fully met, opening a path to future shrinkage of the overall arsenal, current and former U.S. officials said in a number of interviews, in which some spoke on the condition of anonymity to discuss sensitive technology.

Production of the first of many high-yield nuclear warheads containing the gear, developed over the past decade at a cost of billions of dollars, was completed in July for installation on missiles aboard Navy submarines, the National Nuclear Security Administration announced.

The Post notes that the U.S. has now installed the technology on hundreds of submarine-based warheads, doubling their destructive power (according to estimates by a Georgetown professor).

The acting administrator of America's National Nuclear Security Administration called it "the culmination of over a decade of work."

Android

Google's India Smartphone With Custom Android OS Launches November 4 For $87 (techcrunch.com) 13

Google and top Indian telecom network Jio Platforms said on Friday that their much-anticipated budget smartphone, JioPhone Next, will go on sale in the world's second-largest smartphone market on November 4. From a report: The firms said the JioPhone Next will cost 6,499 Indian rupees ($87), and can also be purchased in multiple instalments with an entry price as low as $27. The smartphone runs Pragati OS, which is powered by an "extremely optimized" Android mobile operating system with a range of customized features. The two firms also revealed the specifications of the JioPhone Next. The smartphone features a 5.45-inch HD+ display with Corning Gorilla Glass 3 protection. It is powered by Qualcomm's quad-core QM-215 chipset that clocks up to 1.3GHz, coupled with 2GB of RAM and 32GB internal storage, which is expandable.

The Almighty Buck

Payments Company Stripe Is Kick-Starting Market For Carbon Removal 23

An anonymous reader quotes a report from The Wall Street Journal: Stripe is signing up to pay for carbon-removal technologies that haven't been invented yet. The payments company has formed a partnership with Deep Science Ventures, a London investment firm that specializes in building technology companies from the ground up. DSV will recruit scientists to develop ways to remove carbon dioxide from the atmosphere. If they come up with viable concepts, Stripe will be their first customer. It will pay DSV startups $500,000 each up front to capture and store carbon, then a further $1 million if they meet performance milestones.

The new partnership marks an expansion of Stripe's effort to provide a market for unproven technology that could potentially help limit the damage of global warming. The United Nations' scientific panel on climate change says the least-bad global-temperature scenarios depend on people removing billions of tons of planet-warming gases from the atmosphere. It also cautions that companies and governments may never be able to deploy the technology on the scale required to make that happen. Since August 2019, when it promised "to pay, at any available price, for the direct removal of carbon dioxide from the atmosphere and its sequestration in secure, long-term storage," Stripe has committed $9 million to 10 carbon-removal projects.

Stripe's carbon-removal procurement is led by Ryan Orbuch, who was a product manager before focusing on climate, and the team's projects are vetted by a panel of industry experts. Costs vary, with the most expensive service costing more than $2,000 per ton of carbon removed. Scalability is more important than current pricing: Stripe says technologies should have the potential to remove half a gigaton of carbon dioxide a year by 2050 at a cost of $100 per ton, and store it for at least 1,000 years. Stripe has tethered its core business of operating payment infrastructure to its side project. Stripe Climate, a tool introduced in October 2020, lets Stripe's customers divert a percentage of revenue to the carbon-removal pot. Roughly 9,000 of Stripe's millions of business users have enrolled, contributing nearly $3 million a year collectively, and roughly 8% of new Stripe users sign up [...].
