Power

MIT Engineers Produce the World's Longest Flexible Fiber Battery (mit.edu) 35

Researchers have developed a rechargeable lithium-ion battery in the form of an ultra-long fiber that could be woven into fabrics. From a report: In a proof of concept, the team behind the new battery technology has produced the world's longest flexible fiber battery, 140 meters long, to demonstrate that the material can be manufactured to arbitrarily long lengths. The work is described today in the journal Materials Today. [...] The new fiber battery is manufactured using novel battery gels and a standard fiber-drawing system that starts with a larger cylinder containing all the components and then heats it to just below its melting point. The material is drawn through a narrow opening to compress all the parts to a fraction of their original diameter, while maintaining the original arrangement of parts.

While others have attempted to make batteries in fiber form, [says MIT postdoc Tural Khudiyev, a lead author on the paper], those were structured with key materials on the outside of the fiber, whereas this system embeds the lithium and other materials inside the fiber, with a protective outside coating, making this version stable and waterproof. This is the first demonstration of a sub-kilometer fiber battery that is both sufficiently long and highly durable to have practical applications, he says. The fact that they were able to make a 140-meter fiber battery shows that "there's no obvious upper limit to the length. We could definitely do a kilometer-scale length," he says.

The 140-meter fiber produced so far has an energy storage capacity of 123 milliamp-hours, which can charge smartwatches or phones, he says. The fiber device is only a few hundred microns in thickness, thinner than any previous attempts to produce batteries in fiber form. In addition to individual one-dimensional fibers, which can be woven to produce two-dimensional fabrics, the material can also be used in 3D printing or custom-shape systems to create solid objects, such as casings that could provide both the structure of a device and its power source. To demonstrate this capability, a toy submarine was wrapped with the battery fiber to provide it with power. Incorporating the power source into the structure of such devices could lower the overall weight and so improve the efficiency and range they can achieve.

United States

The US Could Reliably Run On Clean Energy By 2050 (popsci.com) 214

An anonymous reader quotes a report from Popular Science: The Biden administration has pledged to create a carbon-free energy sector by 2035, but because renewable resources generated only around 19 percent of US electricity as of 2020, climate experts warn that our transition to a green grid future needs to speed up. A group of researchers at Stanford led by Mark Jacobson, professor of civil and environmental engineering, has set out to prove that a 100 percent renewable energy grid by 2050 is not only feasible, but can also be achieved without any blackouts and at a lower cost than the existing grid. Jacobson is the lead author of a new paper, published in Renewable Energy, which argues that a complete transition to renewable energy -- defined as wind, water, and solar energy -- would benefit the US as a whole and individuals by saving costs, creating jobs, and reducing air pollution and carbon emissions.

They modeled how wind turbines, tidal turbines, geothermal and hydroelectric power plants, rooftop and utility photovoltaic panels, and other sources could generate energy in 2050. A host of different sources powered these projections: Jacobson used data from a weather-climate-air pollution model he first built in 1990, which has been used in numerous simulations since. Individual state and sector energy consumption was taken from the Energy Information Administration. Current fossil fuel energy uses were converted to electric devices powered by wind, water, and solar, and this was then used to create projections for energy use in 2050. Time-dependent energy supply was matched with demand and storage in a grid integration model for every 30-second interval in 2050 and 2051. The study authors analyzed US regions and countrywide demand until the model produced a solution with what the authors called zero load loss -- meaning, essentially, no blackouts with 100 percent renewable energy and storage. According to Jacobson, no other study is conducting this kind of modeling, which is unique in part because it checks conditions for any simulation every 30 seconds.
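
For intuition, here is a toy sketch of what time-stepped supply-demand matching with storage looks like at 30-second resolution. This is not the authors' model (their simulations span 2050-2051 with real resource and weather data); every number below is invented for illustration.

    import random

    DT = 30 / 3600  # one 30-second interval, expressed in hours

    def simulate(hours: int = 24, storage_mwh: float = 400.0) -> int:
        """Toy grid-integration loop: match supply to demand each interval,
        buffering surpluses and deficits with storage. Returns the number of
        intervals with unmet load (the "load loss" a real model drives to zero)."""
        soc = storage_mwh / 2       # state of charge of aggregate storage (MWh)
        unmet = 0
        for _ in range(int(hours / DT)):
            supply = random.uniform(80, 120)   # MW; stand-in for wind/water/solar
            demand = random.uniform(70, 130)   # MW; stand-in for electrified load
            surplus_mwh = (supply - demand) * DT
            if surplus_mwh >= 0:
                soc = min(soc + surplus_mwh, storage_mwh)  # charge, spill any excess
            else:
                draw = min(-surplus_mwh, soc)              # discharge to cover deficit
                soc -= draw
                if draw < -surplus_mwh:
                    unmet += 1                             # a blackout interval
        return unmet

    print("intervals with load loss:", simulate())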

As the cost of renewables falls, researchers predict power companies and consumers will migrate to using renewables. Solar and wind are already half the cost of natural gas. Policy may also motivate adoption -- or hinder it. While the current administration has set out goals for a renewable energy grid, new permits for gas and drilling in the Gulf of Mexico counteract those same efforts. [...] The researchers quantified these benefits by looking at private costs, such as those to individuals or corporations, and social ones, which also include health and climate costs. Zero emissions leads to fewer air-pollution-related deaths and illnesses, and a reduced toll on the healthcare system. [...] The model cannot address emissions from things like long-distance shipping or aviation, though the authors argue that green hydrogen could be a possible alternative to explore. They did not include nuclear energy or carbon capture, which [Anna-Katharina von Krauland, a PhD candidate in the Atmosphere/Energy program at Stanford and a co-author of the paper] views as "distractions from getting to 100 percent renewable energy as quickly as possible" because the technologies are costly, unproven, or fall short of their promises. "The best path forward would be to invest in what we know works as quickly as we can," she says -- such as wind, water, and solar energy.

Security

The NCA Shares 585 Million Passwords With 'Have I Been Pwned' (therecord.media) 20

The UK National Crime Agency has shared a collection of more than 585 million compromised passwords it found during an investigation with Have I Been Pwned, a website that indexes data from security breaches. The Record reports: The NCA now becomes the second law enforcement agency to officially supply HIBP with hacked passwords, after the US Federal Bureau of Investigation began a similar collaboration with the service back in May. In a blog post today, HIBP creator Troy Hunt said that 225 million of the compromised passwords found by the NCA were new and unique.

These passwords have been added to a section of the HIBP website called Pwned Passwords. This section allows companies and system administrators to check and see if their current passwords have been compromised in hacks and if they are likely to be part of public lists used by threat actors in brute-force and password-spraying attacks. Currently, the HIBP Pwned Passwords collection includes 5.5 billion entries, of which 847 million are unique. All these passwords are also available as a free download, so companies can check their passwords against the data set locally without connecting to Hunt's service.
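
Pwned Passwords can be queried without ever sending the password (or even its full hash) to the service: the client hashes the password with SHA-1, sends only the first five hex characters, and compares the returned suffixes locally. A minimal sketch of that k-anonymity lookup (error handling omitted):

    import hashlib
    import urllib.request

    def pwned_count(password: str) -> int:
        """Return how many breaches contain this password, per Pwned Passwords."""
        digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
        prefix, suffix = digest[:5], digest[5:]
        # Only the 5-character prefix leaves the machine; the API returns every
        # known hash suffix in that range along with its breach count.
        url = f"https://api.pwnedpasswords.com/range/{prefix}"
        with urllib.request.urlopen(url) as resp:
            for line in resp.read().decode("utf-8").splitlines():
                candidate, _, count = line.partition(":")
                if candidate == suffix:
                    return int(count)
        return 0

    print(pwned_count("password123"))  # a well-known password; prints a large count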

In a statement shared by Hunt, the NCA said it found the compromised passwords, paired with email accounts, in an account at a UK cloud storage facility. The NCA said they weren't able to determine or attribute the compromised email and password combos to any specific platform or company.

Power

Imagining an All-Renewable Grid With No Blackouts -- and Without Long-Duration Batteries (stanford.edu) 227

Slashdot reader SoftwareArtist shares an announcement from a Stanford University institute for environmental studies. "For some, visions of a future powered by clean, renewable energy are clouded by fears of blackouts driven by intermittent electricity supplies," the announcement begins.

"Those fears are misplaced, according to a new Stanford University study that analyzes grid stability under multiple scenarios in which wind, water and solar energy resources power 100% of U.S. energy needs for all purposes." "This study is the first to examine grid stability in all U.S. grid regions and many individual states after electrifying all energy and providing the electricity with only energy that is both clean and renewable," said study lead author Mark Z. Jacobson, a professor of civil and environmental engineering at Stanford... Imagine all cars and trucks were powered with electric motors or hydrogen fuel cells, electric heat pumps replaced gas furnaces and water heaters and wind turbines and solar panels replaced coal and natural gas power plants. The study envisions those and many more transitions in place across the electricity, transportation, buildings and industrial sectors in the years 2050 and 2051...

Interconnecting larger and larger geographic regions made power supply smoother and costs lower, because it upped the chances of wind, sun and hydro power being available and reduced the need for extra wind turbines, solar panels and batteries. A significant finding of the study was that long-duration batteries were neither needed nor helpful for keeping the grid stable. Instead, grid stability could be obtained by linking together currently available batteries with storage durations of four hours or less. Linking together short-duration batteries can provide long-term storage when they are used in succession. They can also be discharged simultaneously to meet heavy peaks in demand for short periods. In other words, short-duration batteries can cover both big peaks in demand for short periods and lower peaks over a long period, or anything in between.
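
To make the arithmetic behind that concrete (our illustration with made-up numbers, not figures from the study): three 4-hour batteries rated at 100 MW each can be discharged one after another to supply 100 MW for 12 hours, or together to supply 300 MW for 4 hours.

    # Illustrative numbers only: three identical short-duration batteries.
    POWER_MW = 100          # rated discharge power of each battery
    DURATION_H = 4          # storage duration of each battery
    COUNT = 3

    energy_mwh = POWER_MW * DURATION_H * COUNT   # total stored energy: 1200 MWh

    # Used in succession: long, low output (long-duration behavior).
    print(f"in succession: {POWER_MW} MW for {energy_mwh / POWER_MW:.0f} h")
    # Used simultaneously: short, tall output (peak-shaving behavior).
    print(f"in parallel:   {POWER_MW * COUNT} MW for {DURATION_H} h")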

Other findings:
  • Cleaner air would spare about 53,200 people from pollution-related deaths every year. It would also spare millions more from pollution-related illnesses. Total estimated health costs saved each year: $700 billion.
  • Building and operating a completely renewable grid may create 4.7 million long-term jobs.
  • Per capita household energy costs were nearly 63% less.
  • New electricity generators would occupy about 0.84% of U.S. land (versus roughly 1.3% occupied today by the fossil fuel industry).

Google

Google Drive Could Soon Start Locking Your Files (techradar.com) 76

Google has announced a new policy for cloud storage service Drive, which will soon begin to restrict access to files deemed to be in violation of the company's policies. TechRadar reports: As explained in a new blog post, Google will take active steps to identify files hosted on its platform that are in breach of either its Terms of Service or abuse program policies. These files will be flagged to their owner and restricted automatically, which means they can no longer be shared with other people, and access will be withdrawn from everyone but the owner. "This will help ensure owners of Google Drive items are fully informed about the status of their content, while also helping to ensure users are protected from abusive content," the company explained.

According to Google, the motive behind the policy change is to shield against the abuse of its services. This broad catchall encompasses cybercriminal activity (like malware hosting, phishing etc.), hate speech, and content that might endanger children, but also sexually explicit material. "We need to curb abuses that threaten our ability to provide these services, and we ask that everyone abide by [our policies] to help us achieve this goal," states Google in its policy document. "After we are notified of a potential policy violation, we may review the content and take action, including restricting access to the content, removing the content, and limiting or terminating a user's access to Google products."
Google goes on to say that it may make "exceptions based on artistic, educational, documentary or scientific considerations." As noted by TechRadar, "there is a system to request a review of a decision if someone feels a file has been restricted unfairly, but it's unclear how the process will be handled on Google's end and how long it might take."

Power

Metaverse Vision Requires 1000x More Computational Power, Says Intel (intel.com) 79

Leading chip-maker Intel has stressed that building the metaverse -- at scale and accessible by billions of humans in real time -- will require a 1,000-times increase in computational efficiency from what we have today. Insider reports: Raja Koduri, a senior vice president and head of Intel's Accelerated Computing Systems and Graphics Group, said that our computing, storage and networking infrastructure today is simply not enough to enable this metaverse vision being popularized by Meta (formerly Facebook) and other companies. "We need several orders of magnitude more powerful computing capability, accessible at much lower latencies across a multitude of device form factors," Koduri said in a blog post. To enable these capabilities at scale, the entire plumbing of the internet will need major upgrades, he added.

Japan

Telecom Data Storage Locations Will Soon Be Public In Japan (theregister.com) 4

An anonymous reader quotes a report from The Register: Social media and search engine operators in Japan will be required to specify the countries in which users' data is physically stored, under a planned tweak to local laws. The Ministry of Internal Affairs and Communications this week announced it plans to submit the revision to the Telecommunications Business Law early next year. The amendment, if passed, requires search engines, social media operators and mobile phone companies with over 10 million Japanese users to disclose where in the world they store data, and identify any foreign subcontractors that can access the data. The proposed law applies to overseas companies that operate in Japan -- meaning the likes of Twitter and Facebook will need to disclose their storage choices publicly. Oddly, search engines that just cover travel and food get a pass and don't have to comply. "The move is in part a reaction to Japan's hugely popular homegrown freeware instant communication app, LINE, which had several recent snafus related to data storage and protection," the report adds.

In March, the Japanese government said it was investigating LINE's parent company after a local newspaper reported that engineers at one of the app's Chinese contractors accessed the messages and personal details of LINE users. And just a couple of weeks ago, the company announced that around 133,000 users' payment details were mistakenly published on GitHub between September and November of this year.

Android

Android 12 Go Edition Brings New Speed, Battery, Privacy Features To Lower-end Phones (cnet.com) 10

Google's Pixel 6 line may have served as Android 12's big debut for higher-end phones, but Android 12 (Go edition) plans to bring many of the enhancements and features of Android 12 to lower-end phones, too. Google on Tuesday unveiled a host of new features for the Go edition that are set to roll out to devices in 2022. From a report: Google says that in addition to speed enhancements that'll help apps launch up to 30% faster, Android 12 (Go edition) will include a feature that'll save battery life and storage by automatically "hibernating apps that haven't been used for extended periods of time." And with the Files Go app, you'll be able to recover files within 30 days of deletion. Android 12 (Go edition) will also help you easily translate any content, listen to the news and share apps with nearby devices offline to save data, Google says. The company said Android Go has amassed 200 million users.

Microsoft

Microsoft Launches Center for Reporting Malicious Drivers (therecord.media) 27

Microsoft has launched this week a special web portal where users and researchers can report malicious drivers to the company's security team. From a report: The new Vulnerable and Malicious Driver Reporting Center is basically a web form that allows users to upload a copy of a malicious driver, which gets uploaded and analyzed by a Microsoft automated scanner. At a technical level, Microsoft says this automated scanner can identify techniques that are commonly abused by malicious drivers, such as:
  • Drivers with the ability to map arbitrary kernel, physical, or device memory to user mode.
  • Drivers with the ability to read or write arbitrary kernel, physical, or device memory, including Port I/O and central processing unit (CPU) registers, from user mode.
  • Drivers that provide access to storage that bypasses Windows access control.

Earth

Earth is Getting a Black Box To Record Events that Lead To the Downfall of Civilization (cnet.com) 120

An indestructible "black box" is set to be built upon a granite plain on the west coast of Tasmania, Australia, in early 2022. Its mission: Record "every step we take" toward climate catastrophe, providing a record for future civilizations to understand what caused our demise, according to the Australian Broadcasting Corporation. From a report: The project, led by marketing communications company Clemenger BBDO in collaboration with University of Tasmania researchers, is currently in beta and has already begun collecting information at its website. The structure is designed to be about the size of a city bus, made of 3-inch-thick steel and topped with solar panels. Its interior will be filled with "storage drives" that gather climate change-related data such as atmospheric carbon dioxide levels and average temperatures. In addition, using an algorithm, it will scour the web for tweets, posts, news and headlines.

The developers estimate that storage will run out in 30 to 50 years, according to the ABC. There are plans to increase the storage capacity and provide a more long-term solution, but it's unclear how the structure will be maintained -- how its solar panels might be replaced before the end of civilization, how well those drives hold up after decades and how impervious the vault will be to vandalism or sabotage. Its remote location, around four hours from the closest major city, is one deterrent -- but will that be enough?

Networking

Comcast Reduced 'Working Latency' By 90% with AQM. Is This the Future? (apnic.net) 119

Long-time Slashdot reader mtaht writes: Comcast fully deployed bufferbloat fixes across their entire network over the past year, demonstrating 90% improvements in working latency and jitter — which is described in this article by Comcast's Vice President of Technology Policy & Standards. (The article's Cumulative Distribution Function chart is to die for...) But: did anybody notice? Did any other ISPs adopt AQM tech? How many of y'all out there are running smart queue management (sch_cake in linux) nowadays?
But wait — it gets even more interesting...

The Comcast official anticipates even less latency with the newest Wi-Fi 6E standard. (And for home users, the article links to a page recommending "a router whose manufacturer understands the principles of bufferbloat, and has updated the firmware to use one of the Smart Queue Management algorithms such as cake, fq_codel, PIE.")
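
"Working latency" is latency measured while the link is actually carrying traffic, which is where bufferbloat shows up; an idle ping can look fine on a badly bloated link. As a rough sketch of the idea (the host is a placeholder, and a real test would saturate the link between the two measurements):

    import socket
    import statistics
    import time

    def tcp_connect_ms(host: str, port: int = 443, samples: int = 10) -> float:
        """Median TCP connect time in milliseconds -- a crude RTT proxy."""
        times = []
        for _ in range(samples):
            start = time.perf_counter()
            with socket.create_connection((host, port), timeout=2):
                pass
            times.append((time.perf_counter() - start) * 1000)
        return statistics.median(times)

    idle = tcp_connect_ms("example.com")
    # ... start a large upload/download here to load the link, then re-measure ...
    loaded = tcp_connect_ms("example.com")
    print(f"idle {idle:.1f} ms, loaded {loaded:.1f} ms, bloat {loaded - idle:.1f} ms")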

But then the Comcast VP looks to the future, and where all of this is leading: Currently under discussion at the IETF in the Transport Area Working Group is a proposal for Low Latency, Low Loss Scalable Throughput. This potential approach to achieve very low latency may result in working latencies of roughly one millisecond (though perhaps 1-5 milliseconds initially). As the IETF sorts out the best technical path forward through experimentation and consensus-building (including debate of alternatives), in a few years we may see the beginning of a shift to sub-5 millisecond working latency. This seems likely to not only improve the quality of experience of existing applications but also create a network foundation on which entirely new classes of applications will be built.

While we can certainly think of usable augmented and virtual reality (AR and VR), these are applications we know about today. But what happens when the time to access resources on the Internet is the same as, or close to, the time to access local compute or storage resources? What if the core assumption that developers make about networks — that there is an unpredictable and variable delay — goes away? This is a central assumption embedded into the design of more or less all existing applications. So, if that assumption changes, then we can potentially rethink the design of many applications, and all sorts of new applications will become possible. That is a big deal, and it is exciting to think about the possibilities!

In a few years, when most people have 1 Gbps, 10 Gbps, or eventually 100 Gbps connections in their home, it is perhaps easy to imagine that connection speed is not the only key factor in your performance. We're perhaps entering an era where consistently low working latency will become the next big thing that differentiates various Internet access services and application services/platforms. Beyond that, factors like exceptionally high uptime, proactive/adaptive security, dynamic privacy protection, and other new things will likely also play a role. But keep an eye on working latency — there's a lot of exciting things happening!

Science

The Coronavirus in a Tiny Drop (nytimes.com) 63

To better understand the coronavirus's journey from one person to another, a team of 50 scientists has for the first time created an atomic simulation of the coronavirus nestled in a tiny airborne drop of water. From a report: To create the model, the researchers needed one of the world's biggest supercomputers to assemble 1.3 billion atoms and track all their movements down to less than a millionth of a second. This computational tour de force is offering an unprecedented glimpse at how the virus survives in the open air as it spreads to a new host. "Putting a virus in a drop of water has never been done before," said Rommie Amaro, a biologist at the University of California San Diego who led the effort, which was unveiled at the International Conference for High Performance Computing, Networking, Storage and Analysis last month. "People have literally never seen what this looks like."

How the coronavirus spreads through the air became the subject of fierce debate early in the pandemic. Many scientists championed the traditional view that most of the virus's transmission was made possible by larger drops, often produced in coughs and sneezes. Those droplets can travel only a few feet before falling to the floor. But epidemiological studies showed that people with Covid-19 could infect others at a much greater distance. Even just talking without masks in a poorly ventilated indoor space like a bar, church or classroom was enough to spread the virus. Those findings pointed to much smaller drops, called aerosols, as important vehicles of infection. Scientists define droplets as having a diameter greater than 100 micrometers, or about 4 thousandths of an inch. Aerosols are smaller -- in some cases so small that only a single virus can fit inside them. And thanks to their minuscule size, aerosols can drift in the air for hours.

Space

The Largest Comet We've Ever Seen Just Delivered a Curious Surprise (sciencealert.com) 18

schwit1 shares a report from ScienceAlert: The comet Bernardinelli-Bernstein (BB) -- the largest our telescopes have ever spotted -- is on a journey from the outer reaches of our Solar System that will see it flying relatively close to Saturn's orbit. Now, a new analysis of the data we've collected on BB has revealed something rather surprising. Digging into readings logged by the Transient Exoplanet Survey Satellite (TESS) between 2018 and 2020, researchers have discovered that BB became active much earlier, and much farther out from the Sun, than was previously thought.

A comet becomes active when light from the Sun heats its icy surface, turning ice to vapor and releasing trapped dust and grit. The resulting haze, called a coma, can be useful for astronomers in working out exactly what a particular comet is made out of. In the case of BB, it's still too far out for water to sublimate. Based on studies of comets at similar distances, it's likely that the emerging fog is driven instead by a slow release of carbon monoxide. Only one active comet has previously been directly observed at a greater distance from the Sun, and it was much smaller than BB.
"These observations are pushing the distances for active comets dramatically farther than we have previously known," says astronomer Tony Farnham, from the University of Maryland (UMD). "We make the assumption that comet BB was probably active even farther out, but we just didn't see it before this. What we don't know yet is if there's some cut-off point where we can start to see these things in cold storage before they become active."

The research has been published in the Planetary Science Journal.

Data Storage

Microsoft Makes Breakthrough In the Quest To Use DNA As Data Storage (gizmodo.com) 43

An anonymous reader quotes a report from Gizmodo: Microsoft, one of the pioneers of DNA storage, is making some headway, working with the University of Washington's Molecular Information Systems Laboratory, or MISL. The company announced in a new research paper the first nanoscale DNA storage writer, which the research group expects to scale for a DNA write density of 25 x 10^6 sequences per square centimeter, or "three orders of magnitude" (1,000x) more tightly than before. What makes this particularly significant is that it's the first indication of achieving the minimum write speeds required for DNA storage.

Microsoft is one of the biggest players in cloud storage and is looking at DNA data storage to gain an advantage over the competition by using its unparalleled density, sustainability, and shelf life. DNA is said to have a density capable of storing one exabyte, or 1 billion gigabytes, per square inch -- an amount many orders of magnitude larger than what our current best storage method, Linear Tape-Open (LTO) magnetic tape, can provide. What do these advantages mean in real-world terms? Well, the International Data Corporation predicts data storage demands will reach nine zettabytes by 2024. As Microsoft notes, only one zettabyte of storage would be used if Windows 11 were downloaded on 15 billion devices. Using current methods, that data would need to be stored on millions of tape cartridges. Cut the tape and use DNA, and nine zettabytes of information can be stored in an area as small as a refrigerator (some scientists say every movie ever released could fit in the footprint of a sugar cube). But perhaps a freezer would be a better analogy, because data stored on DNA can last for thousands of years, whereas data loss occurs on tape within 30 years and even sooner on SSDs and HDDs.
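
The article's numbers are easy to sanity-check with back-of-envelope arithmetic (the 18 TB figure is our assumption for the native capacity of a current LTO-9 cartridge, not a number from the article):

    ZB = 10**21  # one zettabyte, in bytes

    demand = 9 * ZB               # IDC's projected storage demand by 2024
    devices = 15e9                # Windows 11 downloads in Microsoft's example
    lto9_bytes = 18e12            # assumed native capacity of one LTO-9 cartridge

    print(f"implied Windows 11 size: {ZB / devices / 1e9:.0f} GB per device")
    print(f"cartridges for 1 ZB: {ZB / lto9_bytes / 1e6:.0f} million")
    print(f"cartridges for 9 ZB: {demand / lto9_bytes / 1e6:.0f} million")

Under those assumptions, one zettabyte alone fills roughly 56 million cartridges, which squares with the article's "millions of tape cartridges."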

Finding ways to increase write speeds addresses one of the two main problems with DNA storage (the other being cost). With the minimum write speed threshold within grasp, Microsoft is already pushing ahead with the next phase. "A natural next step is to embed digital logic in the chip to allow individual control of millions of electrode spots to write kilobytes per second of data in DNA, and we foresee the technology reaching arrays containing billions of electrodes capable of storing megabytes per second of data in DNA. This will bring DNA data storage performance and cost significantly closer to tape," Microsoft told TechRadar.

United States

Wanted: A Town Willing to Host a Dump for U.S. Nuclear Waste (bloomberg.com) 335

The Biden administration is looking for communities willing to serve as temporary homes for tens of thousands of metric tons of nuclear waste currently stranded at power plants around the country. Bloomberg reports: The Energy Department filed (PDF) a public notice Tuesday that it is restarting the process for finding a voluntary host for spent nuclear fuel until a permanent location is identified. "Hearing from and then working with communities interested in hosting one of these facilities is the best way to finally solve the nation's spent nuclear fuel management issues," Energy Secretary Jennifer Granholm said in a statement. The agency, in its notice, requested input on how to proceed with a "consent-based" process for a federal nuclear storage facility, including what benefits could entice local and state governments and how to address potential impediments. Federal funding is also possible, the notice said. Approximately 89,000 metric tons of nuclear waste is being stored at dozens of nuclear power plants and other sites around the country.
[...]
One such interim storage site could be in Andrews, Texas. The Nuclear Regulatory Commission in September approved a license for a proposal by Orano CIS LLC and its joint venture partner, J.F. Lehman & Co.'s Waste Control Specialists LLC, to establish a repository in the heart of Texas' Permian Basin oil fields for as many as 40,000 metric tons of radioactive waste. The joint venture envisioned having nuclear waste shipped by rail from around the country and sealed in concrete casks where it would be stored above ground at a site about 30 miles (48.28 kilometers) from Andrews. But the plan has drawn opposition from Texas authorities and local officials who once embraced it as an economic benefit but have since had a change of heart. A similar nuclear waste storage project, proposed in New Mexico by Holtec International Corp., is awaiting approval by the Nuclear Regulatory Commission. The agency said it expects to make a decision on that proposal in January 2022.

Earth

The World Needs To Crack Battery Recycling, Fast (wired.co.uk) 97

As batteries start to pile up, carmakers, battery companies and researchers are trying to save them from ending up in landfills. From a report: Recyclers are primarily interested in extracting the valuable metals and minerals in the cells. Getting to these materials is complex and dangerous: After removing the steel casing, the battery pack needs to be unbundled into cells carefully, to avoid puncturing any hazardous materials. The electrolyte, a liquid whose job it is to move lithium ions between the cathode and anode, can catch fire or even explode if heated. Only once the pack has been dismantled can recyclers safely extract the conductive lithium, nickel, copper, and cobalt.

Used in the cathode, cobalt is the most sought-after material used in batteries. In its raw form, the rare, bluish-grey metal is predominantly sourced from the Democratic Republic of Congo, where miners work in perilous conditions. The world's major electric car manufacturers are already moving away from cobalt, deterred by the human rights abuses and shortages in the supply chain. That raises the question of whether recyclers will still find it worthwhile to dismantle newer battery types lacking the most valuable ingredients. "When you move to more sustainable materials, and lower cost materials, the incentive to recycle and recover them diminishes," says Jenny Baker, an energy storage expert at Swansea University. She likens this to a dilemma in consumer electronics: It is often cheaper to buy a new mobile phone than to get it fixed or recycled.

[...] In a first step, recyclers typically shred the cathode and anode materials of spent batteries into a powdery mixture, the so-called black mass. In the board game analogy (the article likens recycling to a game of snakes and ladders), this would be the first slide down a snake, Gavin Harper, a research fellow at the University of Birmingham, explains. The black mass can then be processed in one of two ways to extract its valuable components. One method, called pyrometallurgy, involves smelting the black mass in a furnace powered with fossil fuels. It's a relatively cheap method, but a lot of lithium, aluminium, graphite and manganese is lost in the process. Another method, hydrometallurgy, leaches the metals out of the black mass by dissolving it in acids and other solvents. This method, Harper says, would correspond to a shorter snake in the board game, because more material can be recovered: you fall back, but not by as many squares as when using pyrometallurgy. The process, however, consumes a lot of energy and produces toxic gases and wastewater.

Piracy

Is 'The NFT Bay' Just a Giant Hoax? (clubnft.com) 74

Recently Australian developer Geoffrey Huntley announced they'd created a 20-terabyte archive of all NFTs on the Ethereum and Solana blockchains.

But one NFT startup company now says they tried downloading the archive — and discovered most of it was zeroes. Many of the articles are careful to point out "we have not verified the contents of the torrent," because of course they couldn't. A 20TB torrent would take several days to download, necessitating a pretty beefy internet connection and more disk space to store than most people have at their disposal. We at ClubNFT fired up a massive AWS instance with 40TB of EBS disk space to attempt to download this, with a cost estimate of $10k-20k over the next month, as we saw this torrent as potentially an easy way to pre-seed our NFT storage efforts — not many people have these resources to devote to a single news story.

Fortunately, we can save you the trouble of downloading the entire torrent — all you need is about 10GB. Download the first 10GB of the torrent, plus the last block, and you can fill in all the rest with zeroes. In other words, it's empty; and no, Geoff did not actually download all the NFTs. Ironically, Geoff has archived all of the media articles about this and linked them on TheNFTBay's site, presumably to preserve an immutable record of the spread and success of his campaign — kinda like an NFT...
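
For anyone who does grab a few pieces of the torrent, the zero-fill claim is straightforward to spot-check locally. A hypothetical helper (names and parameters are ours) that samples random regions of a downloaded file and reports how much of it is zero bytes:

    import os
    import random

    def zero_fraction(path: str, samples: int = 1000, chunk: int = 4096) -> float:
        """Sample random chunks of a file; return the fraction that are all zeroes."""
        size = os.path.getsize(path)
        zero_chunks = 0
        with open(path, "rb") as f:
            for _ in range(samples):
                f.seek(random.randrange(max(size - chunk, 1)))
                data = f.read(chunk)
                if data and data.count(0) == len(data):
                    zero_chunks += 1
        return zero_chunks / samples

    # e.g. zero_fraction("nft-archive.part") near 1.0 means mostly zero padding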

We were hoping this was real... [I]t is actually rather complicated to correctly download and secure the media for even a single NFT, nevermind trying to do it for every NFT ever made. This is why we were initially skeptical of Geoff's statements. But even if he had actually downloaded all the NFT media and made it available as a torrent, this would not have solved the problem... a torrent containing all the NFTs does nothing to actually make those NFTs available via IPFS, which is the network they must be present on in order for the NFTs to be visible on marketplaces and galleries....

[A]nd this is a bit in the weeds: in order to reupload an NFT's media to IPFS, you need more than just the media itself. In order to restore a file to IPFS so it can continue to be located by the original link embedded in the NFT, you must know exactly the settings used when that file was originally uploaded, and potentially even the exact version of the IPFS software used for the upload.

For these reasons and more, ClubNFT is working hard on an actual solution to ensure that everybody's NFTs can be safely secured by the collectors themselves. We look forward to providing more educational resources on these and other topics, and welcome the attention that others, like Geoff, bring to these important issues.

Their article was shared by a Slashdot reader (who is one of ClubNFT's three founders). I'd wondered suspiciously if ClubNFT was a hoax, but if this PR Newswire press release is legit, they've raised $3 million in seed funding. (And that does include an investment from Draper Dragon, co-founded by Tim Draper, which shows up on CrunchBase). The International Business Times has also covered ClubNFT, identifying it as a startup whose mission statement is "to build the next generation of NFT solutions to help collectors discover, protect, and share digital assets." Co-founder and CEO Jason Bailey said these next-generation tools are in their "discovery" phase, and one of the first sets of tools, designed to provide a backup solution for NFTs, will roll out early next year. Speaking to International Business Times, Bailey said, "We are looking at early 2022 to roll out the backup solution. But between now and then we should be feeding (1,500 beta testers) valuable information about their wallets." Bailey says while doing the beta testing, he realized that there are loopholes in NFT storage systems: only 40% of the NFTs were actually pointing to IPFS, while another 40% were at risk — pointing to private servers.

Here is the problem explained: NFTs are basically a collection of metadata that define the underlying property that is owned. Just like in the world of internet documents, links point to the art and any details about it that are being stored. But links can break, or die. Many NFTs use a system called InterPlanetary File System, or IPFS, which lets you find a piece of content as long as it is hosted somewhere on the IPFS network. Unlike in the world of internet domains, you don't need to own the domain to make sure the data is safe. Explaining the problem which the backup tool will address, Bailey said, "When you upload an image to IPFS, it creates a cryptographic hash. And if someone ever stops paying to store that image on IPFS, as long as you have the original image, you can always restore it. That's why we're giving people the right to download the image.... [W]e're going to start with this protection tool solution that will allow people to click a button and download all the assets associated with their NFT collection and their wallet in the exact format that they would need it in to restore it back up to IPFS, should it ever disappear. And we're not going to charge any money for that."

The idea, he said, is that collectors should not have to trust any company; rather they can use ClubNFT's tool, whenever it becomes available, to download the files locally... "One of the things that we're doing early around that discovery process, we're building out a tool that looks in your wallet and can see who you collect, and then go a level deeper and see who they collect," Bailey said. Bailey said that work on the rest of the tools will proceed after gathering lessons from user feedback on the first set of solutions. He, however, seemed positive that talk of the next set of tools will begin in the spring of next year, as the company has laid out a "general roadmap."

Power

Could Fusion Energy Provide a Safer Alternative to Nuclear Power? (thebulletin.org) 239

"One way to help eliminate carbon emissions and thereby fight global warming may be to exploit fusion, the energy source of the sun and stars..." argues a new article in Bulletin of the Atomic Scientists (shared by Slashdot reader DanDrollette).

Though fusion energy would involve controlling a "plasma" gas of positively charged nuclei and negatively charged electrons heated to 150 million degrees Celsius, progress is being made — and the upside could be tremendous: One major advantage of using fusion as an energy source is that its underlying physics precludes either a fuel meltdown — such as what happened at Three Mile Island and Fukushima Daiichi — or a runaway reaction, such as at Chernobyl. Furthermore, the amount of radioactive material that could be released in an accident in a fusion power plant system is much less than in a fission reactor. Consequently, a fusion system has much less capability to damage itself, and any damage would have much less dangerous consequences. As a result, current concepts for fusion systems may not necessitate an evacuation plan beyond the site boundary. Another advantage of fusion is that neither the fuel nor its products create the very long-lived radioactive waste that fission does, which means that fusion does not require long-term, geological storage...

When and how can fusion contribute to mitigating climate change? Private companies are in a hurry to develop fusion, and many say that they will be able to put commercial fusion power on the US electric grid in the early 2030s. The total private financing in this sector is impressive, at about $2 billion... After looking over the state of publicly and privately funded fusion research, the National Academies recommended that the United States embark upon a program to develop multiple preliminary designs for a fusion pilot plant by 2028, with the goal of putting a modest amount of net electricity on the U.S. electrical grid from a pilot plant starting sometime in the years between 2035 and 2040, use the pilot plant to study and develop technologies for fusion, and have a first-of-a-kind commercial fusion power plant operational by 2050. The United Kingdom has recently announced a plan to build a prototype fusion power plant by 2040. China has a plan to begin operation of a fusion engineering test reactor in the 2030s, while the European Union foresees operation of a demonstration fusion power plant in the 2050s...

We must look beyond the 2035 timeframe to see how fusion can make a major contribution, and how it can complement renewables... [P]roviding low-carbon electricity in the world market, including later in the century, is of great importance for holding climate change at bay.

Piracy

'The NFT Bay' Shares Multi-Terabyte Archive of 'Pirated' NFTs (torrentfreak.com) 88

NFTs are unique blockchain entries through which people can prove that they own something. However, the underlying images can be copied with a single click. This point is illustrated by The NFT Bay which links to a 19.5 Terabyte collection of 'all NFTs' on the Ethereum and Solana blockchains. (UPDATE: One NFT startup is claiming that the collection is mostly just zeroes, and does not in fact contain all of the NFTs.)

But the archive also delivered an important warning message too. TorrentFreak reports: "The Billion Dollar Torrent," as it's called, reportedly includes all the NFTs on the Ethereum and Solana blockchains. These files are bundled in a massive torrent that points to roughly 15 terabytes of data. Unpacked, this adds up to almost 20 terabytes. Australian developer Geoff is the brains behind the platform, which he describes as an art project. Speaking with TorrentFreak, he says that The Pirate Bay was used as inspiration for nostalgic reasons, which needs further explanation.

The NFT Bay is not just any random art project. It does come with a message, perhaps a wake-up call, for people who jump on the NFT bandwagon without fully realizing what they're spending their crypto profits on. "Purchasing NFT art right now is nothing more than directions on how to access or download an image. The image is not stored on the blockchain and the majority of images I've seen are hosted on Web 2.0 storage which is likely to end up as 404 meaning the NFT has even less value." The same warning is more sharply articulated in the torrent's release notes which are styled in true pirate fashion. "[T]his handy torrent contains all of the NFT's so that future generations can study this generation's tulip mania and collectively go..." it reads.

EU

Advisor To EU's Top Court Suggests German Bulk Data Retention Law Isn't Legal (techcrunch.com) 15

The battle between the appetites of European Union Member States' governments to retain their citizens' data -- for fuzzy, catch-all 'security' purposes -- and the region's top court, the CJEU, which continues to defend fundamental rights by reiterating that indiscriminate mass surveillance is incompatible with general principles of EU law (such as proportionality and respect for privacy), has led to another pointed legal critique of national law on bulk data retention. From a report: This time it's a German data retention law that's earned the slap-down -- via a CJEU referral which joins a couple of cases, involving ISPs SpaceNet and Telekom Deutschland, which are challenging the obligation to store their customers' telecommunications traffic data. The court's judgement is still pending, but an influential opinion put out today by an advisor to the CJEU takes the view that general and indiscriminate retention of traffic and location data can only be permitted exceptionally -- in relation to a threat to national security -- nor can data be retained permanently. In a press release announcing the opinion of advocate general Manuel Campos Sanchez-Bordona, the court writes that the AG "considers that the answers to all the questions referred are already in the Court's case-law or can be inferred from them without difficulty"; going on to set out his view that the German law's "general and indiscriminate storage obligation" -- which covers "a very wide range of traffic and location data" -- cannot be reconciled with EU law by a time limit imposed on storage, as data is being sucked up in bulk, not in a targeted fashion (i.e. for a specific national security purpose).
