Japan

Telecom Data Storage Locations Will Soon Be Public In Japan (theregister.com) 4

An anonymous reader quotes a report from The Register: Social media and search engine operators in Japan will be required to specify the countries in which users' data is physically stored, under a planned tweak to local laws. The Ministry of Internal Affairs and Communications this week announced it plans to submit the revision to the Telecommunications Business Law early next year. The amendment, if passed, would require search engines, social media operators, and mobile phone companies with over 10 million Japanese users to disclose where in the world they store user data, and to identify any foreign subcontractors that can access that data. The proposed law applies to overseas companies that operate in Japan -- meaning the likes of Twitter and Facebook would need to disclose their storage choices publicly. Oddly, search engines that just cover travel and food get a pass and don't have to comply. "The move is in part a reaction to Japan's hugely popular homegrown freeware instant communication app, LINE, which had several recent snafus related to data storage and protection," the report adds.

In March, the Japanese government said it was investigating LINE's parent company after a local newspaper reported that engineers at one of the app's Chinese contractors accessed the messages and personal details of LINE users. And just a couple of weeks ago, the company announced that around 133,000 users' payment details were mistakenly published on GitHub between September and November of this year.
Android

Android 12 Go Edition Brings New Speed, Battery, Privacy Features To Lower-end Phones (cnet.com) 10

Google's Pixel 6 line may have served as Android 12's big debut for higher-end phones, but Android 12 (Go edition) plans to bring many of the enhancements and features of Android 12 to lower-end phones, too. Google on Tuesday unveiled a host of new features for the Go edition that are set to roll out to devices in 2022. From a report: Google says that in addition to speed enhancements that'll help apps launch up to 30% faster, Android 12 (Go edition) will include a feature that'll save battery life and storage by automatically "hibernating apps that haven't been used for extended periods of time." And with the Files Go app, you'll be able to recover files within 30 days of deletion. Android 12 (Go edition) will also help you easily translate any content, listen to the news and share apps with nearby devices offline to save data, Google says. The company said Android Go has amassed 200 million users.
Microsoft

Microsoft Launches Center for Reporting Malicious Drivers (therecord.media) 27

Microsoft this week launched a special web portal where users and researchers can report malicious drivers to the company's security team. From a report: The new Vulnerable and Malicious Driver Reporting Center is basically a web form that allows users to upload a copy of a suspect driver, which is then analyzed by an automated Microsoft scanner. At a technical level, Microsoft says this automated scanner can identify techniques that are commonly abused by malicious drivers, such as:
Drivers with the ability to map arbitrary kernel, physical, or device memory to user mode.
Drivers with the ability to read or write arbitrary kernel, physical, or device memory, including Port I/O and central processing unit (CPU) registers from user mode.
Drivers that provide access to storage in a way that bypasses Windows access control.
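For illustration, here is a rough sketch of the kind of static check such a scanner might run: flagging imports of kernel routines commonly abused to map or touch physical memory from user mode. The routine list is illustrative only (not Microsoft's actual detection criteria), and the sketch assumes the third-party pefile library is installed:

    import sys
    import pefile  # third-party: pip install pefile

    # Kernel routines often seen in vulnerable or malicious drivers
    # (an illustrative list, not Microsoft's criteria)
    SUSPICIOUS = {b"MmMapIoSpace", b"ZwMapViewOfSection", b"MmGetPhysicalAddress"}

    pe = pefile.PE(sys.argv[1])  # path to the driver binary under test
    for entry in getattr(pe, "DIRECTORY_ENTRY_IMPORT", []):
        for imp in entry.imports:
            if imp.name and imp.name in SUSPICIOUS:
                print(f"suspicious import: {imp.name.decode()} from {entry.dll.decode()}")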

Earth

Earth is Getting a Black Box To Record Events that Lead To the Downfall of Civilization (cnet.com) 120

An indestructible "black box" is set to be built upon a granite plain on the west coast of Tasmania, Australia, in early 2022. Its mission: Record "every step we take" toward climate catastrophe, providing a record for future civilizations to understand what caused our demise, according to the Australian Broadcasting Corporation. From a report: The project, led by marketing communications company Clemenger BBDO in collaboration with University of Tasmania researchers, is currently in beta and has already begun collecting information at its website. The structure is designed to be about the size of a city bus, made of 3-inch-thick steel and topped with solar panels. Its interior will be filled with "storage drives" that gather climate change-related data such as atmospheric carbon dioxide levels and average temperatures. In addition, using an algorithm, it will scour the web for tweets, posts, news and headlines.

The developers estimate that storage will run out in 30 to 50 years, according to the ABC. There are plans to increase the storage capacity and provide a longer-term solution, but it's unclear how the structure will be maintained -- how its solar panels might be replaced before the end of civilization, how well those drives will hold up after decades, and how impervious the vault will be to vandalism or sabotage. Its remote location, around four hours from the closest major city, is one deterrent -- but will that be enough?

Networking

Comcast Reduced 'Working Latency' By 90% with AQM. Is This the Future? (apnic.net) 119

Long-time Slashdot reader mtaht writes: Comcast fully deployed bufferbloat fixes across their entire network over the past year, demonstrating 90% improvements in working latency and jitter — which is described in this article by Comcast's Vice President of Technology Policy & Standards. (The article's Cumulative Distribution Function chart is to die for...) But: did anybody notice? Did any other ISPs adopt AQM tech? How many of y'all out there are running smart queue management (sch_cake in Linux) nowadays?
But wait — it gets even more interesting...

The Comcast official anticipates even less latency with the newest Wi-Fi 6E standard. (And for home users, the article links to a page recommending "a router whose manufacturer understands the principles of bufferbloat, and has updated the firmware to use one of the Smart Queue Management algorithms such as cake, fq_codel, PIE.")
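"Working latency" is delay measured while the link is actually busy, not idle -- the number bufferbloat inflates. To get a feel for it on your own connection, you can time small round trips before and during a bulk transfer. A minimal sketch, with a placeholder probe host and download URL:

    import socket, threading, time, urllib.request

    HOST = "example.com"                        # placeholder probe target
    BULK_URL = "http://example.com/large-file"  # placeholder bulk download

    def probe(n=20):
        # Time TCP handshakes as a rough latency proxy; return the median in ms.
        samples = []
        for _ in range(n):
            t0 = time.perf_counter()
            socket.create_connection((HOST, 443), timeout=5).close()
            samples.append((time.perf_counter() - t0) * 1000)
            time.sleep(0.1)
        return sorted(samples)[len(samples) // 2]

    def saturate(stop):
        # Keep the link busy until told to stop.
        while not stop.is_set():
            try:
                urllib.request.urlopen(BULK_URL, timeout=10).read()
            except OSError:
                time.sleep(1)  # don't spin if the placeholder URL fails

    idle = probe()
    stop = threading.Event()
    threading.Thread(target=saturate, args=(stop,), daemon=True).start()
    working = probe()
    stop.set()
    print(f"idle ~{idle:.1f} ms, working ~{working:.1f} ms")

On a bufferbloated link the two numbers diverge sharply; with cake or fq_codel at the bottleneck they stay close.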

But then the Comcast VP looks to the future, and where all of this is leading: Currently under discussion at the IETF in the Transport Area Working Group is a proposal for Low Latency, Low Loss Scalable Throughput. This potential approach to achieve very low latency may result in working latencies of roughly one millisecond (though perhaps 1-5 milliseconds initially). As the IETF sorts out the best technical path forward through experimentation and consensus-building (including debate of alternatives), in a few years we may see the beginning of a shift to sub-5 millisecond working latency. This seems likely to not only improve the quality of experience of existing applications but also create a network foundation on which entirely new classes of applications will be built.

While we can certainly think of usable augmented and virtual reality (AR and VR), these are applications we know about today. But what happens when the time to access resources on the Internet is the same, or close to the time to access local compute or storage resources? What if the core assumption that developers make about networks — that there is an unpredictable and variable delay — goes away? This is a central assumption embedded into the design of more or less all existing applications. So, if that assumption changes, then we can potentially rethink the design of many applications and all sorts of new applications will become possible. That is a big deal and exciting to think about the possibilities!

In a few years, when most people have 1 Gbps, 10 Gbps, or eventually 100 Gbps connections in their home, it is perhaps easy to imagine that connection speed is not the only key factor in your performance. We're perhaps entering an era where consistently low working latency will become the next big thing that differentiates various Internet access services and application services/platforms. Beyond that, factors like exceptionally high uptime, proactive/adaptive security, dynamic privacy protection, and other new things will likely also play a role. But keep an eye on working latency — there's a lot of exciting things happening!

Science

The Coronavirus in a Tiny Drop (nytimes.com) 63

To better understand the coronavirus's journey from one person to another, a team of 50 scientists has for the first time created an atomic simulation of the coronavirus nestled in a tiny airborne drop of water. From a report: To create the model, the researchers needed one of the world's biggest supercomputers to assemble 1.3 billion atoms and track all their movements down to less than a millionth of a second. This computational tour de force is offering an unprecedented glimpse at how the virus survives in the open air as it spreads to a new host. "Putting a virus in a drop of water has never been done before," said Rommie Amaro, a biologist at the University of California San Diego who led the effort, which was unveiled at the International Conference for High Performance Computing, Networking, Storage and Analysis last month. "People have literally never seen what this looks like."

How the coronavirus spreads through the air became the subject of fierce debate early in the pandemic. Many scientists championed the traditional view that most of the virus's transmission was made possible by larger drops, often produced in coughs and sneezes. Those droplets can travel only a few feet before falling to the floor. But epidemiological studies showed that people with Covid-19 could infect others at a much greater distance. Even just talking without masks in a poorly ventilated indoor space like a bar, church or classroom was enough to spread the virus. Those findings pointed to much smaller drops, called aerosols, as important vehicles of infection. Scientists define droplets as having a diameter greater than 100 micrometers, or about 4 thousandths of an inch. Aerosols are smaller -- in some cases so small that only a single virus can fit inside them. And thanks to their minuscule size, aerosols can drift in the air for hours.

Space

The Largest Comet We've Ever Seen Just Delivered a Curious Surprise (sciencealert.com) 18

schwit1 shares a report from ScienceAlert: The comet Bernardinelli-Bernstein (BB) -- the largest our telescopes have ever spotted -- is on a journey from the outer reaches of our Solar System that will see it flying relatively close to Saturn's orbit. Now, a new analysis of the data we've collected on BB has revealed something rather surprising. Digging into readings logged by the Transient Exoplanet Survey Satellite (TESS) between 2018 and 2020, researchers have discovered that BB became active much earlier, and much farther out from the Sun, than was previously thought.

A comet becomes active when light from the Sun heats its icy surface, turning ice to vapor and releasing trapped dust and grit. The resulting haze, called a coma, can be useful for astronomers in working out exactly what a particular comet is made out of. In the case of BB, it's still too far out for water to sublimate. Based on studies of comets at similar distances, it's likely that the emerging fog is driven instead by a slow release of carbon monoxide. Only one active comet has previously been directly observed at a greater distance from the Sun, and it was much smaller than BB.
"These observations are pushing the distances for active comets dramatically farther than we have previously known," says astronomer Tony Farnham, from the University of Maryland (UMD). "We make the assumption that comet BB was probably active even farther out, but we just didn't see it before this. What we don't know yet is if there's some cut-off point where we can start to see these things in cold storage before they become active."

The research has been published in the Planetary Science Journal.
Data Storage

Microsoft Makes Breakthrough In the Quest To Use DNA As Data Storage (gizmodo.com) 43

An anonymous reader quotes a report from Gizmodo: Microsoft, one of the pioneers of DNA storage, is making some headway, working with the University of Washington's Molecular Information Systems Laboratory, or MISL. The company announced in a new research paper the first nanoscale DNA storage writer, which the research group expects to scale to a DNA write density of 25 x 10^6 sequences per square centimeter, or "three orders of magnitude" (1,000x) tighter than before. What makes this particularly significant is that it's the first indication of achieving the minimum write speeds required for DNA storage.

Microsoft is one of the biggest players in cloud storage and is looking at DNA data storage to gain an advantage over the competition by using its unparalleled density, sustainability, and shelf life. DNA is said to have a density capable of storing one exabyte, or 1 billion gigabytes, per square inch -- an amount many orders of magnitude larger than what our current best storage method, Linear Tape-Open (LTO) magnetic tape, can provide. What do these advantages mean in real-world terms? Well, the International Data Corporation predicts data storage demands will reach nine zettabytes by 2024. As Microsoft notes, only one zettabyte of storage would be used if Windows 11 were downloaded on 15 billion devices. Using current methods, that data would need to be stored on millions of tape cartridges. Cut the tape and use DNA, and nine zettabytes of information can be stored in an area as small as a refrigerator (some scientists say every movie ever released could fit in the footprint of a sugar cube). But perhaps a freezer would be a better analogy, because data stored on DNA can last for thousands of years, whereas data loss occurs on tape within 30 years and even sooner on SSDs and HDDs.
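Those figures are easy to sanity-check with back-of-the-envelope arithmetic drawn from the article's own numbers:

    ZB, EB, GB = 10**21, 10**18, 10**9  # decimal units, in bytes

    # "one zettabyte ... if Windows 11 were downloaded on 15 billion devices"
    print(f"implied download size: ~{ZB / 15e9 / GB:.0f} GB per device")  # ~67 GB

    # nine zettabytes at the claimed DNA density of ~1 EB per square inch
    sq_in = 9 * ZB / EB
    print(f"area for 9 ZB: {sq_in:,.0f} sq in (~{sq_in / 144:.0f} sq ft)")  # 9,000 sq in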

Finding ways to increase write speeds addresses one of the two main problems with DNA storage (the other being cost). With the minimum write speed threshold within grasp, Microsoft is already pushing ahead with the next phase. "A natural next step is to embed digital logic in the chip to allow individual control of millions of electrode spots to write kilobytes per second of data in DNA, and we foresee the technology reaching arrays containing billions of electrodes capable of storing megabytes per second of data in DNA. This will bring DNA data storage performance and cost significantly closer to tape," Microsoft told TechRadar.

United States

Wanted: A Town Willing to Host a Dump for U.S. Nuclear Waste (bloomberg.com) 335

The Biden administration is looking for communities willing to serve as temporary homes for tens of thousands of metric tons of nuclear waste currently stranded at power plants around the country. Bloomberg reports: The Energy Department filed (PDF) a public notice Tuesday that it is restarting the process for finding a voluntary host for spent nuclear fuel until a permanent location is identified. "Hearing from and then working with communities interested in hosting one of these facilities is the best way to finally solve the nation's spent nuclear fuel management issues," Energy Secretary Jennifer Granholm said in a statement. The agency, in its notice, requested input on how to proceed with a "consent-based" process for a federal nuclear storage facility, including what benefits could entice local and state governments and how to address potential impediments. Federal funding is also possible, the notice said. Approximately 89,000 metric tons of nuclear waste is being stored at dozens of nuclear power plants and other sites around the country.
[...]
One such interim storage site could be in Andrews, Texas. The Nuclear Regulatory Commission in September approved a license for a proposal by Orano CIS LLC and its joint venture partner, J.F. Lehman & Co.'s Waste Control Specialists LLC, to establish a repository in the heart of Texas' Permian Basin oil fields for as many as 40,000 metric tons of radioactive waste. The joint venture envisioned having nuclear waste shipped by rail from around the country and sealed in concrete casks where it would be stored above ground at a site about 30 miles (48.28 kilometers) from Andrews. But the plan has drawn opposition from Texas authorities and local officials who once embraced it as an economic benefit but have since had a change of heart. A similar nuclear waste storage project, proposed in New Mexico by Holtec International Corp., is awaiting approval by the Nuclear Regulatory Commission. The agency said it expects to make a decision on that proposal in January 2022.

Earth

The World Needs To Crack Battery Recycling, Fast (wired.co.uk) 97

As batteries start to pile up, carmakers, battery companies and researchers are trying to save them from ending up in landfills. From a report: Recyclers are primarily interested in extracting the valuable metals and minerals in the cells. Getting to these materials is complex and dangerous: After removing the steel casing, the battery pack needs to be unbundled into cells carefully, to avoid puncturing any hazardous materials. The electrolyte, a liquid whose job it is to move lithium ions between the cathode and anode, can catch fire or even explode if heated. Only once the pack has been dismantled can recyclers safely extract the conductive lithium, nickel, copper, and cobalt.

Used in the cathode, cobalt is the most sought-after material used in batteries. In its raw form, the rare, bluish-grey metal is predominantly sourced from the Democratic Republic of Congo, where miners work in perilous conditions. The world's major electric car manufacturers are already moving away from cobalt, deterred by the human rights abuses and shortages in the supply chain. That raises the question of whether recyclers will still find it worthwhile to dismantle newer battery types lacking the most valuable ingredients. "When you move to more sustainable materials, and lower cost materials, the incentive to recycle and recover them diminishes," says Jenny Baker, an energy storage expert at Swansea University. She likens this to a dilemma in consumer electronics: It is often cheaper to buy a new mobile phone than to get it fixed or recycled.

[...] In a first step, recyclers typically shred the cathode and anode materials of spent batteries into a powdery mixture, the so-called black mass. In the board game analogy, this would be the first slide down on a snake, Gavin Harper, a research fellow at the University of Birmingham, explains. The black mass can then be processed in one of two ways to extract its valuable components. One method, called pyrometallurgy, involves smelting the black mass in a furnace powered with fossil fuels. It's a relatively cheap method but a lot of lithium, aluminium, graphite and manganese is lost in the process. Another method, hydrometallurgy, leaches the metals out of the black mass by dissolving it in acids and other solvents. This method, Harper says, would correspond to a shorter snake in the board game, because more material can be recovered: you fall back, but not by as many squares as when using pyrometallurgy. The process, however, consumes a lot of energy and produces toxic gases and wastewater.

Piracy

Is 'The NFT Bay' Just a Giant Hoax? (clubnft.com) 74

Recently Australian developer Geoffrey Huntley announced they'd created a 20-terabyte archive of all NFTs on the Ethereum and Solana blockchains.

But one NFT startup company now says they tried downloading the archive — and discovered most of it was zeroes. Many of the articles are careful to point out "we have not verified the contents of the torrent," because of course they couldn't. A 20TB torrent would take several days to download, necessitating a pretty beefy internet connection and more disk space to store than most people have at their disposal. We at ClubNFT fired up a massive AWS instance with 40TB of EBS disk space to attempt to download this, with a cost estimate of $10k-20k over the next month, as we saw this torrent as potentially an easy way to pre-seed our NFT storage efforts — not many people have these resources to devote to a single news story.

Fortunately, we can save you the trouble of downloading the entire torrent — all you need is about 10GB. Download the first 10GB of the torrent, plus the last block, and you can fill in all the rest with zeroes. In other words, it's empty; and no, Geoff did not actually download all the NFTs. Ironically, Geoff has archived all of the media articles about this and linked them on TheNFTBay's site, presumably to preserve an immutable record of the spread and success of his campaign — kinda like an NFT...
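The spot-check ClubNFT describes is straightforward to reproduce: given a downloaded slice of the payload, confirming that a region is all zeroes takes only a seek and a scan. A minimal sketch (the file path and offsets are placeholders):

    import sys

    def all_zero(path, start, length, chunk=1 << 20):
        # True if bytes [start, start + length) are zero (or the file ends early).
        with open(path, "rb") as f:
            f.seek(start)
            remaining = length
            while remaining > 0:
                block = f.read(min(chunk, remaining))
                if not block:
                    break  # reached end of file
                if block.strip(b"\x00"):  # any nonzero byte survives the strip
                    return False
                remaining -= len(block)
        return True

    # e.g. check 1 GB starting at the 10 GB mark of a downloaded slice
    print(all_zero(sys.argv[1], start=10 * 10**9, length=10**9))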

We were hoping this was real... [I]t is actually rather complicated to correctly download and secure the media for even a single NFT, nevermind trying to do it for every NFT ever made. This is why we were initially skeptical of Geoff's statements. But even if he had actually downloaded all the NFT media and made it available as a torrent, this would not have solved the problem... a torrent containing all the NFTs does nothing to actually make those NFTs available via IPFS, which is the network they must be present on in order for the NFTs to be visible on marketplaces and galleries....

[A]nd this is a bit in the weeds: in order to reupload an NFT's media to IPFS, you need more than just the media itself. In order to restore a file to IPFS so it can continue to be located by the original link embedded in the NFT, you must know exactly the settings used when that file was originally uploaded, and potentially even the exact version of the IPFS software used for the upload.
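That chunking dependency is easy to demonstrate: IPFS derives its content identifiers from a Merkle tree built over fixed-size chunks, so identical bytes chunked differently yield different identifiers. Here's a toy model using plain SHA-256 as a stand-in for the real CID construction:

    import hashlib

    def toy_cid(data: bytes, chunk_size: int) -> str:
        # Hash each chunk, then hash the concatenated chunk hashes (Merkle-style).
        chunk_hashes = [
            hashlib.sha256(data[i:i + chunk_size]).digest()
            for i in range(0, len(data), chunk_size)
        ]
        return hashlib.sha256(b"".join(chunk_hashes)).hexdigest()

    data = b"the same NFT image bytes " * 100_000
    print(toy_cid(data, 256 * 1024))   # one chunk size...
    print(toy_cid(data, 1024 * 1024))  # ...another: a different identifier

Restore a file with the wrong settings and the resulting identifier won't match the link embedded in the NFT, even though the bytes are identical.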

For these reasons and more, ClubNFT is working hard on an actual solution to ensure that everybody's NFTs can be safely secured by the collectors themselves. We look forward to providing more educational resources on these and other topics, and welcome the attention that others, like Geoff, bring to these important issues.

Their article was shared by a Slashdot reader (who is one of ClubNFT's three founders). I'd wondered suspiciously if ClubNFT was a hoax, but if this PR Newswire press release is legit, they've raised $3 million in seed funding. (And that does include an investment from Draper Dragon, co-founded by Tim Draper, which shows up on CrunchBase). The International Business Times has also covered ClubNFT, identifying it as a startup whose mission statement is "to build the next generation of NFT solutions to help collectors discover, protect, and share digital assets." Co-founder and CEO Jason Bailey said these next-generation tools are in their "discovery" phase, and the first of them, designed to provide a backup solution for NFTs, will roll out early next year. Speaking to International Business Times, Bailey said, "We are looking at early 2022 to roll out the backup solution. But between now and then we should be feeding (1,500 beta testers) valuable information about their wallets." Bailey says that while doing the beta testing he realized there are loopholes in NFT storage systems: only 40% of the NFTs were actually pointing to IPFS, while another 40% were at risk, pointing to private servers.

Here is the problem explained: NFTs are basically a collection of metadata that defines the underlying property that is owned. Just like in the world of internet documents, links point to the art and any details about it that are being stored. But links can break, or die. Many NFTs use a system called the InterPlanetary File System, or IPFS, which lets you find a piece of content as long as it is hosted somewhere on the IPFS network. Unlike in the world of internet domains, you don't need to own the domain to really make sure the data is safe. Explaining the problem which the backup tool will address, Bailey said, "When you upload an image to IPFS, it creates a cryptographic hash. And if someone ever stops paying to store that image on IPFS, as long as you have the original image, you can always restore it. That's why we're giving people the right to download the image.... [W]e're going to start with this protection tool solution that will allow people to click a button and download all the assets associated with their NFT collection and their wallet in the exact format that they would need it in to restore it back up to IPFS, should it ever disappear. And we're not going to charge any money for that."

The idea, he said, is that collectors should not have to trust any company; rather they can use ClubNFT's tool, whenever it becomes available, to download the files locally... "One of the things that we're doing early around that discovery process, we're building out a tool that looks in your wallet and can see who you collect, and then go a level deeper and see who they collect," Bailey said. The rest of the tools, Bailey said, will follow once the company has gathered lessons from user feedback on the first set of solutions. He seemed confident, however, that work on the next set of tools will begin in the spring of next year, as the company has laid out a "general roadmap."

Power

Could Fusion Energy Provide a Safer Alternative to Nuclear Power? (thebulletin.org) 239

"One way to help eliminate carbon emissions and thereby fight global warming may be to exploit fusion, the energy source of the sun and stars..." argues a new article in Bulletin of the Atomic Scientists (shared by Slashdot reader DanDrollette).

Though fusion energy would involve controlling a "plasma" gas of positively charged nuclei and negatively charged electrons heated to 150 million degrees Celsius, progress is being made — and the upside could be tremendous: One major advantage of using fusion as an energy source is that its underlying physics precludes either a fuel meltdown — such as what happened at Three Mile Island and Fukushima Daiichi — or a runaway reaction, such as at Chernobyl. Furthermore, the amount of radioactive material that could be released in an accident in a fusion power plant system is much less than in a fission reactor. Consequently, a fusion system has much less capability to damage itself, and any damage would have much less dangerous consequences. As a result, current concepts for fusion systems may not necessitate an evacuation plan beyond the site boundary. Another advantage of fusion is that neither the fuel nor its products create the very long-lived radioactive waste that fission does, which means that fusion does not require long-term, geological storage...

When and how can fusion contribute to mitigating climate change? Private companies are in a hurry to develop fusion, and many say that they will be able to put commercial fusion power on the US electric grid in the early 2030s. The total private financing in this sector is impressive, at about $2 billion... After looking over the state of publicly and privately funded fusion research, the National Academies recommended that the United States embark upon a program to develop multiple preliminary designs for a fusion pilot plant by 2028, with the goal of putting a modest amount of net electricity on the U.S. electrical grid from a pilot plant starting sometime in the years between 2035 and 2040, use the pilot plant to study and develop technologies for fusion, and have a first-of-a-kind commercial fusion power plant operational by 2050. The United Kingdom has recently announced a plan to build a prototype fusion power plant by 2040. China has a plan to begin operation of a fusion engineering test reactor in the 2030s, while the European Union foresees operation of a demonstration fusion power plant in the 2050s...

We must look beyond the 2035 timeframe to see how fusion can make a major contribution, and how it can complement renewables... [P]roviding low-carbon electricity in the world market, including later in the century, is of great importance for holding climate change at bay.

Piracy

'The NFT Bay' Shares Multi-Terabyte Archive of 'Pirated' NFTs (torrentfreak.com) 88

NFTs are unique blockchain entries through which people can prove that they own something. However, the underlying images can be copied with a single click. This point is illustrated by The NFT Bay, which links to a 19.5 Terabyte collection of 'all NFTs' on the Ethereum and Solana blockchains. (UPDATE: One NFT startup is claiming that the collection is mostly just zeroes, and does not in fact contain all of the NFTs.)

But the archive also delivered an important warning message too. TorrentFreak reports: "The Billion Dollar Torrent," as it's called, reportedly includes all the NFTs on the Ethereum and Solana blockchains. These files are bundled in a massive torrent that points to roughly 15 terabytes of data. Unpacked, this adds up to almost 20 terabytes. Australian developer Geoff is the brains behind the platform, which he describes as an art project. Speaking with TorrentFreak, he says that The Pirate Bay was used as inspiration for nostalgic reasons, which needs further explanation.

The NFT Bay is not just any random art project. It does come with a message, perhaps a wake-up call, for people who jump on the NFT bandwagon without fully realizing what they're spending their crypto profits on. "Purchasing NFT art right now is nothing more than directions on how to access or download an image. The image is not stored on the blockchain and the majority of images I've seen are hosted on Web 2.0 storage which is likely to end up as 404 meaning the NFT has even less value." The same warning is more sharply articulated in the torrent's release notes which are styled in true pirate fashion. "[T]his handy torrent contains all of the NFT's so that future generations can study this generation's tulip mania and collectively go..." it reads.

EU

Advisor To EU's Top Court Suggests German Bulk Data Retention Law Isn't Legal (techcrunch.com) 15

The battle between the appetites of European Union Member States' governments to retain their citizens' data -- for fuzzy, catch-all 'security' purposes -- and the region's top court, the CJEU, which continues to defend fundamental rights by reiterating that indiscriminate mass surveillance is incompatible with general principles of EU law (such as proportionality and respect for privacy), has led to another pointed legal critique of national law on bulk data retention. From a report: This time it's a German data retention law that's earned the slap-down -- via a CJEU referral joining a couple of cases involving ISPs SpaceNet and Telekom Deutschland, which are challenging the obligation to store their customers' telecommunications traffic data. The court's judgement is still pending, but an influential opinion put out today by an advisor to the CJEU takes the view that general and indiscriminate retention of traffic and location data can only be permitted exceptionally -- in relation to a threat to national security -- and even then the data cannot be retained permanently. In a press release announcing the opinion of advocate general Manuel Campos Sanchez-Bordona, the court writes that the AG "considers that the answers to all the questions referred are already in the Court's case-law or can be inferred from them without difficulty"; going on to set out his view that the German law's "general and indiscriminate storage obligation" -- which covers "a very wide range of traffic and location data" -- cannot be reconciled with EU law merely by imposing a time limit on storage, since the data is being collected in bulk, not in a targeted fashion (i.e. for a specific national security purpose).
Power

Bill Gates' TerraPower Will Set Up a $4 Billion Nuclear Plant In Wyoming (interestingengineering.com) 243

Hmmmmmm shares a report from Interesting Engineering: Founded by Bill Gates, TerraPower, a company that plans to use nuclear energy to deliver power in a sustainable manner, has selected Kemmerer, Wyoming as a suitable site to demonstrate its advanced nuclear reactor, Natrium. The decision was made after extensive evaluation of the site and consultations with the local community, the company said in a press release.

Last year, the Department of Energy (DOE) awarded TerraPower a grant of $80 million to demonstrate its technology. The advanced reactor being developed by the company in association with General Electric-Hitachi is a sodium-cooled fast reactor that works with a molten salt-based energy storage system. Earlier in June, the company decided to set up its demonstration plant in Wyoming, and it has now sealed the decision by selecting the site of a coal-fired power plant that is scheduled to shut down by 2025, the press release said.

The demonstration plant, where the company plans to set up a 345 MW reactor, will be used to validate the design, construction, and operation of TerraPower's technology. Natrium technology uses uranium enriched to up to 20 percent, far higher than what is used in conventional nuclear reactors. However, nuclear energy supporters say that the technology creates less nuclear waste, Reuters reported. The energy storage system to be used in the plant is also designed to work with renewable sources of energy. TerraPower plans to utilize this capability and boost its output to up to 500 MW, enough to power 400,000 homes, the company said.
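As a quick plausibility check on those last figures (the household-draw comparison below is a commonly cited US average, not from the article):

    # 500 MW spread across 400,000 homes
    print(f"{500e6 / 400_000:,.0f} W per home")  # 1,250 W, roughly the
    # ~1.2 kW average continuous draw of a US household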

Open Source

Penpot, the Vector Design Web-app Taking On Figma and Canva With FOSS, Hits Beta (penpot.app) 55

"It's Open Source. It's free," says a web page at Penpot.app.

Slashdot reader kxra writes: Penpot is a free-software, web-based vector design platform using .svg as a first-class filetype used as the underlying storage for all designs.

As more design teams around the world move to the convenience of multi-device, synchronized and collaborative web apps, this is a welcome respite from proprietary vendor lock-in by the likes of Figma and Canva. Penpot has finally launched in beta, with competitive features such as a template library that all creators can pull from.

It's created by Kaleidos Open Source, the same team behind Taiga, the project management tool for Agile teams that is taking on the likes of JIRA and Confluence with FLOSS.
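Because designs are stored as standard SVG, any off-the-shelf XML tooling can inspect a Penpot file -- no proprietary parser required. A minimal sketch (the filename is a placeholder):

    import xml.etree.ElementTree as ET

    root = ET.parse("design.svg").getroot()
    print(root.tag)  # expect '{http://www.w3.org/2000/svg}svg'
    for el in root.iter():
        print(el.tag.split("}")[-1])  # element names: rect, path, g, text, ...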

"Not having a free & open source UX/UI tool that would make devs participate in the design process and bridge the gap between UX/UI and code was a terrible itch for us..." explains the FAQ at Penpot.app. But it also answers the question: why Open Source? Software Technology has the unique advantage, compared to other industries and intellectual property, of having almost zero cost to replicate itself, thus providing a wonderful chance to massively distribute the tools for a more digitally sovereign society. Besides the pure license aspect of it and its legal framework, Open Source fosters more engaging communities where the lines between user and contributor are often blurred...

Penpot requires a browser, that's it. If you want to host your own Penpot instance, that's fine too. We plan to release a native app bundle later this year.

There is a theme here. Universal access. That's why we love to call our product Penpot, there's nothing more personal and yet more universal than a pot full of pens. It's all about choice.

Its GitHub repository already has 5,200 stars and 41 contributors.
Data Storage

Seagate Unveils First Ever PCIe NVMe HDD (techradar.com) 70

An anonymous reader quotes a report from TechRadar: Seagate has unveiled the first ever hard disk drive (HDD) that utilizes both the NVMe protocol and a PCIe interface, which have historically been used for solid state drives (SSDs) exclusively. As explained in a company blog post, the proof-of-concept HDD is based on a proprietary controller that plays nice with all major protocols (SAS, SATA and NVMe), without requiring a bridge. The NVMe HDD was demoed at the Open Compute Project Summit in a custom JBOD enclosure, with twelve 3.5-inch drives hooked up via a PCIe interface. Although the capacity of the drive is unconfirmed, Seagate used images of the Exos X18 for the presentation, which has a maximum capacity of 18TB.

According to Seagate, there are a number of benefits to bringing the NVMe protocol to HDDs, such as reduced total cost of ownership (TCO), performance improvements, and energy savings. Further, by creating consistency across different types of storage device, NVMe HDDs could drastically simplify datacenter configurations. While current HDDs are nowhere near fast enough to make full use of the latest PCIe standards, technical advances could mean SATA and SAS interfaces are no longer sufficient in the future. At that point, PCIe NVMe HDDs may become the default. That said, it will take a number of years for these hard drives to enter the mainstream. Seagate says it expects the first samples to be made available to a small selection of customers in Autumn next year, while full commercial rollout is slated for 2024 at the earliest.
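The consistency argument is already visible from the operating system's side: on Linux, every NVMe device is enumerated the same way regardless of the media behind it. A small sketch (assumes a reasonably recent kernel exposing /sys/class/nvme):

    from pathlib import Path

    sysfs = Path("/sys/class/nvme")
    if sysfs.exists():
        for ctrl in sorted(sysfs.glob("nvme*")):
            model = (ctrl / "model").read_text().strip()
            transport = (ctrl / "transport").read_text().strip()  # e.g. "pcie"
            print(f"{ctrl.name}: {model} via {transport}")

An NVMe HDD would show up in this same list, managed by the same driver stack and tools as any NVMe SSD.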

Privacy

Infrastructure Bill's Drunk Driving Tech Mandate Leaves Some Privacy Advocates Nervous (gizmodo.com) 138

An anonymous reader quotes a report from Gizmodo: The recently passed $1 trillion infrastructure package is jam-packed with initiatives, but sprinkled in there alongside $17 billion in funding for road safety programs is a mandate requiring carmakers to implement monitoring systems to identify and stop drunk drivers. The mandate, first noted by the Associated Press, could apply to new vehicles sold as early as 2026. Courts have ordered some drunk drivers to use breathalyzers attached to ignition interlocks to start their vehicles for years, but the technology noted in this bill would take that concept much further and would need to be capable of "passively monitor[ing] the performance of a driver of a motor vehicle to accurately identify whether that driver may be impaired."

Though the Department of Transportation has yet to put its foot down on the exact type of technology it will use for this program, the National Highway Traffic Safety Administration (NHTSA) and 17 automakers have been working on something called the Driver Alcohol Detection System for Safety (DADSS) since 2008. DADSS is exploring both a breath-based and a touch-based system to detect whether or not a driver has a blood alcohol concentration (BAC) at or above 0.08%. The breath-based system aims to measure alcohol readings based on a driver's breath, with the goal of distinguishing between the driver and passengers. The touch-based system, meanwhile, would shine an infrared light through a driver's fingertip to measure blood alcohol levels under the skin's surface. [...]

The new mandate struck a positive note with some car safety groups, including Mothers Against Drunk Driving, which has advocated for more detection tech in the past. "It's monumental," Alex Otte, national president of Mothers Against Drunk Driving, told the AP. Otte went on to describe the package as the "single most important legislation" in the group's history. At the same time, though, the mandate has drawn concerns from safety experts and digital rights groups that warn driver monitoring technology could have knock-on privacy implications. In a letter sent last year by the American Highway Users Alliance, the organization urged support of the NHTSA's DADSS Research Program but expressed concerns that the technology could potentially infringe on drivers' civil liberties.
"The group also expressed concerns over how the collection and storage of driver data would work and who would have the rights to that data," adds Gizmodo. Others have also expressed concerns over the accuracy of driving monitoring technology and potential risks of bias.
Earth

India Holds Back on Climate Pledge Until Rich Nations Pay $1 Trillion (bloomberg.com) 161

India has declined to update its official climate goal at the United Nations climate negotiations, holding out for rich countries to first offer $1 trillion in climate finance by the end of the decade. From a report: The resistance from India stands in contrast to its surprise announcement on Nov. 1, just as COP26 negotiations got underway, that it would set an ambitious new goal to reach net-zero emissions by 2070. Prime Minister Narendra Modi opened the talks in Glasgow, Scotland, with a decision to increase his nation's share of renewable electricity generation capacity alongside the long-term target to zero out carbon. At the same time, Modi demanded rich countries provide as much as $1 trillion in climate finance just for India -- far more than the $100 billion a year for all poor countries sought under previous deals. Until now, however, it wasn't clear whether India's demand came with a fixed timeline. Officials on Wednesday confirmed that India is seeking that sum by 2030 to fund the build-out of renewables, energy storage, decarbonization of the industrial sector, and the defense of infrastructure against a warming planet.
Microsoft

Microsoft's New $249 Surface Laptop SE is Its First True Chromebook Competitor (theverge.com) 26

Microsoft is going head to head with Chromebooks with a new $249 Surface Laptop SE, its most affordable Surface yet. While the software giant has attempted to compete with the popularity of Chrome OS in US schools for years, the Surface Laptop SE is the company's first true Chromebook competitor. From a report: Surface Laptop SE will be sold exclusively to schools and students, starting at $249. It's part of a much broader effort with Windows 11 SE, a new student edition designed to compete with Chrome OS that will ship on a range of low-cost laptops in the coming months. Surface Laptop SE is every bit the low-cost Windows device you'd expect to see for $249.

While it retains the same keyboard and trackpad found on Microsoft's Surface Laptop Go, the all-plastic body houses an 11.6-inch display running at just a 1366 x 768 resolution. This is the first 16:9 Surface device in more than seven years, after Microsoft switched to 3:2 for its Surface line with the Surface Pro 3 launch in 2014. The screen looks like the biggest drawback on this device, particularly as we weren't fans of the low-resolution screen (1536 x 1024) found on the $549 Surface Laptop Go. Lenovo's Chromebook Duet ships with a better 10.1-inch (1920 x 1200) display for the same $249 price as the Surface Laptop SE. Intel's Celeron N4020 or N4120 powers the Surface Laptop SE, combined with 4GB or 8GB of RAM and 64GB or 128GB of eMMC storage.
