AI

Tesla Autopilot Crisis Deepens With Loss of Third Autopilot Boss In 18 Months (arstechnica.com) 45

An anonymous reader quotes a report from Ars Technica: It is no secret that Tesla's Autopilot project is struggling. Last summer, we covered a report that Tesla was bleeding talent from its Autopilot division. Tesla Autopilot head Sterling Anderson quit Tesla at the end of 2016. His replacement was Chris Lattner, who had previously created the Swift programming language at Apple. But Lattner only lasted six months before departing last June. Now Lattner's replacement, Jim Keller, is leaving Tesla as well.

Keller was a well-known chip designer at AMD before he was recruited to lead Tesla's hardware engineering efforts for Autopilot in 2016. Keller has been working to develop custom silicon for Autopilot, potentially replacing the Nvidia chips being used in today's Tesla vehicles. When Lattner left Tesla last June, Keller was given broader authority over the Autopilot program as a whole. Keller's departure comes just weeks after the death of Walter Huang, a driver whose Model X vehicle slammed into a concrete lane divider in Mountain View, California. Tesla has said Autopilot was engaged at the time of the crash. Tesla has since gotten into public feuds with both Huang's family and the National Transportation Safety Board, the federal agency investigating the crash.
"Today is Jim Keller's last day at Tesla, where he has overseen low-voltage hardware, Autopilot software and infotainment," Tesla said in a statement to Electrek. "Prior to joining Tesla, Jim's core passion was microprocessor engineering, and he's now joining a company where he'll be able to once again focus on this exclusively."
XBox (Games)

Xbox One April Update Rolling Out With Low-Latency Mode, FreeSync, and 1440p Support; 120Hz Support Coming In May Update (theverge.com) 47

Microsoft is rolling out a new Xbox One update that brings 1440p support for the Xbox One S and X, as well as support for AMD's FreeSync technology to allow compatible displays to sync refresh rates with Microsoft's consoles. A subsequent update in May will bring 120Hz-display refresh-rate support to the Xbox One. The Verge reports: FreeSync, like Nvidia's G-Sync, helps remove tearing or stuttering usually associated with gaming on monitors, as the feature syncs refresh rates to ensure games run smoothly. Alongside this stutter-free tech, Microsoft is also supporting automatic switching to a TV's game mode. Auto Low-Latency Mode, as Microsoft calls it, will be supported on new TVs, and will automatically switch a TV into game mode to take advantage of the latency reductions. The Xbox One will also support disabling game mode when you switch to another app like Netflix. Microsoft is also making some audio tweaks with the April update for the Xbox One. New system sounds take advantage of spatial sound to fully support surround sound systems when you navigate around. Gamers who listen to music while playing can also now balance game audio against background music right inside the Xbox Guide. Other features in this update include sharing game clips directly to Twitter, dark-to-light mode transitions based on sunrise/sunset, and improvements to Microsoft Edge to let you download or upload pictures, music, and videos.
AMD

AMD Wants To Hear From GPU Resellers and Partners Bullied By Nvidia (forbes.com) 127

An anonymous reader quotes a report from Forbes: Nvidia may not be talking about its GeForce Partner Program, but AMD has gone from silent to proactive in less than 24 hours. Hours ago Scott Herkelman, Corporate VP and General Manager of AMD Radeon Gaming, addressed AMD resellers via Twitter, not only acknowledging the anti-competitive tactics Nvidia has leveraged against them, but inviting others to share their stories. The series of tweets coincides with an AMD sales event held in London this week. This was preceded by an impassioned blog post from Herkelman yesterday where he comes out swinging against Nvidia's GeForce Partner Program, and references other closed, proprietary technologies like G-Sync and GameWorks.

AMD's new mantra is "Freedom of Choice," a tagline clearly chosen to combat Nvidia's new program, which is slowly taking gaming GPU brands from companies like MSI and Gigabyte and locking them exclusively under the GeForce banner. The GeForce Partner Program also seems to threaten the business of board partners who are not aligned with the program. Here's what Herkelman -- a former GeForce marketing executive at Nvidia -- had to say on Twitter: "I wanted to personally thank all of our resellers who are attending our AMD sales event in London this week, it was a pleasure catching up with you and thank you for your support. Many of you told me how our competition tries to use funding and allocation to restrict or block [...] your ability to market and sell Radeon based products in the manner you and your customers desire. I want to let you know that your voices have been heard and that I welcome any others who have encountered similar experiences to reach out to me..."
The report adds that Kyle Bennett of HardOCP, the author who broke the original GPP story, "says that Nvidia is beginning a disinformation campaign against him, claiming that he was paid handsomely for publishing the story."
AMD

AMD 2nd Gen Ryzen Processors Launched and Benchmarked (hothardware.com) 106

MojoKid writes: AMD launched its 2nd Generation Ryzen processors today, based on a refined update to the company's Zen architecture, dubbed Zen+. The chips offer higher clocks, lower latencies, and a more intelligent Precision Boost 2 algorithm that improves performance, system responsiveness, and power efficiency characteristics. These new CPUs still leverage the existing AM4 infrastructure and are compatible with the same socket, chipsets, and motherboards as AMD's first-generation products, with a BIOS/UEFI update.

There are four processors arriving today: AMD's Ryzen 7 2700X, the Ryzen 7 2700, the Ryzen 5 2600X, and the Ryzen 5 2600. Ryzen 7 chips are still 8-core CPUs with 20MB of cache but now top out at 4.3GHz, while Ryzen 5 chips offer 6 cores with 19MB of cache and peak at 4.2GHz. AMD claims 2nd Gen Ryzen processors offer reductions in L1, L2, and L3 cache latencies of approximately 13%, 34%, and 16%, respectively. Memory latency is reportedly reduced by about 11%, and all of those improvements result in an approximate 3% increase in IPC (instructions per clock). The processors now also have official support for faster DDR4-2933 memory. In the benchmarks, 2nd Gen Ryzen CPUs outpaced AMD's first-gen chips across the board with better single- and multithreaded performance, closing the gap even further versus Intel, often with better or similar performance at lower price points. AMD 2nd Gen Ryzen processors, and new X470 chipset motherboards that support them, are available starting today, and the CPUs range from $199 to $329.
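Taken at face value, those numbers make the claimed gains easy to sanity-check: single-thread performance scales roughly as IPC times clock. A back-of-the-envelope sketch (the 4.0GHz first-gen boost baseline is an assumption for illustration, not an AMD figure):

```python
# Rough single-thread uplift estimate from the figures in the story.
# Assumption (not from AMD): a first-gen Ryzen 7 boosting to ~4.0GHz
# as the baseline; performance modeled simply as IPC x clock.

def relative_uplift(base_clock_ghz, new_clock_ghz, ipc_gain):
    """Return the fractional single-thread performance gain."""
    return (1 + ipc_gain) * (new_clock_ghz / base_clock_ghz) - 1

# ~3% IPC gain claimed, 4.3GHz peak vs an assumed 4.0GHz first-gen boost:
uplift = relative_uplift(4.0, 4.3, 0.03)
print(f"{uplift:.1%}")  # -> 10.7%
```

A ~10% single-thread gain from clocks plus IPC is consistent with the "closing the gap versus Intel" framing above.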

AMD

AMD Makes 2nd Gen Ryzen Processors Official With Availability Starting Next Week (hothardware.com) 63

MojoKid writes: Today AMD announced official details regarding its new mainstream second-generation Ryzen family of processors. Pricing and detailed specs show some compelling new alternatives from AMD and a refined family of chips to give Intel even more competition, especially considering price point. These new AMD CPUs are all based on the 12nm Zen+ architecture and, at least initially, include four SKUs. The Ryzen 7 family features 8 cores and 16 threads along with 20MB of cache. The Ryzen 7 2700 (65W) has a base clock of 3.2GHz and a turbo frequency of 4.1GHz. The top-of-the-line Ryzen 7 2700X (105W) ups the stakes with clocks of 3.7GHz and 4.3GHz respectively. The new Ryzen 5 family features six physical cores capable of executing 12 threads and 19MB of cache. The Ryzen 5 2600 (65W) has a base clock of 3.4GHz and a max boost frequency of 3.9GHz. The Ryzen 5 2600X (95W) ups those speeds to 3.6GHz and 4.2GHz respectively. AMD says that the Ryzen 5 2600, Ryzen 5 2600X, Ryzen 7 2700 and Ryzen 7 2700X will be available starting April 19th, priced at $199, $229, $299 and $329 respectively.
AMD

AMD Releases Spectre v2 Microcode Updates for CPUs Going Back To 2011 (bleepingcomputer.com) 54

Catalin Cimpanu, writing for BleepingComputer: AMD has released CPU microcode updates for processors affected by the Spectre variant 2 (CVE-2017-5715) vulnerability. The company has forwarded these microcode updates to PC and motherboard makers to include them in BIOS updates. Updates are available for products released as far back as 2011, starting with the first processors of the Bulldozer line. Microsoft has released KB4093112, an update that also includes special OS-level patches for AMD users regarding the Spectre v2 vulnerability. Similar OS-level updates were released for Linux users earlier this year. Yesterday's microcode patch announcement makes good on a promise AMD made to users in January, after the discovery of the Meltdown and Spectre (v1 and v2) vulnerabilities.
Graphics

Intel Reportedly Designing Arctic Sound Discrete GPU For Gaming, Pro Graphics (hothardware.com) 68

MojoKid shares a report from HotHardware: When AMD's former graphics boss Raja Koduri landed at Intel after taking a much-earned hiatus from the company, it was seen as a major coup for the Santa Clara chip outfit, one that seemed to signal that Intel might be aiming to compete in the discrete graphics card market. While nothing has been announced in that regard, some analysts are claiming that there will indeed be a gaming variant of Intel's upcoming discrete "Arctic Sound" GPU. According to reports, Intel originally planned to build Arctic Sound graphics chips mainly for video streaming chores and data center activities. However, claims are surfacing that the company has since decided to build out a gaming variant at the behest of Koduri, who wants to "enter the market with a bang." Certainly a gaming GPU that could compete with AMD and NVIDIA would accomplish that goal. Reportedly, Intel could pull together two different versions of Arctic Sound: one an integrated chip package, like the Core i7-8809G (Kaby Lake-G) but with Intel's own discrete graphics, and the other a standalone chip that will end up in traditional graphics cards. Likely both of those will have variants designed for gaming, just as AMD and NVIDIA build GPUs for professional use and gaming as well.
Media

Ask Slashdot: How Do You Stream/Capture Video? 155

datavirtue writes: I am starting to look at capturing and streaming video, specifically video games in 4K at 60 frames per second. I have a Windows 10 box with a 6GB GTX 1060 GPU and a modern AMD octa-core CPU recording with Nvidia ShadowPlay. This works flawlessly, even in 4K at 60 fps. ShadowPlay produces MP4 files which play nice locally but seem to take a long time to upload to YouTube -- a 15-minute 4K 60fps video took almost three hours. Which tools are you fellow Slashdotters using to create, edit, and upload video in the most efficient manner?
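The submitter's three-hour upload is mostly simple arithmetic about bitrates (the ~130 Mbps ShadowPlay 4K60 recording bitrate and ~11 Mbps home upstream below are assumptions for illustration, not measured figures):

```python
# Rough upload-time math; the bitrates here are illustrative assumptions.
# ShadowPlay records 4K60 at very high bitrates (on the order of 130 Mbps),
# while a typical home connection might only have ~11 Mbps upstream.

def upload_hours(video_mbps, minutes, upstream_mbps):
    """Hours needed to upload a recording of the given bitrate and length."""
    total_megabits = video_mbps * minutes * 60
    return total_megabits / upstream_mbps / 3600

hours = upload_hours(130, 15, 11)
print(f"{hours:.1f} h")  # roughly three hours, matching the report above
```

Re-encoding to a lower bitrate before upload is the usual workaround, since YouTube re-encodes everything on its end anyway.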
Privacy

Ask Slashdot: Why Are There No True Dual-System Laptops Or Tablet Computers? 378

dryriver writes: This is not a question about dual-booting OSs -- having 2 or more different OSs installed on the same machine. Rather, imagine that I'm a business person or product engineer or management consultant with a Windows 10 laptop that has confidential client emails, word documents, financial spreadsheets, product CAD files or similar on it. Business stuff that needs to stay confidential per my employment contract or NDAs or any other agreement I may have signed. When I have to access the internet from an untrusted internet access point that somebody else controls -- free WiFi in a restaurant, cafe or airport lounge in a foreign country for example -- I do not want my main Win 10 OS, Intel/AMD laptop hardware or other software exposed to this untrusted internet connection at all. Rather, I want to use a 2nd and completely separate System On Chip, or SOC, inside my laptop running Linux or Android to do my internet accessing. In other words, I want to be able to switch to a small 2nd standalone Android/Linux computer inside my Windows 10 laptop, so that I can do my emailing and internet browsing just about anywhere without any worries at all, because in that mode, only the small SOC hardware and its RAM is exposed to the internet, not any of the rest of my laptop or tablet. A hardware switch on the laptop casing would let me turn the 2nd SOC computer on when I need to use it, and it would take over the screen, trackpad and keyboard when used. But the SOC computer would have no physical connection at all to my main OS, BIOS, CPU, RAM, SSD, USB ports and so on. Does something like this exist at all (if so, I've never seen it...)? And if not, isn't this a major oversight? Wouldn't it be worth sticking a $200 Android or Linux SOC computer into a laptop if that enables you to access the internet anywhere, without any worries that your main OS and hardware can be compromised by third parties while you do this?
Graphics

Ask Slashdot: How Did Real-Time Ray Tracing Become Possible With Today's Technology? 145

dryriver writes: There are occasions where multiple big tech manufacturers all announce the exact same innovation at the same time -- e.g. 4K UHD TVs. Everybody in broadcasting and audiovisual content creation knew that 4K/8K UHD and high dynamic range (HDR) were coming years in advance, and that all the big TV and screen manufacturers were preparing 4K UHD HDR product lines because FHD was beginning to bore consumers. It came as no surprise when everybody had a 4K UHD product announcement and demo ready at the same time. Something very unusual happened this year at GDC 2018, however. Multiple companies -- Microsoft, Nvidia, and AMD among them -- as well as game developers and game engine makers, all announced that real-time ray tracing is coming to their mass-market products, and by extension, to computer games, VR content and other realtime 3D applications.

Why is this odd? Because for many years any mention of 30+ FPS real-time ray tracing was thought to be utterly impossible with today's hardware technology. It was deemed far too computationally intensive for today's GPU technology and far too expensive for anything mass market. Gamers weren't screaming for the technology. Technologists didn't think it was doable at this point in time. Raster 3D graphics -- what we have in DirectX, OpenGL and game consoles today -- was very, very profitable and could easily have evolved further the way it has for another 7 to 8 years. And suddenly there it was: everybody announced at the same time that real-time ray tracing is not only technically possible, but also coming to your home gaming PC much sooner than anybody thought. Working tech demos were shown. What happened? How did real-time ray tracing, which only a few 3D graphics nerds and researchers in the field talked about until recently, suddenly become so technically possible, economically feasible, and so guaranteed-to-be-profitable that everybody announced this year that they are doing it?
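For readers wondering what the technique actually computes: a ray tracer fires a ray through each pixel and intersects it against scene geometry, and the classic ray-sphere test reduces to solving a quadratic. A minimal sketch of that core operation (illustrative only, not any vendor's implementation):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t to the nearest intersection, or None.
    direction must be a unit vector; solves |o + t*d - c|^2 = r^2."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * e for d, e in zip(direction, oc))
    c_term = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4 * c_term      # discriminant; a == 1 for a unit direction
    if disc < 0:
        return None                # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2 # nearer of the two roots
    return t if t > 0 else None

# Ray from the origin along +z toward a unit sphere centered at z=5:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1))  # -> 4.0
```

The expense is that a real scene needs millions of such tests per frame (plus shadow and bounce rays), which is why dedicated hardware and denoising were needed before 30+ FPS became plausible.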
Graphics

A New Era For Linux's Low-level Graphics (collabora.com) 61

Slashdot reader mfilion writes: Over the past couple of years, Linux's low-level graphics infrastructure has undergone a quiet revolution. Since experimental core support for the atomic modesetting framework landed a couple of years ago, the DRM subsystem in the kernel has seen roughly 300,000 lines of code changed and 300,000 new lines added, when the new AMD driver (~2.5m lines) is excluded. Lately Weston has undergone the same revolution, albeit on a much smaller scale. Here, Daniel Stone, Graphics Lead at Collabora, puts the spotlight on the latest enhancements to Linux's low-level graphics infrastructure, including Atomic modesetting, Weston 4.0, and buffer modifiers.
AMD

Linux Mint Ditches AMD For Intel With New Mintbox Mini 2 (betanews.com) 46

An anonymous reader writes: The makers of the Mintbox, a diminutive desktop that runs Linux Mint (an Ubuntu-based OS), on Friday announced the Mintbox Mini 2. While the new model has several new aspects, the most significant is that the Linux Mint team has switched from AMD to Intel (the original Mini used an A4-Micro 6400T). For $299, the Mintbox Mini 2 comes with a quad-core Intel Celeron J3455 processor, 4GB of RAM, and a 60GB SSD. For $50 more you can opt for the "Pro" model, which doubles the RAM to 8GB and increases the SSD capacity to 120GB. Graphics are fairly anemic, as it uses integrated Intel HD 500, but come on -- you shouldn't expect to game with this thing. For video connectivity, you get both HDMI and Mini DisplayPort. Both can push 4K, and while the Mini DisplayPort can do 60Hz, the HDMI is limited to 30Hz.
AMD

AMD Says Patches Coming Soon For Chip Vulnerabilities (securityweek.com) 84

wiredmikey writes: After investigating recent claims from a security firm that its processors are affected by more than a dozen serious vulnerabilities, chipmaker Advanced Micro Devices (AMD) says patches are coming to address several security flaws in its chips. In its first public update after the surprise disclosure of the vulnerabilities by Israeli-based security firm CTS Labs, AMD said the issues are associated with the firmware managing the embedded security control processor in some of its products (AMD Secure Processor) and the chipset used in some socket AM4 and socket TR4 desktop platforms supporting AMD processors.

AMD said that patches will be released through BIOS updates to address the flaws, which have been dubbed MASTERKEY, RYZENFALL, FALLOUT and CHIMERA. The company said that no performance impact is expected for any of the forthcoming mitigations.

Security

Linus Torvalds Slams CTS Labs Over AMD Vulnerability Report (zdnet.com) 115

Earlier this week, CTS Labs, a Tel Aviv-based cybersecurity startup, claimed it has discovered critical security flaws in AMD chips that could allow attackers to access sensitive data from highly guarded processors across millions of devices. Linus Torvalds, Linux's creator, doesn't buy it. ZDNet reports: Torvalds, in a Google+ discussion, wrote: "When was the last time you saw a security advisory that was basically 'if you replace the BIOS or the CPU microcode with an evil version, you might have a security problem?' Yeah." Or, as a commenter put it on the same thread, "I just found a flaw in all of the hardware space. No device is secure: if you have physical access to a device, you can just pick it up and walk away. Am I a security expert yet?" CTS Labs claimed in an interview they gave AMD less than a day because they didn't think AMD could fix the problem for "many, many months, or even a year" anyway. Why would they possibly do this? For Torvalds: "It looks more like stock manipulation than a security advisory to me."

These are real bugs, though. Dan Guido, CEO of Trail of Bits, a security company with a proven track record, tweeted: "Regardless of the hype around the release, the bugs are real, accurately described in their technical report (which is not public afaik), and their exploit code works." But, Guido also admitted, "Yes, all the flaws require admin [privileges] but all are flaws, not expected functionality." It's that last part that ticks Torvalds off. The Linux creator agrees these are bugs, but all the hype annoys the heck out of him. Are there bugs? Yes. Do they matter in the real world? No. They require a system administrator to be almost criminally negligent to work. To Torvalds, inflammatory security reports are annoying distractions from getting real work done.

Security

Can AMD Vulnerabilities Be Used To Game the Stock Market? (vice.com) 106

Earlier this week, a little-known security firm called CTS Labs reported what it claimed were severe vulnerabilities and backdoors in some AMD processors. While AMD looks into the matter, the story behind the researchers' discovery and the way they made it public has become a talking point in security circles. The researchers, who work for CTS Labs, only reported the flaws to AMD shortly before publishing their report online. Typically, researchers give companies a few weeks or even months to fix the issues before going public with their findings. To make things even stranger, a little over 30 minutes after CTS Labs published its report, a controversial financial firm called Viceroy Research published what it called an "obituary" for AMD. Motherboard reports: "We believe AMD is worth $0.00 and will have no choice but to file for Chapter 11 (Bankruptcy) in order to effectively deal with the repercussions of recent discoveries," Viceroy wrote in its report. CTS Labs seemed to hint that it too had a financial interest in the performance of AMD stock. "We may have, either directly or indirectly, an economic interest in the performance of the securities of the companies whose products are the subject of our reports," CTS Labs wrote in the legal disclaimer section of its report.

On Twitter, rumors started to swirl. Are the researchers trying to make money by betting that AMD's share price will go down due to the news of the vulnerabilities? Or, in Wall Street jargon, were CTS Labs and Viceroy trying to short sell AMD stock? Security researcher Arrigo Triulzi speculated that Viceroy and CTS Lab were profit sharing for shorting, while Facebook's chief security officer Alex Stamos warned against a future where security research is driven by short selling.

[...] There's no evidence that CTS Labs worked with Viceroy to short AMD. But something like that has happened before. In 2016, security research firm MedSec found vulnerabilities in pacemakers made by St. Jude Medical. In what was likely a first, MedSec partnered with hedge fund Muddy Waters to bet against St. Jude Medical's stock. For Adrian Sanabria, director of research at security firm Threatcare and a former analyst at 451 Research, where he covered the cybersecurity industry, trying to short based on vulnerabilities just doesn't make much sense. While it could work in theory and could become more common in the future, he said in a phone call, "I don't think we've seen enough evidence of security vulnerabilities really moving the stock for it to really become an issue."
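For readers unfamiliar with the jargon, the mechanics of a short sale are simple arithmetic; the share counts and prices below are hypothetical, not AMD's actual figures:

```python
# Mechanics of a short sale, with hypothetical numbers (not AMD's real prices).
# A short seller borrows shares, sells them now, and must buy them back
# ("cover") later -- profiting only if the price falls in between.

def short_profit(shares, sell_price, cover_price):
    """Profit (or loss, if negative) on a covered short position."""
    return shares * (sell_price - cover_price)

# Short 1,000 shares at $11.50; bad news drops the stock to $9.00:
print(short_profit(1000, 11.50, 9.00))   # -> 2500.0
# If the report is shrugged off and the stock rises to $12.50 instead:
print(short_profit(1000, 11.50, 12.50))  # -> -1000.0
```

This is why Sanabria's point matters: the scheme only pays if the disclosure actually moves the stock, and so far vulnerability news rarely has.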
Further reading: Linus Torvalds slams CTS Labs over AMD vulnerability report (ZDNet).
AMD

Researchers Find Critical Vulnerabilities in AMD's Ryzen and EPYC Processors, But They Gave the Chipmaker Only 24 Hours Before Making the Findings Public (cnet.com) 195

Alfred Ng, reporting for CNET: Researchers have discovered critical security flaws in AMD chips that could allow attackers to access sensitive data from highly guarded processors across millions of devices. Particularly worrisome is the fact that the vulnerabilities lie in the so-called secure part of the processors -- typically where your device stores sensitive data like passwords and encryption keys. It's also where your processor makes sure nothing malicious is running when you start your computer. CTS-Labs, a security company based in Israel, announced Tuesday that its researchers had found 13 critical security vulnerabilities that would let attackers access data stored on AMD's Ryzen and EPYC processors, as well as install malware on them. Ryzen chips power desktop and laptop computers, while EPYC processors are found in servers. The researchers gave AMD less than 24 hours to look at the vulnerabilities and respond before publishing the report. Standard vulnerability disclosure calls for 90 days' notice so that companies have time to address flaws properly. An AMD spokesperson said: "At AMD, security is a top priority and we are continually working to ensure the safety of our users as new risks arise. We are investigating this report, which we just received, to understand the methodology and merit of the findings." Zack Whittaker, a security reporter at CBS, said: "Here's the catch: AMD had less than a day to look at the research. No wonder why its response is so vague."
Bitcoin

Qarnot Unveils a Cryptocurrency Heater For Your Home (techcrunch.com) 65

Qarnot, the French startup known for using Ryzen Pro processors to heat homes and offices for free, is unveiling a new computing heater specifically made for cryptocurrency mining. "The QC1 is a heater for your home that features a passive computer inside," reports TechCrunch. "And this computer is optimized for mining." From the report: The QC1 features two AMD GPUs (Sapphire Nitro+ Radeon RX580 with 8GB of VRAM) and is designed to mine Ether by default. You can set it up in a few minutes by plugging in an Ethernet cable and putting your Ethereum wallet address in the mobile app. You'll then gradually receive ether at this address -- Qarnot doesn't receive any coin, you keep 100 percent of your cryptocurrencies. If you believe Litecoin or another cryptocurrency is the future, you can also access the computer and mine another cryptocurrency. It's a Linux server and you can access it directly. If your home is cold and you desperately need to turn on the heaters, the QC1 is going to turn on the two GPUs and mine at a 60 MH/s speed. There are also traditional heating conductors in case those two GPUs are not enough. Qarnot heaters don't have any hard drive and rely on passive heating. You won't hear any fan buzzing in the background. You can order the QC1 for $3,600 starting today -- you can also pay in bitcoins. The company hopes to sell hundreds of QC1 units in the next year.
Bug

How Are Sysadmins Handling Spectre/Meltdown Patches? (hpe.com) 49

Esther Schindler (Slashdot reader #16,185) writes that the Spectre and Meltdown vulnerabilities have become "a serious distraction" for sysadmins trying to apply patches and keep up with new fixes, sharing an HPE article described as "what other sysadmins have done so far, as well as their current plans and long-term strategy, not to mention how to communicate progress to management." Everyone has applied patches. But that sounds ever so simple. Ron, an IT admin, summarizes the situation succinctly: "More like applied, applied another, removed, I think re-applied, I give up, and have no clue where I am anymore." That is, sysadmins are ready to apply patches -- when a patch exists. "I applied the patches for Meltdown but I am still waiting for Spectre patches from manufacturers," explains an IT pro named Nick... Vendors have released, pulled back, re-released, and re-pulled back patches, explains Chase, a network administrator. "Everyone is so concerned by this that they rushed code out without testing it enough, leading to what I've heard referred to as 'speculative reboots'..."

The confusion -- and rumored performance hits -- are causing some sysadmins to adopt a "watch carefully" and "wait and see" approach... "The problem is that the patches don't come at no cost in terms of performance. In fact, some patches have warnings about the potential side effects," says Sandra, who recently retired from 30 years of sysadmin work. "Projections of how badly performance will be affected range from 'You won't notice it' to 'significantly impacted.'" Plus, IT staff have to look into whether the patches themselves could break something. They're looking for vulnerabilities and running tests to evaluate how patched systems might break down or be open to other problems.

The article concludes that "everyone knows that Spectre and Meltdown patches are just Band-Aids," with some now looking at buying new servers. One university systems engineer says "I would be curious to see what the new performance figures for Intel vs. AMD (vs. ARM?) turn out to be."
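On Linux, one low-effort way to see where a given machine stands is the sysfs interface that early-2018 kernels added for exactly this purpose (a sketch; older kernels won't expose these files, and the helper below handles that case):

```python
import os

# Kernel 4.15+ exposes per-vulnerability status files here.
SYSFS_DIR = "/sys/devices/system/cpu/vulnerabilities"

def parse_status(text):
    """Classify one sysfs vulnerability string as 'not affected',
    'mitigated', or 'vulnerable'."""
    text = text.strip()
    if text.startswith("Not affected"):
        return "not affected"
    if text.startswith("Mitigation"):
        return "mitigated"
    return "vulnerable"

def report():
    """Print the mitigation status per vulnerability, if exposed."""
    if not os.path.isdir(SYSFS_DIR):
        print("kernel does not expose vulnerability status")
        return
    for name in sorted(os.listdir(SYSFS_DIR)):
        with open(os.path.join(SYSFS_DIR, name)) as f:
            print(f"{name}: {parse_status(f.read())}")

if __name__ == "__main__":
    report()
```

Unlike chasing vendor BIOS release notes, this reflects what the running kernel actually believes about microcode plus OS-level mitigations combined.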
Software

Ask Slashdot: Could Linux Ever Become Fully Compatible With Windows and Mac Software? 359

dryriver writes: Linux has been around for a long time now. A lot of work has gone into it; it has evolved nicely and it dominates in the server space. Computer-literate people with some tech skills also like to use it as their desktop OS. It's free and open source. It's not vendor-locked, full of crapware or tied to any walled garden. It's fast and efficient. But most "everyday computer users" or "casual computer buyers" still feel they have to choose either a Windows PC or an Apple device as the platform they will do their computing on. This binary choice exists largely because a very specific list of commercial programs and games is available for these OSs that is not available for Linux.

Here is the question: Could Linux ever be made to become fully compatible with all Windows and Mac software? What I mean is a Linux distro that lets you successfully install/run/play just about anything significant that says "for Windows 10" or "for OSX" under Linux, without any sort of configuring or crazy emulation orgies being needed? Macs and PCs run on the exact same Intel/AMD/Nvidia hardware as Linux. Same mobos, same CPUs and GPUs, same RAM and storage devices. Could Linux ever be made to behave sufficiently like those two OSs so that a computer buyer could "go Linux" without any negative consequences like not being able to run essential Windows/Mac software at all? Or is Linux being able to behave like Windows and OSX simply not technically doable because Windows and OSX are just too damn complex to mimic successfully?
Businesses

To Combat Shortage, Nvidia Asks Retailers To Limit Graphics Card Orders (pcmag.com) 212

An anonymous reader writes: If you're a PC builder -- or your aging desktop system is in dire need of some modern upgrades -- you've probably wondered why it's impossible to get a graphics card lately. You can thank the outrageous interest in cryptocurrency for all of this. Since graphics cards mine cryptocurrency much faster than CPUs, an eager community of get-rich-quick enthusiasts are scooping up graphics cards as fast as they can get them. While there isn't much that major manufacturers AMD and Nvidia can do about the overwhelming demand for GPUs, Nvidia is at least trying to let retailers know that they should be holding their stock for the company's core audience: gamers, not miners. "For NVIDIA, gamers come first. All activities related to our GeForce product line are targeted at our main audience. To ensure that GeForce gamers continue to have good GeForce graphics card availability in the current situation, we recommend that our trading partners make the appropriate arrangements to meet gamers' needs as usual," reads a translated statement from Nvidia's Boris Bohles. Nvidia is suggesting that retailers limit graphics card orders to just two per person, but that's just an idea -- one Nvidia can't actually enforce beyond restricting sales on its website, which it's currently doing. Further reading: It's a terrible time to buy a graphics card.
