Space

Starlight Could Really Be a Vast Alien Quantum Internet, Physicist Proposes (vice.com) 76

Terry Rudolph, a professor of quantum physics at Imperial College London, suggests that interstellar light could actually be harnessed by spacefaring aliens to form an encrypted quantum internet. Motherboard reports: This may sound like the stuff of science fiction, but Rudolph says it is a natural extension of what he does as co-founder of PsiQuantum, a Silicon Valley-based company on a mission to build a scalable photonic quantum computer. He laid out his idea in a paper recently published on the arXiv preprint server. Rudolph said the idea for the paper on aliens communicating with quantum starlight flowed from his work on quantum computers. Unlike the quantum computers being pursued by the likes of Google or Intel, which use superconducting circuits or trapped ions at incredibly cold temperatures to create qubits (the quantum equivalent of a computer bit), photonic computers use light to accomplish the same thing. While Rudolph says this kind of quantum design is unconventional, it also has advantages over its rivals -- including the ability to operate at room temperature and easier integration into existing fiber-optic infrastructure.

The primary way the aliens would create this kind of quantum internet is through a quantum mechanics principle called entanglement, explains Rudolph. In a nutshell, entanglement is a phenomenon in which the quantum states of particles (like photons) are linked together. This is what Einstein referred to as "spooky action at a distance," and it means that disturbing one particle will automatically affect its partner, even if they're miles apart. This entanglement would allow aliens -- or even humans -- to send encrypted signals between entangled partners, or nodes. Now, scale that kind of system up to a network potentially spanning the entire cosmos.
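
As a concrete illustration of entanglement in standard textbook notation (this equation is my addition, not something from Rudolph's paper or the article), a pair of photons can be prepared in a maximally entangled Bell state:

\[
|\Phi^{+}\rangle \;=\; \tfrac{1}{\sqrt{2}}\bigl(\,|00\rangle + |11\rangle\,\bigr)
\]

Measuring either photon immediately fixes the outcome of the corresponding measurement on its partner, and any outside measurement disturbs the state in a detectable way, which is the property that entanglement-based key distribution relies on.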

Aliens aside, Rudolph says that his paper demonstrates that building a photon-based quantum internet here on Earth might be "much easier than we expected." As for the aliens, even if they were using this kind of technology to transform waves of light into their own personal chat rooms, we'd have no way of knowing, says Rudolph. And even if we could pick out these light patterns in the sky, we still wouldn't be able to listen in. This is due to the incredibly shy nature of quantum particles -- any attempt to observe them by an outside party would alter their state and destroy the information they were carrying.

China

US Intel Agencies Are Reviewing Genetic Data From Wuhan Lab (cnn.com) 145

ytene writes: CNN is claiming an exclusive scoop, with an article reporting that U.S. intelligence agencies have scored a massive trove of Covid-19 genetic data, which, CNN suggests, comes from the Wuhan research lab. Beyond the complex challenge of absorbing and understanding the "mountain" of raw data, U.S. researchers will also have to translate the material from Mandarin before the real work can begin. Whilst there has obviously been a lot of interest in clearly identifying the source of the virus, it isn't clear how such a revelation could have a material impact on the efficacy of vaccines or the uptake of vaccination. It might, however, give useful clues to help understand where or how the next deadly outbreak could develop. "It's unclear exactly how or when U.S. intelligence agencies gained access to the information, but the machines involved in creating and processing this kind of genetic data from viruses are typically connected to external cloud-based servers -- leaving open the possibility they were hacked," notes CNN, citing multiple people familiar with the matter.

The report also notes that senior intelligence officials are "genuinely split between the two prevailing theories on the pandemic's origins." The World Health Organization says wildlife farms in southern China are the most likely source of the COVID-19 pandemic, but the theory that the virus accidentally escaped from a lab in Wuhan is still being investigated. According to a CNN report last month, "[S]enior Biden administration officials overseeing the 90-day review now believe the theory that the virus accidentally escaped from a lab in Wuhan is at least as credible as the possibility that it emerged naturally in the wild -- a dramatic shift from a year ago, when Democrats publicly downplayed the so-called lab leak theory."
AMD

AMD Ryzen 5000G Series Launches With Integrated Graphics At Value Price Points (hothardware.com) 69

MojoKid writes: AMD is taking the wraps off its latest integrated processors, the Ryzen 7 5700G and the Ryzen 5 5600G. As their branding suggests, these new products are based on the same excellent AMD Zen 3 core architecture, but with integrated graphics capabilities on board as well, hence the "G" designation. AMD is targeting more mainstream applications with these chips. The Ryzen 7 5700G is an 8-core/16-thread CPU with 4MB of L2 cache and 16MB of L3. Those CPU cores are mated to an 8 CU (Compute Unit) Radeon Vega graphics engine, and it has 24 lanes of PCIe Gen 3 connectivity. The 5700G's base CPU clock is 3.8GHz, with a maximum boost clock of 4.6GHz. The on-chip GPU can boost up to 2GHz, which is a massive uptick from the 1.4GHz of previous-gen 3000-series APUs.

The Ryzen 5 5600G takes things down a notch with 6 CPU cores (12 threads) and a smaller 3MB L2 cache while L3 cache size remains unchanged. The 5600G's iGPU is scaled down slightly as well with only 7 CUs. At 3.9GHz, the 5600G's base CPU clock is 100MHz higher than the 5700G's, but its max boost lands at 4.4GHz with a slightly lower GPU boost clock of 1.9GHz. In the benchmarks, the Ryzen 5 5600G and Ryzen 7 5700G both offer enough multi-threaded muscle for the vast majority of users, often besting similar Intel 11th Gen Core series chips, with highly competitive single-thread performance as well.

Desktops (Apple)

Mac Pro Gets a Graphics Update (sixcolors.com) 23

On Tuesday, Apple rolled out three new graphics card modules for the Intel-based Mac Pro, all based on AMD's Radeon Pro W6000 series GPUs. From a report: (Apple posted a Mac Pro performance white paper [PDF] to celebrate.) The new modules (in Apple's MPX format) come in three variants: one with a single Radeon Pro W6800X, one with two W6800X GPUs, and one with the W6900X. Each module also adds four Thunderbolt 3 ports and an HDMI 2 port to the Mac Pro. The Mac Pro supports two MPX modules, so you could pop in two of the dual-GPU modules to max out performance. The GPUs can be linked using AMD's Infinity Fabric Link, which lets up to four GPUs communicate with one another via a super-fast connection with much more bandwidth than is available over the PCIe bus.
AMD

AMD and Valve Working On New Linux CPU Performance Scaling Design (phoronix.com) 10

Along with other optimizations to benefit the Steam Deck, AMD and Valve have been jointly working on CPU frequency/power scaling improvements to enhance the Steam Play gaming experience on modern AMD platforms running Linux. Phoronix reports: It's no secret that the ACPI CPUFreq driver code has at times been less than ideal on recent AMD processors, delivering lower-than-expected performance, being slow to ramp up to a higher performance state, or otherwise coming up short compared to disabling the power management functionality outright. AMD hasn't traditionally worked on the Linux CPU frequency scaling code as much as Intel does on its P-State scaling driver and other areas of power management at large. AMD is ramping up efforts in these areas, including around the Linux scheduler, given its recent hiring spree, and it now looks like the Steam Deck has created renewed interest in better optimizing CPU frequency scaling under Linux.

AMD and Valve have been working to improve performance/power efficiency for modern AMD platforms running Steam Play (Proton / Wine). In their words, "[The ACPI CPUFreq driver] was not very performance/power efficiency for modern AMD platforms," and the two have spearheaded "a new CPU performance scaling design for AMD platform which has better performance per watt scaling on such as 3D game like Horizon Zero Dawn with VKD3D-Proton on Steam." AMD will be presenting more about this effort next month at XDC. It's quite possible this new effort is focused on ACPI CPPC support with the previously proposed AMD_CPUFreq. Back when Zen 2 launched in 2019, AMD did post patches for a new CPUFreq driver that leveraged ACPI Collaborative Processor Performance Controls (CPPC), but the driver was never mainlined nor were any further iterations of the patches posted. When asked about that work a few times since then, AMD has said it basically came down to resource constraints and that it wasn't a focus at the time. Upstream kernel developers also voiced a preference for AMD improving the generic ACPI CPPC CPUFreq driver code rather than adding another vendor-specific solution. It's also possible AMD has been working on improvements around the now-default schedutil governor, which uses scheduler utilization data to make CPU frequency scaling decisions.
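
For context, the scaling driver and governor discussed above are visible on any Linux system through the standard cpufreq sysfs interface. Below is a minimal Python sketch for checking them; it assumes only the standard sysfs paths and has nothing to do with AMD's or Valve's unreleased code:

```python
from pathlib import Path

# Standard Linux cpufreq sysfs location for the first CPU; other CPUs live at .../cpuN/cpufreq.
CPUFREQ = Path("/sys/devices/system/cpu/cpu0/cpufreq")

def read_attr(name: str) -> str:
    """Read one cpufreq attribute, or report that it is unavailable on this system."""
    try:
        return (CPUFREQ / name).read_text().strip()
    except OSError:
        return "<unavailable>"

if __name__ == "__main__":
    # 'acpi-cpufreq' is the driver the article discusses; a new design would report a different name.
    print("scaling driver:      ", read_attr("scaling_driver"))
    # 'schedutil' is the now-default governor mentioned above.
    print("scaling governor:    ", read_attr("scaling_governor"))
    print("available governors: ", read_attr("scaling_available_governors"))
```

On a typical AMD system of this era it will report the acpi-cpufreq driver and the schedutil governor mentioned in the article; a machine using a different driver or governor will simply print those names instead.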

Google

Google Will Abandon Qualcomm and Build Its Own Smartphone Processors This Year (cnbc.com) 57

Google announced Monday it will build its own smartphone processor, called Google Tensor, that will power its new Pixel 6 and Pixel 6 Pro phones this fall. From a report: It's another example of a company building its own chips to deliver what it felt wasn't possible with the processors already on the market. In this case, Google is ditching Qualcomm. The move follows Apple, which is using its own processors in its new computers instead of Intel chips. And like Apple, Google is using an Arm-based architecture. Arm processors are lower power and are used across the industry for mobile devices, from phones to tablets and laptops.

Google Tensor will power new flagship phones that are expected to launch in October. (Google will reveal more details about those phones closer to launch.) That, too, is a strategy shift for Google, which in recent years has focused on affordability in its Pixel devices instead of offering high-end phones. And it shows that Google is again trying to compete directly in the flagship space against Apple and Samsung. The name Google Tensor is a nod to the Tensor Processing Unit the company uses for cloud computing. It's a full system on a chip, or SoC, that the company says will offer big improvements to photo and video processing on phones, along with features like speech-to-text and translation. And it includes a dedicated processor that runs artificial intelligence applications, in addition to a CPU, GPU and image signal processor. It will allow the phone to process more information on the device instead of having to send data to the cloud.
Further reading: Google's New Pixel Phones Feature a Processor Designed In-House.
Intel

Intel Executive Posts Thunderbolt 5 Photo Then Deletes It (anandtech.com) 22

AnandTech: An executive visiting various research divisions across the globe isn't necessarily new, but a focus on social media -- which drives named individuals at each company to keep their followers sitting on the edge of their seats -- means that we get a lot more insight into how these companies operate. The downside of posting to social media is when certain images exposing unreleased information are not vetted by PR or legal, and we get a glimpse into the next generation of technology. That is what happened over the weekend.

EVP and GM of Intel's Client Computing Group, Gregory Bryant, last week spent some time at Intel's Israel R&D facilities in his first overseas Intel trip of 2021. An early post on Sunday morning, showcasing Bryant's trip to the gym to overcome jetlag, was followed by another later in the day showing Bryant touring the offices and the research. The post contained four photos, but it was rapidly deleted and replaced by a version with only three. The removed photo showcased some new information about next-generation Thunderbolt technology. In the image we can see a poster on the wall advertising '80G PHY Technology,' which means that Intel is working on a physical layer (PHY) for 80 Gbps connections. Off the bat, this is double the bandwidth of Thunderbolt 4, which runs at 40 Gbps.

The second line, 'USB 80G is targeted to support the existing USB-C ecosystem,' indicates that Intel is aiming to keep the USB-C connector but double the effective bandwidth. The third line is where it gets technically interesting: 'The PHY will be based on novel PAM-3 modulation technology.' This is about how the 0s and 1s are transmitted -- traditionally we talk about NRZ encoding, which transmits either a 0 or a 1, i.e. a single bit per symbol. The natural progression is a scheme that transfers two bits per symbol, called PAM-4 (Pulse Amplitude Modulation), with the 4 denoting the number of signal levels needed to represent the four possible two-bit patterns (00, 01, 10, or 11). PAM-4, at the same frequency, thus has 2x the bandwidth of an NRZ connection.
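
To make the bandwidth comparison concrete, here is a small sketch of bits per symbol for each scheme (my own illustration, not part of the AnandTech piece; how Intel actually maps bits onto PAM-3 symbols is not specified in the leaked slide):

```python
from math import log2

# Bits per symbol for a pulse-amplitude modulation scheme = log2(number of signal levels).
schemes = {"NRZ (2 levels)": 2, "PAM-3 (3 levels)": 3, "PAM-4 (4 levels)": 4}

for name, levels in schemes.items():
    bits_per_symbol = log2(levels)
    # Relative throughput at a fixed symbol rate, normalized to NRZ (1 bit/symbol).
    print(f"{name}: {bits_per_symbol:.2f} bits/symbol, {bits_per_symbol:.2f}x NRZ throughput")
```

The log2(3) ≈ 1.58 bits per symbol for PAM-3 is a theoretical ceiling; practical PAM-3 encodings typically group symbols (for example, packing three bits into two symbols), so the usable rate lands slightly lower, but still well above NRZ at the same signaling frequency.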

Intel

TSMC Will Start Making 2nm Chips As Intel Tries To Catch Up (gizmodo.com) 83

Kekke writes: "Taiwan Semiconductor Manufacturing Co.'s new foundry will produce 2-nanometer chips," reports Gizmodo. "Construction on the plant in Hsinchu, southwest of Taiwan's capital of Taipei, is expected to start as soon as early 2022. TSMC's 3nm tech is reportedly expected to be put into production in late 2022 -- meanwhile, Intel will be rolling out 7nm chips toward the end of 2022 and into 2023." Will Intel have a genie in the bottle or a rabbit in a hat? Doesn't seem so to me. On Tuesday, Intel unveiled a comeback plan designed to help it reclaim processor manufacturing leadership within four years.
Microsoft

Windows 11 Now Has Its First Beta Release (theverge.com) 49

Microsoft has released the first beta of Windows 11, available to those enrolled in its Windows Insider Program. From a report: Until today, getting access to Windows 11 meant installing the Dev preview, which Microsoft says is for "highly technical users" as it has "rough edges." According to Microsoft, the beta release is less volatile, with builds being validated by Microsoft (though it's still probably something you'll want to install on a test machine or second partition). Of course, to install the beta you'll need a compatible computer. Whether your hardware will work with the next version of Windows has been notoriously tricky to pin down, but Microsoft's article about preparing for Insider builds directs people to its system requirements page. The company has said that it will be paying close attention to how well 7th Gen Intel and AMD Zen 1 CPUs work during the testing period, so it's possible those systems could be allowed to run the beta but not the final release.
Power

Dell Is Cancelling Alienware Gaming PC Shipments To Several US States (pcgamer.com) 86

davide marney writes: Orders for Alienware Aurora R12 and R10 gaming PC configurations placed in California, Colorado, Hawaii, Oregon, Vermont, or Washington will not be honored because of power consumption regulations, reports PC Gamer. "Any orders placed that are bound for those states will be canceled," Dell states in a message.

"The Aurora R12 and R10 are built around the latest generation processors from Intel and AMD, the former featuring 11th Gen Core Rocket Lake CPUs and the latter wielding Ryzen 5000 series chips based on Zen 3," reports PC Gamer. "Unfortunately for both Dell and buyers who reside in affected states, the majority of Aurora R12 and R10 configurations consume more power than local regulations allow. There are exceptions, though [depending on the configuration you select]."
Intel

Intel Details Comeback Plan To Leapfrog Chipmaking Rivals by 2025 (cnet.com) 72

Intel unveiled on Tuesday a smorgasbord of new technologies designed to help it reclaim processor manufacturing leadership within four years. The plans bear the fingerprints of newly installed CEO Pat Gelsinger, who has pledged to restore the company's engineering leadership and credibility. From a report: The developments include a new push to improve the power usage of Intel chips, a key element of battery life, while simultaneously raising chip performance. The technologies involve deep redesigns to how processors are constructed.

One technology, RibbonFET, fundamentally redesigns the transistor circuitry at the heart of all processors. Another, PowerVia, reimagines how electrical power is delivered to those transistors. Lastly, Intel is updating its Foveros technology for packaging chip elements from different sources into dense stacks of computing horsepower. Intel's commitments, unveiled at an online press event, will mean faster laptops with longer battery life, if realized. And the advancements could boost technologies like artificial intelligence at cloud computing companies and speed up the services on mobile phone networks. "In 2025, we think we will regain that performance crown," Sanjay Natarajan, who rejoined Intel this year to lead the company's processor technology development, said in an interview.
Further reading: Intel's foundry roadmap lays out the post-nanometer "Angstrom" era.
AMD

Leaked Intel i9-12900K Benchmark Shows Gains Over the Ryzen 5950X (digitaltrends.com) 90

UnknowingFool writes: An engineering sample of Intel's next flagship processor, the i9-12900K, was shown to beat AMD's current flagship 5950X in Cinebench R20 by 18% in multi-core and 28% in single-core tests. The next generation of Intel processors is believed to use a hybrid big.LITTLE design in which 8 of its 16 cores are low-power cores and 8 are full-power cores. The low-power cores run only a single thread each, while the high-power cores can each run 2 threads. There is no official word on pricing or release date from Intel, but engineering samples and B600 motherboards are being sold in China for $1,250 and $1,150, respectively. According to leaker OneRaichu, the results for the 12900K were gathered using water-cooling and without overclocking, so it's possible the final score could be even higher. The rumors suggest the processor will come with 16 cores and 24 threads with a boost clock speed of up to 5.3GHz.
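
Those rumored figures are self-consistent (my own arithmetic, assuming the leaked configuration is accurate):

\[
8 \times 2 \;+\; 8 \times 1 \;=\; 24 \text{ threads across } 16 \text{ cores.}
\]
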
Transportation

Intel's Mobileye Begins Testing Autonomous Vehicles In New York City (theverge.com) 32

Mobileye, the company that specializes in chips for vision-based autonomous vehicles, is now testing its AVs in New York City -- a difficult and rare move given the state's restrictions around such testing. The Verge reports: The announcement was made by Amnon Shashua, president and CEO of the Intel-owned company, at an event in the city on Tuesday. Shashua said the company is currently testing two autonomous vehicles in New York City, but plans to increase that number to seven "in the next few months." New York City has some of the most dangerous, congested, and poorly managed streets in the world. They are also chock-full of construction workers, pedestrians, bicyclists, and double- and sometimes even triple-parked cars. In theory, this would make it very difficult for an autonomous vehicle to navigate, given that AVs typically rely on good weather, clear signage, and less aggressive driving from other road users for safe operation. But Shashua said this was part of the challenge in deciding where to test Mobileye's vehicles.

"I think for a human it's very, very challenging to drive in New York City," Shashua said, "not to mention for a robotic car." While other states have become hot beds for AV testing, New York has been a bit of a ghost town. Part of the reason could be the state's strict rules, which include mandating that safety drivers keep their hands on the wheel at all times and requiring state police escort at all times to be paid for by the testing company. A spokesperson for Mobileye says the company has obtained a permit from the state to test its vehicles on public roads and is currently the only AV testing permit holder in the state. The spokesperson also said that police escorts were no longer required.

Businesses

Intel Is In Talks To Buy GlobalFoundries For About $30 Billion (reuters.com) 57

New submitter labloke11 shares a report from The Wall Street Journal: Intel is exploring a deal to buy GlobalFoundries (source paywalled; alternative source), according to people familiar with the matter, in a move that would turbocharge the semiconductor giant's plans to make more chips for other tech companies and rank as its largest acquisition ever. A deal could value GlobalFoundries at around $30 billion, the people said. It isn't guaranteed one will come together, and GlobalFoundries could proceed with a planned initial public offering. GlobalFoundries is owned by Mubadala Investment Co., an investment arm of the Abu Dhabi government, but is based in the U.S. Any talks don't appear to include GlobalFoundries itself, as a spokeswoman for the company said it isn't in discussions with Intel.

Intel's new Chief Executive, Pat Gelsinger, in March said the company would launch a major push to become a chip manufacturer for others, a market dominated by Taiwan Semiconductor Manufacturing Co. Intel, with a market value of around $225 billion, this year pledged more than $20 billion in investments to expand chip-making facilities in the U.S. and Mr. Gelsinger has said more commitments domestically and abroad are in the works.

Intel

How Intel Financialized and Lost Leadership in Semiconductor Fabrication (ineteconomics.org) 119

William Lazonick and Matt Hopkins, writing at the Institute for New Economic Thinking: Why has Intel fallen behind TSMC and SEC in semiconductor fabrication, and why is it unlikely to catch up? The problem is that Intel is engaged in two types of competition: one with companies like TSMC and SEC in cutting-edge fabrication technology, and the other within Intel itself between innovation and financialization. The Asian companies have governance structures that vaccinate them against an economic virus known as "maximizing shareholder value" (MSV). Intel caught the virus over two decades ago. As we shall see, with the sudden appointment of Gelsinger as CEO this past winter, Intel sent out a weak signal that it recognizes that it has the disease.

In the years 2011-2015, Intel was in the running, along with TSMC and SEC, to be the fabricator of the iPhone, iPad, and iPod chips that Apple designed. While Intel spent $50b. on plant and equipment (P&E) and $53b. on R&D over those five years, it also lavished shareholders with $36b. in stock buybacks and $22b. in cash dividends, which together absorbed 102% of Intel's net income. From 2016 through 2020, Intel spent $67b. on P&E and $66b. on R&D, but also distributed almost $27b. as dividends and another $45b. as buybacks. Intel's ample dividends have provided an income yield to shareholders for, as the name says, holding Intel shares. In contrast, the funds spent on buybacks have rewarded sharesellers, including senior Intel executives with their stock-based pay, for executing well-timed sales of their Intel shares to realize gains from buyback-manipulated stock prices.
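
As a rough cross-check of those figures (back-of-the-envelope arithmetic from the numbers quoted above, not a figure given in the article), the 2011-2015 distributions and the stated 102% payout ratio imply:

\[
\$36\text{b} + \$22\text{b} = \$58\text{b}, \qquad \frac{\$58\text{b}}{1.02} \approx \$57\text{b} \text{ of net income over the period 2011-2015.}
\]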

Intel

Intel Continues To Rehire Veterans: At Some Point They'll Run Out (anandtech.com) 34

Intel has rehired 28-year veteran Shlomit Weiss into the position of Senior VP and Co-General Manager of Intel's Design Engineering Group (DEG), a position recently vacated by Uri Frank, who left to head up Google's SoC development. "Weiss is the latest in an ever-growing list of 're-hiring' Intel veterans, which leads to the problem that at some point Intel will run out of ex-employees to rehire and instead nurture internal talent for those roles," writes Dr. Ian Cutress via AnandTech. From the report: As reported in Tom's Hardware and confirmed in her own LinkedIn announcement, Weiss will be working at Intel's Israel design center alongside Sunil Shenoy and is "committed to ensuring that the company continues to lead in developing chips." [...] In her first 28-year stint at Intel, Weiss is reported to have led the team that developed both Intel Sandy Bridge and Intel Skylake, arguably two of the company's most important processor families of the last decade: Sandy Bridge reaffirmed Intel's lead in the market with a new base microarchitecture and continues, in its sixth-plus generation, in Comet Lake today, while Skylake has been Intel's most profitable microarchitecture ever. Weiss also received Intel's Achievement Award, the company's highest honor, but is not listed as an Intel Fellow, while CRN reports that Weiss also founded the Intel Israel Women Forum in 2014. Weiss left Intel in September 2017 to join Mellanox/NVIDIA, where she held the role of Senior VP of Silicon Engineering and ran the company's networking chip design group. In her new role at Intel, Tom's is reporting that Weiss will lead all of Intel's consumer chip development and design, while the other Co-GM of Intel DEG, Sunil Shenoy, will lead the data center design initiatives. AnandTech goes on to note that Intel has made 12 notable senior hires since Dec. 20th of last year. "Of these named hires (plenty of other people hired below the role of VP), seven are listed as ex-Intel employees being rehired into the company, mostly into engineering-focused positions," writes Cutress. He continues: It should be noted, however, that the number of engineers Intel could rehire is limited -- going after key personnel critical to Intel's growth over the last few decades, despite their lists of successful products and accolades, can't be the be-all and end-all of Intel's next decade of growth. If we're strictly adhering to typical retirement ages as well, a number of them will reach retirement within the next ten years. Intel can't keep rehiring veteran talent into key positions to get to the next phase in its product evolution -- at some level it has to reignite the initial passion from within.

[I]f Intel is having to rehire those who enabled former glory for the company, one has to wonder exactly what is going on such that talent already within the company isn't stepping up. At some point these veterans will retire, and Intel will be at a crossroads. In a recent interview with former Intel SVP Jim Keller, he stated that (paraphrased) "building a chip design team at a company depends on volume -- you hire in if you don't have the right people, but if you have a team of 1,000, then there are people there and it's a case of finding the right ones." In a company of 110,000 employees, it seems odd that Intel feels it has to rehire to fill those key roles. Some might wonder whether those rehires would ever have left had Intel's brain drain not occurred in the first place, but it is an interesting question nonetheless.

Hardware

Qualcomm's New CEO Eyes Dominance in the Laptop Markets (reuters.com) 28

Qualcomm's new chief thinks that by next year his company will have just the chip for laptop makers wondering how they can compete with Apple, which last year introduced laptops using a custom-designed central processor chip that boasts longer battery life. From a report: Longtime processor suppliers Intel and Advanced Micro Devices have no chips as energy efficient as Apple's. Qualcomm Chief Executive Cristiano Amon told Reuters on Thursday he believes his company can have the best chip on the market, with help from a team of chip architects who formerly worked on the Apple chip but now work at Qualcomm. In his first interview since taking the top job at San Diego, California-based Qualcomm, Amon also said the company is counting on revenue growth from China to power its core smartphone chip business despite political tensions. "We will go big in China," he said, noting that U.S. sanctions on Huawei give Qualcomm an opportunity to generate a lot more revenue.
Data Storage

Intel's New Optane SSD P5800X Is the Fastest SSD Drive Ever Made (hothardware.com) 24

MojoKid writes: Intel recently shifted its storage strategy somewhat and is now offering its flagship Optane SSD P5800X, which was formerly targeted solely at data centers, to workstation users. The Optane SSD P5800X is based on a proprietary PCIe Gen 4 x4 native controller and it features Intel's second-generation Optane memory. In terms of performance, in some of the first benchmark numbers to hit the web, the drive is an absolute beast in the workloads that matter most for the vast majority of workstation users and enthusiasts. Random reads and writes are exceptionally good, and access times at low queue depths are best-in-class. The Optane SSD P5800X's sequential transfers, while strong, aren't quite on the same level as some of today's fastest NAND-based PCIe 4 solid state drives, but they do exceed 7GB/s, which is still extremely fast. Overall, it's essentially the fastest SSD ever made. Endurance is off the charts too. All of that SSD horsepower comes at a price, however: a little over $2.50 per gigabyte and over $2,000 for an 800GB drive. With capacities of 400GB, 800GB and 1.6TB, the new Intel Optane SSD P5800X is shipping and available now.
OS X

Apple Makes OS X Lion and Mountain Lion Free To Download (macrumors.com) 47

Mac OS X Lion and OS X Mountain Lion can now be downloaded for free from Apple's website. "Apple has kept OS X 10.7 Lion and OS X 10.8 Mountain Lion available for customers who have machines limited to the older software, but until recently, Apple was charging $19.99 to get download codes for the updates," notes MacRumors. "The $19.99 fee dates back to when Apple used to charge for Mac updates. Apple began making Mac updates free with the launch of OS X 10.9 Mavericks, which also marked the shift from big cat names to California landmark names." From the report: Mac OS X Lion is compatible with Macs that have an Intel Core 2 Duo, Core i3, Core i5, Core i7, or Xeon processor, a minimum of 2GB RAM, and 7GB storage space. Mac OS X Mountain Lion is compatible with the following Macs: iMac (Mid 2007-2020), MacBook (Late 2008 Aluminum, or Early 2009 or newer), MacBook Pro (Mid/Late 2007 or newer), MacBook Air (Late 2008 or newer), Mac mini (Early 2009 or newer), Mac Pro (Early 2008 or newer), and Xserve (Early 2009). Macs that shipped with Mac OS X Mavericks or later are not compatible with the installer, however.
Intel

Intel Delays Sapphire Rapids Xeon CPU Production To Q1 2022 (crn.com) 29

Intel has delayed production of its next-generation Xeon Scalable CPUs, code-named Sapphire Rapids, to the first quarter of 2022 and said it will start ramping shipments by at least April of next year. From a report: The Santa Clara, Calif.-based company disclosed the delay in a Tuesday blog post by Lisa Spelman, head of Intel's Xeon and Memory Group, who teased the CPU's new microarchitecture as well as two features that will be new to the Xeon lineup: the next generation of Deep Learning Boost and an acceleration engine called Intel Data Streaming Accelerator. Spelman said Intel is delaying Sapphire Rapids, the 10-nanometer successor to the recently launched Ice Lake server processors, because of extra time needed to validate the CPU.
