An anonymous reader writes: Earlier this month AMD released the air-cooled Radeon R9 Fury graphics card with Fury X-like performance, but the big caveat is that this bold performance is only to be found on Windows. Testing the R9 Fury on Linux revealed that the Catalyst driver delivers devastatingly low performance for this graphics card. In OpenGL Linux games, the R9 Fury performed somewhere between a GeForce GTX 960 and a GTX 970 — cards retailing for around $200 and $350, respectively. The only workloads where the AMD R9 Fury performed as expected under Linux were the Unigine Valley tech demo and OpenCL compute tests. There is also no open-source driver support yet for the AMD R9 Fury.
An anonymous reader writes: An NVIDIA SHIELD Android TV modified to run Ubuntu Linux is providing interesting data on how NVIDIA's latest "Tegra X1" 64-bit ARM big.LITTLE SoC compares to various Intel/AMD/MIPS systems of varying form factors. Tegra X1 benchmarks on Ubuntu show strong performance from the X1 SoC in this $200 Android TV device: it beats out low-power Intel Atom/Celeron Bay Trail SoCs and AMD AM1 APUs, and in some workloads it even gets close to an Intel Core i3 "Broadwell" NUC. The Tegra X1 features Maxwell "GM20B" graphics, and total power consumption is less than 10 Watts.
bigwophh writes: 14nm Broadwell processors weren't originally destined for the channel, but Intel ultimately changed course and recently launched a handful of 5th Generation Core processors based on the microarchitecture, the most powerful of which is the Core i7-5775C. Unlike all of the mobile Broadwell processors that came before it, the Core i7-5775C is a socketed LGA processor for desktops, just like 4th Generation Core processors based on Haswell. In fact, it'll work in the very same 9-Series chipset motherboards currently available (after a BIOS update). The Core i7-5775C, however, features a 128MB eDRAM cache and integrated Iris Pro 6200 series graphics, which can boost graphics performance significantly. Testing shows that the Core i7-5775C's lower CPU core clocks limit its performance versus Haswell, but its Iris Pro graphics engine is clearly more powerful.
Mark Wilson writes: One of the features that has been removed from Windows 10 — at least for home users — is the ability to pick and choose when updates are installed. Microsoft has taken Windows Update out of the hands of users so the process is, for the most part, completely automated. In theory, this sounds great — no more worrying about having the latest patches installed, no more concerns that a machine that hasn't been updated will cause problems for others — but an issue with NVIDIA drivers shows that there is potential for things to go wrong. Irate owners of NVIDIA graphics cards have taken to support forums to complain that automatically installed drivers have broken their computers.
An anonymous reader writes: The upcoming Linux 4.2 kernel will see the premiere of the new "AMDGPU" kernel driver, the successor to the "Radeon" DRM kernel driver and part of AMD's long-discussed new Linux driver architecture for supporting the very latest GPUs and all future GPUs. Unfortunately for AMD customers, there's still much waiting to do. The new open-source AMDGPU Linux code works for Tonga/Carrizo GPUs, but it doesn't yet support the latest R9 Fury "Fiji" GPUs, it lacks re-clocking/DPM for Tonga GPUs (leading to low performance), and there are stability issues under high-load OpenGL apps/games. There's also the matter that, for now, Linux users need to jump through hoops to get the code into a working state: the latest kernel plus forked versions of Mesa and libdrm, new proprietary microcode files, and the new xf86-video-amdgpu user-space driver.
jones_supa writes: Twelve years ago, David White sat down over a weekend and created the small pet project that we know today as the open source strategy game The Battle for Wesnoth. At the time, Dave was the sole programmer, working alongside Francisco Muñoz, who produced the first graphics. As more and more people contributed, the game grew from a tiny personal project into an extensive one encompassing hundreds of contributors. Today, however, the ship is sinking, and the project is asking for help to keep things rolling. Especially requested are C++, Python, and gameplay (WML) programmers. Any willing volunteers should have good communication skills and preferably be experienced with working alongside fellow members of a large project. More details can be found at the project website.
New submitter samtuke writes: AMD processors get rated and reviewed based on performance, so it is in our self-interest to make things work really, really fast on AMD hardware. AMD engineers contribute to LibreOffice for good reason. Think about what happens behind a spreadsheet calculation: there can be a huge amount of math. Writing software that takes advantage of a Graphics Processing Unit (GPU) for general-purpose computing is non-trivial, and we know how to do it. AMD engineers wrote OpenCL kernels and contributed them to the open source code base. Turning on the OpenCL option to enable GPU compute resulted in a 500X+ speedup: about ¼ second vs. 2 minutes, 21 seconds. Those measurements come specifically from the ground-water use sample in this set of LibreOffice spreadsheets.
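None of AMD's actual kernel code is reproduced here, but the structure that makes spreadsheet math a good GPU fit is easy to sketch. In this illustrative Python toy (all names hypothetical, standard library only), each output cell depends only on the same row of its inputs — exactly the independence an OpenCL kernel exploits by running every row as a separate work-item:

```python
# Toy sketch (not AMD's code) of the data-parallel shape of spreadsheet math.

def cell_formula(rate, volume):
    """Per-cell work: analogous to the body of an OpenCL kernel."""
    return rate * volume * 0.85

def compute_column(rates, volumes):
    # On a GPU, each loop iteration would be one work-item executing in
    # parallel, since no cell's result depends on any other cell's result.
    return [cell_formula(r, v) for r, v in zip(rates, volumes)]

rates = [1.0, 2.0, 3.0]
volumes = [10.0, 10.0, 10.0]
print(compute_column(rates, volumes))  # [8.5, 17.0, 25.5]
```

With hundreds of thousands of rows, the per-cell arithmetic dwarfs the dispatch overhead, which is how a GPU port can plausibly turn minutes of serial recalculation into a fraction of a second.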
An anonymous reader writes: The Mesa 3D project that is the basis of the open-source Linux/BSD graphics drivers now supports OpenGL 4.0 and most of OpenGL 4.1~4.2. The OpenGL 4.0 enablement code landed in Mesa Git yesterday/today and more GL 4.1/4.2 patches are currently being reviewed for the Intel, Radeon, and Nouveau open-source GPU drivers.
An anonymous reader writes: LibreOffice has lost its X11 dependency on Linux and can now run smoothly under Wayland. LibreOffice has been ported to Wayland by adding GTK3 toolkit support to the office suite over the past few months. LibreOffice on Wayland is now in good enough shape that the tracker bug has been closed, and it should work as well as under X11 except for a few remaining bugs. LibreOffice 5.0 will be released next month with this support and other changes outlined in the 5.0 release notes.
An anonymous reader writes: In past years the AMD Catalyst Linux driver has yielded better performance if an executable was named "doom3.x86" or "compiz" (among other choices), but this application-profile concept has become even more absurd as more games come to Linux while AMD fails to keep its Linux application profile database maintained. The latest example: ~40% better performance from renaming Counter-Strike: Global Offensive on Linux. Rename the "csgo_linux" binary to "hl2_linux" — Half-Life 2's binary name — within Steam, and frame-rates suddenly increase across the board. This is with the latest Catalyst 15.7 Linux driver, even though CS:GO has been on Linux for nearly a year. Should driver developers re-evaluate their optimization practices for Linux?
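For illustration, here is a minimal sketch of the renaming trick users reported, done against a throwaway demo directory rather than a real Steam library (the paths and file here are stand-ins, not a real game install):

```python
# Illustrative sketch only -- operates on a demo directory, not a real
# Steam library under steamapps/common.
import os
import tempfile

# Stand-in for the CS:GO install directory and its (empty) binary.
game_dir = os.path.join(tempfile.mkdtemp(), "csgo-demo")
os.makedirs(game_dir)
open(os.path.join(game_dir, "csgo_linux"), "w").close()

# The reported trick: give the binary Half-Life 2's executable name, so the
# driver applies its better-tuned "hl2_linux" application profile.
os.rename(os.path.join(game_dir, "csgo_linux"),
          os.path.join(game_dir, "hl2_linux"))

print(sorted(os.listdir(game_dir)))  # ['hl2_linux']
```

In practice users then pointed the game's launch command at the renamed binary; the underlying point is that the driver keys its optimizations to the executable's name rather than to what the running application actually does.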
AmiMoJo writes: Some users have noticed that the Japanese character "no" (の) is rendered in a different font from the surrounding text in certain applications. The character is extremely common in Japanese, forming parts of many words, and on its own means something similar to the English word "of." The Unicode standard has apparently marked the character as sometimes being used in mathematical formulae, which causes some applications to render it in a distinct font. Similar but more widespread issues have plagued Unicode for decades due to the decision to unify dissimilar characters in Chinese, Japanese, and Korean.
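The character's basic Unicode properties are easy to inspect with Python's standard unicodedata module. Note that this shows its name and general category only; the math-usage annotation the summary refers to reportedly lives in separate Unicode data files that unicodedata does not expose:

```python
# Inspecting U+306E, the hiragana character "no".
import unicodedata

no = "\u306e"  # の
print(unicodedata.name(no))      # HIRAGANA LETTER NO
print(unicodedata.category(no))  # Lo -- "Letter, other": an ordinary letter
```

As far as the general category is concerned, の is a plain letter like any other, which is why only rendering paths that consult the math-class data end up treating it specially.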
jones_supa writes: Now that Renderman has been available for free for non-commercial use for a while, there have been many requests for integration with Blender. An initiative spearheaded by Pixar now presents the first Blender-to-Renderman plugin. With the release of PRMan 20, a small group of developers headed by Brian Savery of Pixar has been working on support for using Renderman and Blender together. The plugin is still in early alpha but has seen many great developments in the last few weeks. The source code is available on GitHub.
MojoKid writes: Mobile workstation notebooks typically offer a fair degree of performance, but usually at the expense of battery life. It comes with the territory for machines configured with higher-end processors, discrete graphics chips, and high-end displays that take more power to light up. Lenovo, however, seems to have found a way to strike a better balance with its new ThinkPad W550s, which comes equipped with an Intel Core i7-5600U CPU, an NVIDIA Quadro K620M GPU, and a 15.5-inch IPS display with a 2880x1620 native resolution. With that kind of horsepower and that many pixels to push, you would think untethered up-time wouldn't be its strong suit, but Lenovo configured a snap-in extended battery for the W550s. The 6-cell extended battery, in combination with the 3-cell internal battery, was able to power the machine for over 18 hours of light-duty web browsing in real-world testing (Lenovo claims up to 20 hours of battery life). The machine also lasted over five hours under heavy-load Battery Eater testing, and the extended battery is unobtrusive, tilting the keyboard up slightly toward the user while keeping well inside the machine's footprint.
An anonymous reader writes: An article written by Kyle Orland looks at how the nascent virtual reality industry will handle openness — in terms of standards, platforms, source code, and development. "Whether any single VR platform is 'open' or not, though, may be moot if developers have to juggle countless slightly different development standards for countless slightly different VR platforms. In a way, making a PC game that only works on the Oculus Rift is as ridiculous as making a PC game that only works on Dell monitors." Right now, the major players in VR tech are using different approaches. Oculus is distributing a closed-license SDK. Valve is setting up a more open platform that lets multiple manufacturers build devices for it. The downside is that it doesn't seem to work as well, particularly with Oculus hardware. Oculus founder Palmer Luckey says standards are going to take time and cooperation. Of course, that tune may change when devices start hitting the market.
MojoKid writes: When AMD launched the liquid-cooled Radeon Fury X, it was obvious the company was willing to commit to a new architecture and bleeding-edge technologies (Fiji and High-Bandwidth Memory, respectively). However, the card fell shy of the mark enthusiasts hoped it would achieve, unable to quite deliver a definitive victory over NVIDIA's GeForce GTX 980 Ti. Now AMD has launched the Radeon R9 Fury (no "X," and sometimes referred to as "Fury Air"), a graphics card that brings a more compelling value proposition to the table. It's the Fury release that should give AMD a competitive edge against NVIDIA in the $500+ graphics card bracket. The Radeon R9 Fury's basic specs are mostly identical to those of the liquid-cooled flagship Fury X, with two important distinctions: a 50MHz reduction in GPU clock speed to 1000MHz, and 512 fewer stream processors for a total of 3584, versus what the Fury X has on board. Here's the interesting news the benchmark results demonstrate: in price the Fury veers closer to the NVIDIA GeForce GTX 980, but in performance it sneaks in awfully close to the GTX 980 Ti.
Ars Technica reviews the newest release from Linux Mint — version 17.2, offered with either the Cinnamon desktop or the lighter-weight MATE, which feels like what Gnome 2 might have become in an alternate universe where Gnome 3 never happened. Reviewer Scott Gilbertson has mostly good things to say about either variety, and notes a few small drawbacks, too. The nits seem to be minor ones, though they might bite some people more than others: Mint, based on Ubuntu deep down, is almost perfectly compatible with Ubuntu packages, but not every one, and this newest version of Mint ships with the 3.16 kernel of Ubuntu 14.04, which means slightly less advanced hardware support. (Gilbertson notes, though, that going with 3.16 means Mint may be the ideal distro if you want to avoid systemd.) "This release sees the Cinnamon developers focusing on some of what are sometimes called 'paper cut' fixes, which just means there's been a lot of attention to the details, particularly the small but annoying problems. For example, this release adds a new panel applet called 'inhibit' which temporarily bans all notifications. It also turns off screen locking and stops any auto dimming you have set up, making it a great tool for when you want to watch a video or play a game." More "paper cut" fixes include improved multi-panel options, graphics-refresh tweaks, a way to restart the Cinnamon desktop without killing the contents of a session, and other speed-ups that make this release "noticeably snappier than its predecessor on the same hardware."
MojoKid writes: You might never have thought much about a wig on a mannequin before, but after checking out NVIDIA's latest tech demo, any gamer or 3D graphics artist may find hair pretty interesting. The video shows NVIDIA HairWorks 1.1, a simulation and rendering tool for creating lifelike hair and fur in video games. In the clip, NVIDIA shows off a Fabio-style hairdo with about 500,000 hairs that bounce and sway as the camera circles and forces move the hair. If this were a real wig, it might unseat one of the most boring videos ever. As an example of what modern 3D graphics can do with hair physics, however, it's pretty darn cool. Previous demos of HairWorks showed up to 22,000 strands of hair, making the jump to half a million all the more significant. The video was recorded with ShadowPlay on a GeForce GTX 980, which has some serious muscle, though it's not the most powerful card in NVIDIA's lineup. What's cooler than making lifelike human hair? Putting flowing manes on vicious monsters, of course. Apparently, NVIDIA HairWorks simulation technology also plays a role in bringing more than a dozen creatures to life in The Witcher 3: Wild Hunt.
Deathspawner writes: Following up on the release of 12GB and 16GB FirePro compute cards last fall, AMD has just announced a brand-new top-end model: the 32GB FirePro S9170. Targeted at DGEMM computation, the S9170 sets a new record for GPU memory on a single card, and does so without resorting to a dual-GPU design. Architecturally, the S9170 is similar to the S9150, but it's clocked a bit faster and is set to cost about the same, between $3,000 and $4,000. While AMD's recent desktop Radeon launch might have left a bit to be desired, the company has proven with the S9170 that it's still able to push boundaries.