
The Tamagotchi Singularity Made Real: Infinite Tamagotchi Living On the Internet (hackaday.com) 82

szczys writes: Everyone loves Tamagotchi, the little electronic keychains spawned in the '90s that let you raise digital pets. Some time ago, XKCD made a quip about an internet-based matrix of thousands of these digital entities. That quip is now a reality thanks to elite hardware hacker Jeroen Domburg (aka Sprite_TM). In his recent talk at the Hackaday SuperConference, called "The Tamagotchi Singularity," he revealed that he had built an infinite network of virtual Tamagotchi by implementing the original hardware as a virtual machine. This included developing AI to keep them happy and a protocol to emulate their IR interactions. But he went even further, hacking an original keychain to serve as a wireless console that can look in on any of the virtual Tamagotchi living on his underground network. This full-stack effort is unparalleled in complexity, speed of implementation, and awesome factor, and it will surely spark legions of other Tamagotchi Matrices.
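The talk summary above doesn't include source, but the core idea, implementing the original hardware as a virtual machine, boils down to a fetch-decode-execute loop. A minimal sketch in Python, using a hypothetical three-instruction toy machine rather than the Tamagotchi's actual 4-bit CPU:

```python
# Toy fetch-decode-execute loop illustrating the hardware-as-VM idea.
# The instruction set here is invented for illustration; it is NOT the
# real Tamagotchi CPU.

def run(program, steps=100):
    """Interpret a list of (opcode, operand) tuples; return the accumulator."""
    acc = 0  # accumulator register
    pc = 0   # program counter
    for _ in range(steps):          # step budget stands in for a clock
        if pc >= len(program):
            break
        op, arg = program[pc]       # fetch + decode
        if op == "LOAD":            # load immediate into accumulator
            acc = arg
        elif op == "ADD":           # add immediate to accumulator
            acc += arg
        elif op == "JZ":            # jump to absolute address if acc == 0
            if acc == 0:
                pc = arg
                continue
        pc += 1
    return acc

# Example: compute 2 + 3
print(run([("LOAD", 2), ("ADD", 3)]))  # prints 5
```

Running thousands of such interpreter instances, each with its own register and memory state, is cheap, which is what makes an "infinite" matrix of virtual pets feasible on modest hardware.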

Book Review: The Network Security Test Lab: A Step-by-Step Guide 19

benrothke writes: It wasn't that long ago that building a full network security test lab was an expensive proposition. In The Network Security Test Lab: A Step-by-Step Guide, author Michael Gregg has written a helpful hands-on guide that provides the reader with an economical way to do it. The book is a step-by-step guide to creating a security network lab and to using some of the most popular security and hacking tools. Read below for the rest of Ben's review.

Ask Slashdot: Advice On Enterprise Architect Position 198

dave562 writes: I could use some advice from the community. I have almost 20 years of IT experience, five of them with the company I am currently working for. In my current position, the infrastructure and applications that I am responsible for account for nearly 80% of the company's entire IT infrastructure. In broad strokes, our footprint is roughly 60 physical hosts that run close to 1,500 VMs, and a SAN that hosts almost 4 PB of data. The organization is a moderately sized (~3,000 employees), publicly traded company with a nearly $1 billion market value (recent fluctuations notwithstanding).

I have been involved in a constant struggle with the core IT group over how best to run operations. They are a traditional, internal-facing IT shop. They have stumbled through a private cloud initiative that is only about 30% realized. I have had to drag them kicking and screaming into the world of automated provisioning, IaaS, application performance monitoring, and all of the other IT "must haves" that a reasonable person would expect from a company of our size. All the while, I have never had full access to the infrastructure. I do not have access to the storage. I do not have access to the virtualization layer. I do not have Domain Admin rights. I cannot see the network.

The entire organization has been hamstrung by an "enterprise architect" who relies on consultants to get the job done, but does not have the capability to properly scope the projects. This has resulted in failure after failure and a broken trail of partially implemented projects. (VMware without SRM enabled. EMC storage hardware without automated tiering enabled. Numerous proof-of-concept systems that never made it into production because they were not scoped properly.)

After five years of succeeding in the face of all of these challenges, the organization has offered me the Enterprise Architect position. However, they do not think that the position should have full access to the environment. It is an "architecture" position and not a "sysadmin" position, is how they explained it to me. That seems insane. It is like asking someone to draw a map without being able to visit the place that needs to be mapped.

For those of you in the community who have similar positions, what is your experience? Do you have unfettered access to the environment? Are purely architectural / advisory roles the norm at this level?

Revisiting How Much RAM Is Enough Today For Desktop Computing 350

jjslash writes: An article at TechSpot tests how much RAM you need for regular desktop computing and how it affects performance in apps and games. As it turns out, there's not much benefit to going beyond 8 GB for regular programs, and surprisingly, 4 GB still seems to be enough for gaming in most cases. RAM is cheap these days, and the testers had to resort to absurdly unrealistic settings to simulate high memory demand outside of virtualization, but it's a good read to confirm our judgment calls on what is enough for most in 2015.

Ask Slashdot: Switching To a GNU/Linux Distribution For a Webdesign School 233

spadadot writes: I manage a rapidly growing webdesign school in France with 90 computers for our students, spread across several locations. By the end of the year it will amount to 200. Currently they all run Windows 8, but we would love to switch to a GNU/Linux distribution (free software, easier to deploy/maintain, and lower licensing costs). The only thing preventing us is Adobe Photoshop, which is only needed for a small amount of work. The curriculum is highly focused on coding skills (HTML, CSS, JavaScript, PHP/MySQL), but we still need to teach our students how to extract images from a PSD template. The industry format for graphic designs is PSD, so The Gimp (XCF) is not really an option. Running a Windows VM on every workstation would be hard to set up (we redeploy all our PCs every 3 months) and just as costly as the current setup. Every classroom has at least a 20 Mbit/s down, 1 Mbit/s up ADSL connection, so maybe setting up a centralized virtualization server would work? How many Windows/Photoshop licenses would we need then? Anything else Slashdot would recommend?
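On the narrow question of extracting images from a PSD on Linux, one option worth evaluating is doing it programmatically. A sketch using the third-party psd-tools package (not mentioned in the question; the API has changed between versions, with older releases using `PSDImage.load()` and `layer.as_PIL()`, so treat this as a starting point):

```python
# Sketch: export each visible layer of a PSD to PNG using the third-party
# psd-tools package (pip install psd-tools). API names are version-dependent.
import os


def extract_layers(psd_path, out_dir):
    """Save every visible top-level layer of psd_path as a PNG in out_dir."""
    # Imported inside the function so the sketch loads without the package.
    from psd_tools import PSDImage

    os.makedirs(out_dir, exist_ok=True)
    psd = PSDImage.open(psd_path)
    for i, layer in enumerate(psd):
        if not layer.is_visible():
            continue
        image = layer.composite()  # render the layer to a PIL image
        if image is not None:
            image.save(os.path.join(out_dir, f"{i:02d}_{layer.name}.png"))


# Example (hypothetical file name):
# extract_layers("template.psd", "exported")
```

ImageMagick can also rasterize PSD layers from the command line, and GIMP itself opens PSD files (it is the round-trip back to PSD that is lossy), so the fidelity required for the coursework should drive the choice.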
Open Source

What Goes Into a Decision To Take Software From Proprietary To Open Source 45

Lemeowski writes: It's not often that you get to glimpse behind the curtain and see what led a proprietary software company to open source its software. Last year, the networking software company Midokura made a strategic decision to open source its network virtualization platform MidoNet, to address fragmentation in the networking industry. In this interview, Midokura CEO and CTO Dan Mihai Dumitriu explains the company's decision to give away four years of engineering to the open source community, how it changed the way its engineers worked, and the lessons learned along the way. Among the challenges was helping engineers overcome the culture change of broadcasting their work to a broader community.
Emulation (Games)

Emulator Now Runs x86 Apps On All Raspberry Pi Models 82

DeviceGuru writes: Russia-based Eltechs announced its ExaGear Desktop virtual machine last August, enabling Linux/ARMv7 SBCs and mini-PCs to run x86 software. That meant that users of the quad-core, Cortex-A7-based Raspberry Pi 2 Model B could use it as well, although the software was not yet optimized for it. Now Eltechs has extended ExaGear to support earlier ARMv6 versions of the Raspberry Pi. The company also optimized the emulator for the Pi 2, allowing, for example, Pi 2 users to use automatically forwarding startup scripts.

Google Offers Cheap Cloud Computing For Low-Priority Tasks 59

jfruh writes: Much of the history of computing products and services involves getting people desperate for better performance and faster results to pay a premium for what they want. But Google has a new beta service going in the other direction, offering cheap cloud computing for customers who don't mind waiting. Jobs like data analytics, genomics, and simulation and modeling can require lots of computational power, but they can run periodically, can be interrupted, and can even keep going if one or more of the nodes they're using go offline.
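The property that makes a workload a fit for this kind of low-priority capacity is restartability: if progress is checkpointed, a replacement node can resume where a preempted one left off. A minimal sketch of that pattern (file name and work function are illustrative, not tied to any Google API):

```python
# Sketch of an interruptible batch job suited to low-priority instances:
# progress is checkpointed after every item, so a restarted process resumes
# rather than redoing completed work. Checkpoint file name is illustrative.
import json
import os

CHECKPOINT = "progress.json"


def load_checkpoint():
    """Return the index of the next unprocessed item (0 on a fresh start)."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)["next_item"]
    return 0


def save_checkpoint(next_item):
    with open(CHECKPOINT, "w") as f:
        json.dump({"next_item": next_item}, f)


def process(item):
    return item * item  # placeholder for the real analytics work


def run_job(items):
    """Process items from the last checkpoint onward; survives preemption."""
    start = load_checkpoint()
    results = []
    for i in range(start, len(items)):
        results.append(process(items[i]))
        save_checkpoint(i + 1)  # the job can now be killed at any point
    return results
```

Real jobs would checkpoint to durable storage outside the node (object storage, a database) rather than local disk, since the preempted machine itself may disappear.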

'Venom' Security Vulnerability Threatens Most Datacenters 95

An anonymous reader sends a report about a new vulnerability found in open source virtualization software QEMU, which is run on hardware in datacenters around the world (CVE-2015-3456). "The cause is a widely-ignored, legacy virtual floppy disk controller that, if sent specially crafted code, can crash the entire hypervisor. That can allow a hacker to break out of their own virtual machine to access other machines — including those owned by other people or companies." The vulnerable code is used in Xen, KVM, and VirtualBox, while VMware, Hyper-V, and Bochs are unaffected. "Dan Kaminsky, a veteran security expert and researcher, said in an email that the bug went unnoticed for more than a decade because almost nobody looked at the legacy disk drive system, which happens to be in almost every virtualization software." The vulnerability has been dubbed "Venom," for "Virtualized Environment Neglected Operations Manipulation."

Microsoft Announces Device Guard For Windows 10 190

jones_supa writes: Microsoft has announced a new feature for Windows 10 called Device Guard, which aims to give administrators full control over what software can or cannot be installed on a device. "It provides better security against malware and zero days for Windows 10 by blocking anything other than trusted apps—which are apps that are signed by specific software vendors, the Windows Store, or even your own organization. ... To help protect users from malware, when an app is executed, Windows makes a determination on whether that app is trustworthy, and notifies the user if it is not. Device Guard can use hardware technology and virtualization to isolate that decision making function from the rest of the Windows operating system, which helps provide protection from attackers or malware that have managed to gain full system privilege." It's intended to be used in conjunction with traditional anti-virus, not as a replacement.

For Boot Camp Users, New Macs Require Windows 8 Or Newer 209

For anyone using Windows 7 by way of Apple's Boot Camp utility, beware: support for Windows via Boot Camp remains, but for the newest Apple laptops, it's only for Windows 8 for now. From Slashgear: This applies to the 2015 MacBook Air, and the 13-inch model of the 2015 MacBook Pro. Windows 8 will remain compatible, as will the forthcoming Windows 10. The 2013 Mac Pro also dropped Boot Camp support for Windows 7, while 2014 iMacs are still compatible, along with 2014 MacBook Airs and 2014 MacBook Pros. For those who still prefer to run Windows 7 on their Macs, there are other options. This change to Boot Camp will not affect using the Microsoft operating system through virtualization software, such as Parallels and VMware Fusion. Also at PC Mag.

Red Hat Strips Down For Docker 44

angry tapir writes: Reacting to the surging popularity of the Docker virtualization technology, Red Hat has customized a version of its Linux distribution to run Docker containers. The Red Hat Enterprise Linux 7 Atomic Host strips away all the utilities residing in the stock distribution of Red Hat Enterprise Linux (RHEL) that aren't needed to run Docker containers. Removing unneeded components saves on storage space, and reduces the time needed for updating and booting up. It also provides fewer potential entry points for attackers. (Product page is here.)

VirtualBox Development At a Standstill 288

jones_supa writes: Phoronix notes how it has been a long time since last hearing of any major innovations or improvements to VirtualBox, the virtual machine software managed by Oracle. This comes while VMware is improving its products on all platforms, and KVM, Xen, Virt-Manager, and related Linux virtualization technologies continue to advance as well. Is there any hope left for a revitalized VirtualBox? It has been said that there are only four paid developers left on the VirtualBox team at the company, which is not enough manpower to significantly advance such a complex piece of software. The v4.3 series has been receiving some maintenance updates during the last two years, but that's about it.

The Legacy of CPU Features Since the 1980s 180

jones_supa writes: David Albert asked the following question:

"My mental model of CPUs is stuck in the 1980s: basically boxes that do arithmetic, logic, bit twiddling and shifting, and loading and storing things in memory. I'm vaguely aware of various newer developments like vector instructions (SIMD) and the idea that newer CPUs have support for virtualization (though I have no idea what that means in practice). What cool developments have I been missing?"

An article by Dan Luu answers this question and provides a good overview of various cool tricks modern CPUs can perform. The slightly older presentation Compiler++ by Jim Radigan also gives some insight on how C++ translates to modern instruction sets.
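To give a flavor of one item from that list, SIMD: a vector instruction applies one operation across several data lanes at once. Pure Python can only simulate the shape of the idea (real SSE/AVX/NEON hardware retires one vector instruction per lane group; CPython gains no speedup from this), but the lane-wise structure looks like:

```python
# Conceptual model of SIMD, illustration only. The scalar version issues one
# add per element, the way the 1980s mental model executes; the "vector"
# version processes fixed-width lane groups, the way a 4-wide SIMD unit
# would retire one vector add per group.

def add_scalar(a, b):
    # One addition per element.
    return [x + y for x, y in zip(a, b)]


def add_simd4(a, b):
    # One "vector instruction" per 4-element lane group.
    out = []
    for i in range(0, len(a), 4):
        out.extend(x + y for x, y in zip(a[i:i + 4], b[i:i + 4]))
    return out
```

Both produce identical results; the difference on real hardware is that the second form does a quarter (or an eighth, or a sixteenth, depending on vector width) as many instructions' worth of work per cycle.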
Open Source

Big Names Dominate Open Source Funding 32

jones_supa writes: Network World's analysis of publicly listed sponsors of 36 prominent open-source non-profits and foundations reveals that the lion's share of financial support for open-source groups comes from a familiar set of names. Google was the biggest supporter, appearing on the sponsor lists of eight of the 36 groups analyzed. Four companies – Canonical, SUSE, HP and VMware – supported five groups each, and seven others (Nokia, Oracle, Cisco, IBM, Dell, Intel and NEC) supported four. For its part, Red Hat supports three groups (Linux Foundation, Creative Commons and the Open Virtualization Alliance).

It's tough to get more than a general sense of how much money gets contributed to which foundations by which companies; however, the numbers aren't large by the standards of the big contributors. The average annual revenue for the open-source organizations considered in the analysis was $4.36 million, and that number was skewed by the $27 million taken in by the Wikimedia Foundation (whose interests range far beyond OSS development) and the $17 million posted by the Linux Foundation.

CoreOS Announces Competitor To Docker 71

New submitter fourbadgers writes: CoreOS, the start-up behind the CoreOS Linux distribution, has announced Rocket, a container management system that's an alternative to Docker. CoreOS is derived from Chrome OS and focuses on lightweight virtualization based on Linux containers. The project has been a long-time supporter of Docker, but saw the need for a simpler container system after what it viewed as scope creep in what Docker provides.
Data Storage

Making Best Use of Data Center Space: Density Vs. Isolation 56

jfruh writes: The ability to cram multiple virtual servers on a single physical computer is tempting — so tempting that many shops overlook the downsides of having so many important systems subject to a single point of physical failure. But how can you isolate your servers physically but still take up less room? Matthew Mobrea takes a look at the options, including new server platforms that offer what he calls "dense isolation."

Why Military Personnel Make the Best IT Pros 299

Nerval's Lobster writes: Every year, approximately 250,000 military personnel leave the service to return to civilian life. When the home front beckons, many will be looking to become IT professionals, a role that, according to the U.S. Bureau of Labor Statistics, is among the fastest growing jobs in the country. How their field skills will translate to the back office is something to ponder. With the advent of virtualization, mobile, and the cloud, tech undergoes rapid changes, as do the skill sets needed to succeed. That said, the nature of today's military—always on the go, and heavily reliant on virtual solutions—may actually be the perfect training ground for IT. Consider that many war-fighters already are IT technicians: They need to be skilled in data management, mobile solutions, security, the ability to fix problems as they arise onsite, and more. Military personnel used to working with everything from SATCOM terminals to iPads are ideally suited for handling these issues; many have successfully managed wireless endpoints, networks, and security while in the field. Should programs that focus on placing former military personnel in civilian jobs focus even more on getting them into IT roles?
Open Source

Linux Foundation Announces Major Network Functions Virtualization Project 40

Andy Updegrove writes: The Linux Foundation this morning announced the latest addition to its family of major hosted open source initiatives: the Open Platform for NFV Project (OPNFV). Its mission is to develop and maintain a carrier-grade, integrated, open source reference platform for the telecom industry. Importantly, the thirty-eight founding members include not only cloud and service infrastructure vendors, but telecom service providers, developers and end users as well. The announcement of OPNFV highlights three of the most significant trends in IT: virtualization (the NFV part of the name refers to network function virtualization), moving software and services to the cloud, and collaboratively developing complex open source platforms in order to accelerate deployment of new business models while enabling interoperability across a wide range of products and services. The project is also significant for reflecting a growing recognition that open source projects need to incorporate open standards planning into their work programs from the beginning, rather than as an afterthought.