Apple Expected To Move Mac Line To Custom ARM-Based Chips Starting Next Year, Says Report (axios.com) 356
Developers and Intel officials have told Axios that Apple is expected to move its Mac line to custom ARM-based chips as soon as next year. "Bloomberg offered a bit more specificity on things in a report on Wednesday, saying that the first ARM-based Macs could come in 2020, with plans to offer developers a way to write a single app that can run across iPhones, iPads and Macs by 2021," reports Axios. "The first hints of the effort came last year when Apple offered a sneak peek at its plan to make it easier for developers to bring iPad apps to the Mac." From the report: If anything, the Bloomberg timeline suggests that Intel might actually have more Mac business in 2020 than some had been expecting. The key question is not the timeline but just how smoothly Apple is able to make the shift. For developers, it will likely mean an awkward period of time supporting new and classic Macs as well as new and old-style Mac apps. The move could give developers a way to reach a bigger market with a single app, although the transition could be bumpy. For Intel, of course, it would mean the loss of a significant customer, albeit probably not a huge hit to its bottom line.
Great! (Score:5, Insightful)
If they're going to make new laptops, maybe the freaking morans will fix the keyboards at the same time.
Agreed (Score:2, Insightful)
... this keyboard is a bad joke.
Torvalds rant: X86 development vs Arm Development (Score:5, Interesting)
Recently Linus ranted about how server-class ARM development was a dead end because of the lack of suitable "home" computers for everyday use (he didn't literally mean home machines, but personal computers). This answers that! On the other hand, for those of us who rely on libraries like, say, TensorFlow, that doesn't look so good, since a lot of that ecosystem is x86.
It will be interesting to see whether developers flock to this as the optimum ARM development platform or flee from Apple due to the lack of x86 in their primary laptop.
Re:Torvalds rant: X86 development vs Arm Developme (Score:2)
But Apple is moving to App Store-only distribution, so this will not help any other ARM devs.
Re:Torvalds rant: X86 development vs Arm Developme (Score:2)
No, you can override the signed-app protections easily, especially if you are a developer.
Re:Torvalds rant: X86 development vs Arm Developme (Score:5, Interesting)
I personally look forward to this. I like the ARM ISA. I thought Torvalds was being short-sighted. For starters, it's a more popular platform by number of chips in the wild. These Intel and AMD CISC designs are all RISC under the hood now, anyway.
We're just doing away with the cruft of a legacy architecture that grew off track.
Re:Torvalds rant: X86 development vs Arm Developme (Score:2)
The "cruft" barely matters any more. On super-low-end chips, sure, the instruction decoder matters. On laptops, it really doesn't. The out-of-order machinery, wide floating-point units, and wide, fast memory bus are far, far more expensive than the decoder.
Re:Torvalds rant: X86 development vs Arm Developme (Score:5, Insightful)
It may be popular, but that doesn't mean it doesn't suck. Torvalds was right, though maybe for different reasons, many of which probably don't apply to Apple.
The main problem with ARM, at least as far as I, as a Linux user, am concerned, is the lack of any standardized, open boot system like the much-maligned BIOS or EFI, and the lack of a standardized, minimal device tree. There are literally dozens of cheap single-board computers you can get to run Linux on. But how many of them can boot a standard distro off of a hard drive or USB stick you just plugged in? How many can run a standard, generic Linux kernel and a standard, generic Linux distro? I don't know of any. And it's very frustrating. Those boards that can run Android can run one particular version of Android, obtained from the manufacturer, subject to their whims to update it.
The promise of ARM is awesome. But so far I remain disappointed. I've got a drawer full of ARM devices that I used for short periods of time: SheevaPlugs, a GuruPlug, several Raspberry Pis, and various random Chinese boards. All powerful machines in their own right, but not as useful as I thought, mostly due to the proprietary (or at least esoteric) boot systems, custom kernels, special device trees, proprietary graphics cores, etc. I just don't really want to mess with U-Boot and flashing special images to partitions just to get the latest version of Debian up and running, or to install a 5.0 kernel.
If Intel produced a board at the same price point as these ARM boards, but one that could boot regular old Debian with a generic x86 kernel while supporting the GPIO that makes Pis so popular, I'd ditch ARM (SBCs, not phones) in a heartbeat.
Again, none of this applies to Apple necessarily, though. They control and access every bit of the hardware to make it sing their song, so I'm sure many users won't know or care, as long as they keep buying from the Apple Store. But it's a definite step towards a completely locked-down appliance. Might take another decade, but that's where Apple seems to be heading.
Re:Torvalds rant: X86 development vs Arm Developme (Score:2)
The promise of ARM is awesome. But so far I remain disappointed. I've got a drawer full of ARM devices that I used for short periods of time.
Please pry those out of storage, and sell them to some geeks who can use them, for a reasonable price. People could use them. Unless you're keeping them for posterity?
Re:Torvalds rant: X86 development vs Arm Developme (Score:3)
I'm helping others declutter by storing them.
In terms of time (mainly) and money it would be cheaper for someone to buy one new than for me to wrap one up and ship it to a fellow geek.
Re: Torvalds rant: X86 development vs Arm Developm (Score:3)
RISC has been alive longer than the x86.
How d'ya reckon that? The 8086's design started in early 1976, and it became available commercially in 1979. The first two major RISC projects (Stanford's MIPS and Berkeley RISC, which evolved into the SPARC architecture) both started in the 1980s and became available commercially years later.
Some people point to the IBM 801 as a forerunner of the RISC concepts, but even it only became available commercially in 1980, and, as a single chip, only in 1981. It wasn't successful, but it was used as a base for the development of the RS/6000, which, however, was launched in 1990, eleven years after the x86.
Re:Torvalds rant: X86 development vs Arm Developme (Score:2)
> (he didn't literally mean home, but rather personal-computers)
Thanks for the clarification!!!
Re:Great! (Score:2)
If they're going to make new laptops, maybe the freaking morans will fix the keyboards at the same time.
They will do something to get you to stop complaining. Notice how no one mentions the horrible Touch Bar anymore? Well, not to worry: by making the next Mac completely unusable, you won't ever worry about sticky keys again!
Re: Great! (Score:5, Insightful)
Not necessarily. Windows has an ARM version. Though more than likely they'll ditch POSIX support and go full iOS across the line.
Re: Great! (Score:2)
iOS is POSIX. It's the same OS as MacOS, just with mobile frameworks and ARM instead of x86.
Re: Great! (Score:2)
So where's /bin/sh?
There isn't full support, and even if there were, it's irrelevant as application developers aren't allowed to get at it. Everything is going through the constrained frameworks.
Re: Great! (Score:5, Informative)
While I worry about artificial constraints as well... if you’d ever worked with a jail broken iPhone or iPad, you’d know there’s basically a standard bash shell under there. You can ssh into the things by installing the openssh daemon. Heck, people have even run apache and nginx on them.
Re: Great! (Score:2)
Interesting. So it's Unix-ish if you're an internal developer or a hacker. I haven't played with any sort of "smartphone" out of sheer protest against the walled-garden mentality of both Google and Apple, even if one isn't quite as strict as the other.
Re: Great! (Score:2)
iMac:~ haize$ which sh
/bin/sh
iMac:~ haize$
Please get a clue before you keep running your mouth.
Re: Great! (Score:3, Insightful)
"arm is such a piece of shit for actual performance. this will KILL mac for anything actually useful."
Yeah, that's why the A12X chip in the iPad Pro benchmarks faster than 85% of the laptops on the market today.
And you've seen, of course, the real-time 4K video editing/rendering apps on the iPad.
Personally, I can't wait to see what Apple's A-series chips can do when not limited by a phone's or tablet's power and thermal constraints.
Re: Great! (Score:2)
Nobody does 4k on the CPU. This was hardware acceleration in the GPU, with a company they worked very closely with to get the acceleration working flawlessly for one specific video codec format and setting. Going outside the presets likely won't end well.
And there's a pretty long history of companies gaming benchmarks. We know Geekbench hits a lot of paths that are hardware-accelerated and memory/disk-sensitive. And this is even before going into the fact that iOS is finely tuned to the hardware, and that the App Store likely received the Geekbench app in an IR that can be precisely optimized for the hardware.
It's no doubt a good mobile chip, but a workstation chip? I kind of doubt it.
Strict W^X policy on iOS (Score:3, Interesting)
The poor OS is not real, though. iOS is OSX, with different libraries for making GUI applications, but with the same underpinnings.
One critical piece of the underpinnings differs: it's impossible for iOS applications to flip a page from writable to executable. Only the system executable loader can do that. The strict W^X policy on iOS makes it impossible to run a compiler like that included with Xcode or a JIT like PyPy. Any tool for programming on a device must be a full interpreter, like CPython or Swift Playgrounds, and a user ends up wasting most of the performance of a powerful ARM CPU on the overhead of this interpreter. This is what I meant by the usefulness of the iPad product being hamstrung by Apple's policies embodied in the OS.
Re: Great! (Score:2)
Out of curiosity, what other computer platform just puts "CPUs on cards" such that you can install just as many (2, 4, 10) as needed? Even blade servers put basically the entire CPU plus RAM plus IO on a system board.
Dual Boot MacOS and Windows is Critical (Score:5, Informative)
So this means no more boot camp as well?
Sure, but who wants that
Apple's market share literally doubled after switching to Intel and allowing Windows to dual boot. One of the biggest stumbling blocks to get people to switch to Mac was their need to use Windows (and before that MS-DOS) software. Once you could dual boot MacOS or Windows you no longer had to choose PC or Mac, you could have one computer that could run either software family.
... bad news for Apple.
Regarding emulation, it worked but was not practical. It barely works today where it does *not* have to emulate the CPU architecture. A switch to ARM would impose a huge burden on emulators and seriously and negatively impact performance.
While Microsoft might offer Windows on ARM, you would have a lot of PC software that will not be recompiled for ARM. So dual-booting ARM MacOS or ARM Windows gets you back to the bad old days of having to choose PC (i.e., x86) or something-not-PC. Good news for Dell, HP, etc.
Re:Dual Boot MacOS and Windows is Critical (Score:3, Insightful)
Apple's computers are an aberration. Their money comes from locking people into the Apple ecosystem. Right now they sell computers that can run Windows and act like general-purpose machines (despite how limiting Mac OS is, you can still run unsigned code on it). No, this cannot stand in the world of Apple. They have a new way forward, a better way, where your Apple computer connects to your Apple phone and to your Apple tablet, all of which can only install Apple-approved apps from an Apple-only store, whilst listening on Apple-approved headphones to Apple-approved music, all of which Apple collects 30% from.
Apple doesn't care if Peter Programmer buys an Apple computer; you mean nothing to them. They want to force Helen Homemaker and Steve Salesman onto a 100% Apple platform with no escape. We saw this coming years ago (well, I did, and if I saw it I'm sure I'm not the only one). Apple are ruthless at simplifying, reducing options, funnelling you into their way of doing things. They're not going to turn these things into giant iOS devices because they have to, but because they want to. It also won't happen overnight, but it will happen.
Re: Great! (Score:2)
Actually I think the Touch Bar would be better if it had the haptic engine that the trackpad uses; it really works well for the trackpad and I think it could work for typing. I guess they kept that 'innovation' for a future release, since releasing innovations over time is good for business.
Here comes the singularity (Score:5, Interesting)
Ladies and gentlemen, step right up to witness another technology train wreck as they try to achieve the elusive singularity. Apple is going to merge iPhone, iPad, and MacOS into a single platform. Other greats like Microsoft tried to achieve the singularity between mobile and the desktop, but they failed. Their Windows Phone is just a memory, yet the strange tiles on Windows 10 still remain, and Windows 10 tablet mode is still unusable.
Now a company which doesn't have a touch-screen computer, but only a lousy keyboard that everyone hates, is going to try this amazing feat again. Using a mobile ARM processor with a touch-screen UI/UX/OS called iOS, they are going to merge it with another, mouse-driven UI/UX called MacOS. Can they pull it off without a touch screen? How will users dual-boot to Windows 10 to run their CAD software? And will it have a headphone jack? So many questions, so few answers. Without the reality distortion field of Steve Jobs, this could be a headless company recycling failed ideas from other companies. Did anyone from Microsoft recently take on a leadership role at Apple?
No matter how you slice it, it will be painful drama for users. You won't be able to look away; it will be like watching a car crash in slow motion. You know you should look away, but you just can't.
The singularity, can it be achieved? Stay tuned..
That's not what is happening (Score:5, Insightful)
Apple has stated repeatedly they want nothing like the singularity, that desktops are inherently different than tablets or mobile devices.
All that is happening here is a processor switch, because Intel has dropped so many balls they are more balls than company now. Apple wants to control the processor so they can actually realize some gains, and avoid some of the shoddy design issues that have come to light in Intel processors recently...
I for one am fine with the change, these days adding support for another architecture is not THAT bad and Apple pulled it off really well before.
Re:That's not what is happening (Score:2, Interesting)
No less than Google has announced that Spectre vulnerabilities are here to stay and cannot be resolved in hardware or software. Researchers presented a new Spectre attack that cannot be defeated. Existing x86 and high-end ARM designs are all vulnerable and will remain broken for any kind of meaningful security.
Google: Software is never going to be able to fix Spectre-type bugs [arstechnica.com], 2/23/19
If Intel's top CPUs are unfixable, that may be influencing Apple's decision to move to ARM, especially if Apple's chip guys think they can fix those bugs in hardware.
An A13X CPU with decent cooling and high clock rate with multiple neural engines could make a very compelling MacBook Air. Even more so if it was immune to these speculative execution attacks like the various Spectre exploits.
Re:That's not what is happening (Score:2)
Yes, exactly. And Apple automatically gains some comparative performance over other systems simply by not needing the performance-impacting workarounds Intel chips have to use today, which, as you noted, don't even really solve the problem entirely.
Re: That's not what is happening (Score:2)
I'm fine with it as long as I can use off-the-shelf components to build my own. Right now, I'm not aware of that potential for an ARM-based system, but that could change, I suppose. I don't really want to be locked into Apple's hardware, though.
Re:That's not what is happening (Score:2)
Apple has stated repeatedly they want nothing like the singularity, that desktops are inherently different than tablets or mobile devices.
As a Mac user, I will be curious to see whether Apple truly believes that or whether it was basically just an anti-Windows 8/10 talking point. Certainly some of the bits, like Mission Control, *look* like iOS, and it seemed like they spent a bit of time talking it up until it became obvious their users realized it was pretty useless on a laptop.
And I honestly do wonder if one of the reasons they’ve moved to those extremely low-travel keyboards (which many of us abhor) is to try and make the eventual shift to a non-moving, haptic-only “keyboard” less jarring.
Re:That's not what is happening (Score:2)
I don't see any reason to doubt Apple on this. It's not like they aren't aware of the disasters of Windows 8 and Ubuntu's failed Unity experiment, all done in the name of trying to merge mouse+keyboard and touch-first paradigms.
As for the keyboards, I think it's probably mostly power users who hate those, and it seems evident that Apple isn't really focusing on power users these days. With their iPhone's success, they're clearly focused on the mass market, and those keyboards are (apparently) fine with most normal users.
history repeats (Score:3)
I'm pretty sure this is how Apple killed the Mac the first time... History repeats itself?
Re:history repeats (Score:2)
You must be remembering wrong. The Mac has already switched CPUs twice, and it was after Jobs returned to the company, so that was the rebirth of the Mac.
They (almost) killed the Mac once before, while doing nothing with it.
Re:history repeats (Score:2)
They (almost) killed the Mac once before, while doing nothing with it.
Yeah, you have it right. Apple almost killed the Mac by sticking with 68k. But in the last of the Newtons they had an ARM core that would rival the 68k processors of the day, and they neglected to go ARM then like they should have.
Re:history repeats (Score:3)
They were on PowerPC in "the last days of the Newton" and while the DEC StrongARM was definitely amazing in MIPS/Watt, the PowerPC chips had better absolute performance, and definitely better memory bandwidth. (But yes, they completely crippled it with brain-dead designs at times, like the Performa/LC 5000 series with its half-width system buses. They'd done the same thing previously putting 16-bit memory on 32-bit 68k chips. There were a disturbing number of Macs that should've and would've performed a lot better if the system design wasn't brain-dead for cost-reduction or compatibility with old PDS/Comm Slot cards.)
"awkward period" == 10+ years/Look at alternatives (Score:3)
I can't see this being a very happy transition, especially for developers and product support.
Looking back, it took 5+ years to end support for PowerPC Macs. I can't see this transition taking any less, and I would expect it to take twice that, especially for Mac servers.
Maybe this is why Linus made his comments about ARMs a couple of days ago: https://slashdot.org/story/19/... [slashdot.org]
If this is all in aid of having apps that work on iPhone, iPad & Macs, I again point to HTML5 and PWAs. I can see their growth resulting in a downfall of Apple-specific hardware and apps.
Re:"awkward period" == 10+ years/Look at alternati (Score:2)
Apple has done transitions before: classic MacOS to MacOS X, Motorola 68000 to PowerPC, and PowerPC to Intel. They survived all three. Given their history, they're obviously capable of handling transitions well enough.
Re:"awkward period" == 10+ years/Look at alternati (Score:3)
Looking back, it took 5+ years to end support for PowerPC Macs. I can't see this transition taking any less, and I would expect it to take twice that, especially for Mac servers.
Apple doesn't have servers any more. "macOS Server" is an app in their desktop app store which costs twenty bucks, and provides some of the functionality which comes with NT server. The last time Apple had a server hardware product was 2011.
If this is all in aid of having apps that work on iPhone, iPad & Macs, I again point to HTML5 and PWAs.
Ugh. What a PITA. If that's the best Apple can do, their best isn't very good.
Re: "awkward period" == 10+ years/Look at alternat (Score:2)
They already have it, and have for years. Developing for iOS means code is first compiled for x86/x64 to test on the desktop, and then it's recompiled with the ARM toolchain when you deploy to a device. Their development pipeline is relatively platform-agnostic. Xcode kind of sucks as an IDE, though.
Re: "awkward period" == 10+ years/Look at alternat (Score:2)
Their development pipeline is relatively platform agnostic. Xcode kind of sucks as an IDE though.
Then use Eclipse or IntelliJ IDEA.
Will be interesting to see the performance (Score:2)
I am not tied to AMD64 if I can get the same or better performance elsewhere at the same or better price. However, I expect that single-core performance will be pretty lacking and that would be a show-stopper.
The plan all along? (Score:5, Interesting)
Apple's own CPUs are not strictly "ARM-based", as they do not have cores developed by ARM itself.
They have their own cores that are merely using ARM's ISA.
Apple's CPU designs are likely to have lineage to P.A. Semi [wikipedia.org] which Apple acquired in 2008.
Before then, P.A. Semi had made processors running the PowerPC ISA. Apple had previously been interested in using those, but opted not to in favour of x86.
Re: The plan all along? (Score:3)
They *are* strictly ARM-based: they use a strict superset of the ARM specifications. They add on a few of their own SoC features and performance enhancements, but anything written for an ARM processor will run on Apple's Ax processors. It's not like Apple uses unique instruction sets or anything.
Re:The plan all along? (Score:2)
I bet they license a lot more than an ISA (which may actually be free to replicate, not sure) - ARM is an IP company and have modular hardware designs which Apple likely uses to a great extent, tweaking it here and there and adding or removing modules.
combine this with Linus' recent thoughts about ARM (Score:5, Interesting)
Linus Torvalds has stated that ARM won't win the server space because developers want to run their apps on the architecture they were developed on, and almost all are developing on x86. Many application bugs are still architecture-specific. Application performance optimization is also highly architecture-specific, especially for database applications.
Given the Mac's popularity among developers, this argument should apply to the Macs too when looked at from the opposite angle. The vast majority of servers are x86, and developers want to run their apps on the architecture they are developing for. Running in an emulator is nowhere near the same experience. I would think a switch from x86 to ARM would decimate the number of developers calling the Mac home.
Separately, I don't see the appeal of running phone apps on my laptop or desktop. Smartphone apps do not have the feature density that I'm looking for with a desktop app and desktop apps are not generally appropriate for smartphones. On my desktop, I don't want simplicity. I want to see everything I can at once and to be able to do almost everything with my keyboard.
Re:combine this with Linus' recent thoughts about (Score:2)
Linus Torvalds has stated that ARM won't win the server space because developers want to run their apps on the architecture they were developed on, and almost all are developing on x86.
Almost all are developing in Java, so the actual hardware does not matter. (*facepalm*)
Re:combine this with Linus' recent thoughts about (Score:2)
I think the chicken-and-egg issue will dominate, though. Until a large portion of datacenter systems are ARM, there would be no compelling reason for a developer to switch their development platform, and many compelling reasons not to. And until a large number of developers are on ARM, the datacenters would be fighting the developers' platform if they switched.
Why would I buy a development system as premium-priced as the Macs to target a platform that might be successful in a few years? These things don't happen overnight.
I CAN play devil's advocate with myself here. I do realize that this will help front end developers, and that is almost certainly in Apple's thoughts. But, in my experience, the back end is where the real tech is. If front end developers jump to ARM-based Macs because it makes their jobs easier, we'll see the already damaging gulf between front and back end development widen. That would be a bad result for the industry. Of course, I say this as more of a full-stack guy that believes it is much harder to develop a quality product when nobody on the team fully understands both worlds.
Re:combine this with Linus' recent thoughts about (Score:2)
Why would I buy a development system as premium-priced as the Macs to target a platform that might be successful in a few years?
Because Macs run OS X or macOS. It is even preinstalled. Random hardware does not.
Why Mac haters don't shut up is beyond me. If you have no use or need for macOS, fine. Then simply shut up, idiots.
Re:combine this with Linus' recent thoughts about (Score:2)
I'm leaning toward this future. I've switched to using a Galaxy S9+ as my daily computing device, and the Thinkpad is reserved for longer coding sessions.
But, more and more, I just use a bluetooth keyboard from Omoton and use Termius to SSH in to my servers and the laptop from my phone. File Manager+ has SFTP support, among many features. I have browsers, VNC, DroidVIM (really an excellent port), etc.. Do I need a monitor? I cast the screen to a ChromeCast.
I write code primarily. If I need horsepower, I spin up a VM from my phone and use SFTP/Git to load up some code, and SSH in to administer it. The phone fits in my pocket while I'm running around between the machines at work, too. Nice bonus there, not lugging the laptop itself.
Hybrids? (Score:4, Interesting)
If Apple is making their own ARM chips, presumably they can put them in at-cost as a co-processor along with an Intel chip on their home computer line.
Benefits of the Hybrid:
* Increase adoption of ARM as you deprecate Intel chips over a few generations
* Run iOS apps at full speed while the Intel processor handles x86 tasks
* Not be shackled by ARM's poor desktop performance for individuals running processor-bound apps that would slow even an Intel chip to a crawl.
* (if you choose to make hybrid a long term solution) Have apps that run in multiprocessor mode with some processes running on each chip, making your home computer faster than all other manufacturers who are not selling multi-processor solutions.
Re:Hybrids? (Score:2)
Seems reasonable.
Their ARMs must cost them very little esp. compared to what they pay Intel.
Less worry about heat would give better potential performance.
Xcode will make porting a matter of setting a build option or two, if that.
Apple's GPU future also looks promising.
It's time we finally say goodbye to everything that is 1981's PC.
Re:Bloomberg (Score:2)
A low pin chip connected at the ethernet port. Or where the PHY is. By this point the data should have already been encrypted and secured. Especially if its in a secure facility, even communications inside a rack are usually encrypted. Besides if they were wanting to get any unsecured data off the network then it would be better just to compromise the switch. That way they get what they need from multiple sources, and compromise the thing that would be used to detect the information drain.
If they wanted to get data that isn't secure, they'd have to tap something on the data bus. I think data buses are around 256 bits in most servers. Add in 40-64 bits for the address lines and you're over 300 pins on the chip, and then you have to have power, grounds, and the pins to send the data out, which means talking to the PHY. I suppose the chip could send out Ethernet directly, requiring only 4 pins, but then it would have to be 12V-tolerant, and then you need a larger silicon process and more gap between the pins. More likely they would talk to the PHY through SMII (or whatever the gigabit interface is; I'm more familiar with 100 Mbit interfaces at the hardware level), which is another 20 pins or so. They probably also need an external oscillator... So I don't think you're going to find a chip to monitor data in a server with fewer than 400 pins.
Even with a BGA package this is not a 'small' chip. And then they have to deal with internal RAM/ROM and the processing power to figure out what information they've found and send. There's no way they're going to send all of it. It would take too much time, and make it too detectable.
I'm not saying it can't be done, or that the Supermicro servers can't be compromised. I just don't believe they can be compromised in the way Bloomberg claims they are. Hollywood magic doesn't work; you can't just add in a 'chip' and compromise stuff. You have to add it where it can be effective.
Re:Bloomberg (Score:2)
Because it's Hollywood magic, when it sends the zoomed and enhanced data, it displays at 120 cps and each character will make a little chirp sound.
I doubt they'll be "Macs" as such (Score:2, Troll)
Basically I expect them to be laptops/desktops with the iPhone/iPad/iWatch business model and an i-name like iBook or iNote or whatever. They'll run a version of iOS that has adopted Mac interfaces but is locked down with no dual boot to anything else. All applications come from the store, so no backwards compatibility with Mac apps, just windowed iOS apps until developers make a store version. The question is just whether Apple can resist the temptation to price it crazy; I mean, their latest phones are really getting out of hand.
x86 (Score:2)
So everything Apple does, did, or will do is doomed to failure,
but we loves us some x86 architecture from 1979 and can't imagine an alternative.
The future has spoken.
Re:x86 (Score:3)
So everything Apple does, did, or will do is doomed to failure, but we loves us some x86 architecture from 1979
Nonsense. All common x86 processors have been internally RISCy since Intel introduced the Pentium Pro and AMD its K5, both of which decode x86 instructions into RISC-like micro-ops. The only thing they shared with x86 processors from 1979 was an instruction set, with its primitive use of a limited number of registers, literally none of which were "general purpose": various instructions required operands to be placed in specific registers, and results to be delivered to others. These failings were addressed by the amd64 instruction set, which largely permits use of registers as general-purpose, and which doubled the number of registers into the bargain. The x86 decoder is a minuscule portion of the silicon in a modern processor.
Painful for developers targeting intel (Score:2)
Many of my customers use Macs on Intel as development machines for Linux on Intel servers, to get mass-market GUI tools alongside target-specific development tools.
Apple is about to make that unpopular.
This will put pressure on Linux distros like Fedora, and on hardware offerings like Dell's XPS 13 Developer Edition, to finally deliver the Year of the Linux Desktop. Well, for developers, at least (:-))
Re:Painful for developers targeting intel (Score:2)
They could launch new server modules at the same time, they've been in the space before. With the whole Spectre clusterfuck and Intel's diminishing process lead, an ARM solution from Apple might well be superior. If so, with Apple's weight behind it, a transition could happen very quickly IMO.
It would be a scary situation for a vertically integrated company to be responsible for so much of computing though, so let's hope not.
Re:Painful for developers targeting intel (Score:2)
They could launch new server modules at the same time, they've been in the space before.
Who would trust them? Their first servers were grossly overpriced, so were their second servers, and they dropped their third server line just about the same time people got used to using them.
Re:Painful for developers targeting intel (Score:2)
Like the IT industry has a memory which spans more than minutes and can learn from mistakes.
No gripes like Apple gripes (Score:2)
Please, enough about bitching about the butterfly keyboard, $1000 phones, and the RDF.
Tell us again about how Jobs ripped off Xerox PARC. That's always a hoot.
what ARM chip would do that job? (Score:2)
What was ARM's floating-point coprocessor again? What chip competes against Intel's 8th gen?
Re:what ARM chip would do that job? (Score:5, Funny)
what chip competes against the intel 8th gen?
None, you need Genuine Intel for a flawless Meltdown experience.
Re:what ARM chip would do that job? (Score:2)
None, you need Genuine Intel for a flawless Meltdown experience.
Almost [wikipedia.org]. You can also get it with POWER, or ARM Cortex-A75.
Re:what ARM chip would do that job? (Score:2)
https://embeddedartistry.com/b... [embeddedartistry.com]
I fear it's more about the money for Apple .... (Score:5, Insightful)
When Apple did the huge transition from PowerPC to Intel CPUs, it was near the height of Apple's success selling OS X based computers. Even then, there was a big fear it would hurt certain markets, like native OS X game development, as it would give developers an excuse to "just write a Windows-only version and let the Mac users boot into Windows to play it". And that, in fact, DID happen. But by and large, Mac users accepted it as a "win" because Intel CPU development was so much further ahead and kept Macs competitive with their Windows counterparts. Plus, it wasn't half bad being able to run Windows in virtual environments, where a bunch of instruction conversion between x86 and PPC didn't have to happen in the background to make it work.
This time around? It's far less clear.... Intel still cranks out great CPUs and nobody I know is complaining that their Mac is under-powered, CPU-wise. The big push seems to be Apple's continual insistence that "most people can just use an iPad and iPhone instead of a computer", and an interest in selling their own CPUs instead of giving all that money to Intel.
I think we're going to see a lot of "dumbing down" of OS X apps if they all start getting coded to run universally on iOS and OS X with ARM. If features in software don't translate well to a touch-screen UI, they'll rip them out instead of keeping "Mac only" versions with more capabilities.
Nevermind this - where's this famous pro tower? (Score:2)
Not Enough CPU for Content Creation (Score:2)
Just talked about this today (Score:2)
A grain of salt (Score:2)
I'm not saying it's NOT happening, but everybody should remember that we've been seeing similar reports to this every year since at least 2011, when it was reported that Apple had internal prototypes of ARM-based MacBooks running OS X. All of the current talk about a 2020 shift to ARM can be traced back to this single unverified Axios article.
Re:It was nice knowing you (Score:2)
Haha we're back to thin clients again. I love it.
Re:It was nice knowing you (Score:2)
While the Intel chips are crufty with all the stuff built up over the years, ARM is not going to be able to replace them for the work I do and plan on doing. I may pick up a Mac mini in the future to cross-platform test my games, but it's not going to be for any of the major work I do.
I need a powerhouse for what I do, not a phone with a keyboard.
Re:It was nice knowing you (Score:2)
If you want to do anything with pictures or video, you're better off with a Windows machine. Especially if you need 3D acceleration; MacOS is utter crap there. It just doesn't work.
Re:It was nice knowing you (Score:2)
Everything from Microsoft ends in heartbreak.
Re:It was nice knowing you (Score:2)
But at least it started.
Re:It was nice knowing you (Score:2)
Re:It was nice knowing you (Score:3)
They're turning Macs into phones with keyboards and bigger screens, because everything is a terminal for the cloud now, right?
Fortunately some folks are going the other way. The 100% free-software (modem isolated) Pinephone is coming, and so is some Purism stuff -- no need to use Android or iOS spyware.
Or Gemini for that matter -- it has nasty non-free drivers, but is pretty functional. Just this Friday I spent a long bus ride hacking on a work project -- as the problem I'm working on involves something multithreaded that doesn't scale well, a 10-core phone is actually better than the 4*2 dev machine. There are folks who use a phone without basics like a compiler or valgrind, but I'm not one of them.
Re:Irrelevant to me (Score:5, Insightful)
Gnome 3 is an abomination
Newsflash: in Debian alone there are 57 different window managers (counting packages that declare Provides: x-window-manager). They vary wildly in functionality, but you get everything from fully-featured/bloated ones to 1990s-era lookalikes.
it has killed my productivity because window management is a pain and inconsistent, and features that used to work no longer do (it's now impossible to suspend while in a docking station, and this is apparently by design according to the bug report).
Aye, Gnome 3 is insane -- even Microsoft has backed out of Metro.
Re:Irrelevant to me (Score:4, Insightful)
Grey-beard here. Over the years I have used *A lot* of window managers / desktop environments. The worst I have used recently is by far Gnome. I updated an Ubuntu machine to 18.04 and said "what the hell", and let it default the window manager to the preferred new one. Gnome was the worst piece of junk I've ever used. All the other desktops I've been able to figure out how to suspend without too much difficulty. Gnome - NOPE. I look at the shutdown menu and can't find anything related to suspend. I see shutdown but no suspend. After a few minutes of googling I discover that someone decided that to suspend you should hit SHIFT or CONTROL or something similar while hitting shutdown. I could live with that, EXCEPT the idiots who designed it didn't change the icon. I TRIED pushing shift and control and alt and other things and there's ZERO feedback. There is PLENTY of space for a suspend icon, but some idiot decided that putting a suspend icon was a bad idea. At that point I seriously questioned the sanity of anyone involved. I couldn't believe they would take away a standard feature like that and hide it.
Re:Irrelevant to me (Score:2)
You know, usually you just "close" the laptop and it suspends ... unless it is a weird "brand" of Linux.
Re:Irrelevant to me (Score:3)
What if it's not a laptop, or you don't want to flip it shut? I mean, on a Mac you can tap the power button and there's a Sleep button there, Windows can do it from the keyboard or in the menu that appears when you hit the power/standby icon in the start menu, why did the Gnome people decide to make it so unfriendly?
Re:Irrelevant to me (Score:4, Interesting)
What if it's not a laptop, or you don't want to flip it shut? I mean, on a Mac you can tap the power button and there's a Sleep button there
What you do is open the gnome-power-manager preferences, select the "general" tab, and then select what you'd like to happen when you press the suspend button, and the power button. This is essentially the same as on Windows. If I want to reboot Windows, I pick reboot out from the menu; if I press my power button, the machine goes to sleep. I have a hard reset button if I need it; macs used to, if you snapped the programmer's key into place. When I boot into Linux, the buttons work just the same, even though I am using gnome3. I don't spend much time in Linux these days, or I would probably install MATE.
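For what it's worth, on newer GNOME 3 setups the same mapping can be made from the command line; a sketch, assuming the stock org.gnome.settings-daemon schemas are installed:

```shell
# Make the power button suspend instead of prompting or shutting down.
# Key name and value assume the standard GNOME settings-daemon schema.
gsettings set org.gnome.settings-daemon.plugins.power power-button-action 'suspend'

# Check what the power button is currently set to do.
gsettings get org.gnome.settings-daemon.plugins.power power-button-action
```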
Re:Irrelevant to me (Score:3, Informative)
I'm in the same boat. I used to love Mac hardware from ~2004 (PowerPC) until around ~2010. Then it started to get really bad. MacOS went from a UNIX workstation OS to some sort of media consumer / music player thing (useless to me as I can't stand music) and it's clear Apple wants to make their products fancy televisions.
So I bought a Thinkpad (meh hardware quality, but better than anything Apple has made recently); filled it with tons of RAM (which a MBP can't do) and I'm running Linux with Xfce on it. It's not ideal and it seems to die coming out of sleep 5% of the time (thanks worthless Nvidia hardware).
I really wish there was a better professional laptop available these days. I do ASIC design and I run simulations that need ~64GB of RAM to complete in a reasonable amount of time. There aren't lots of options for me to do my job on the go (which I sometimes have to). Sadly, since some stuff I do needs to be done without an Internet connection (ugh) I can't just toss these big jobs on a server somewhere.
Re:Irrelevant to me (Score:2)
Why are you on Gnome if you do not like it? I use a decades-old fvwm config that works exactly as I like. This is not Windows, where you have little or no control over how your desktop looks.
Re:Irrelevant to me (Score:4, Informative)
I've just been delaying trying to switch to KDE to see if it's better, but I need to suck it up and just do it.
I've been running KDE Neon for more than a year now and I think it's great.
Kubuntu, which I used before that, I found to be crappy because it wasn't a "clean" KDE desktop: there were GNOME/Unity things here and there, and two or three places to change the same settings, which was really confusing. Neon is a 100% KDE experience, and in my experience it works very well. They've stopped experimenting with the desktop, and you get a classic desktop experience on top of which you can place widgets if you like (but you don't have to).
Re:Irrelevant to me (Score:2)
I have to give props to Google for what they did with ChromeOS in the past couple of years.
While I still have a "regular" Windows 10 PC, mostly for games and an odd app or two, everything else is done on an Asus CN60 chromebox (Haswell i3, upgraded to 16GB RAM and a 128 GB m.2 SATA HD). While this model is too old to support Crostini or virtualization (pushing 5 years now), it satisfies pretty much my every need - and, as you said, it runs an OS that's not actively working against me.
If only Pixel Chromebooks were not $1300, I would probably buy one tomorrow.
Re:Irrelevant to me (Score:2)
Agreed. I'm pretty much 100% Linux, and the state of desktop Linux is atrocious. Gnome 3 is an abomination, it has killed my productivity
What's preventing you from going back to Gnome 2 via MATE?
Re: Irrelevant to me (Score:2)
Re: What about Boot Camp? (Score:2)
What we have to remember is that this is not the first time Apple has done this, and each time they have done it well, with good results and benefits. Obviously the last time we got Boot Camp, which at the time was a huge gain. We will see what is up their sleeve this time. Likely integration across all the product lines.
Re:What about Boot Camp? (Score:2)
with ATT 5G only $10/GB after your 15GB cap.
Re:RIP Mac then (Score:2)