
Apple Behind Intel's USB Competitor?

We recently discussed Light Peak, Intel's upcoming, optical interconnect technology that boasts data transfer rates of up to 10 Gbps. While some have speculated that Light Peak will directly compete with USB 3.0, Engadget has now unearthed information that indicates the idea for the technology originated from Apple, who apparently asked Intel to develop it. "According to documents we've seen and conversations we've had, Apple had reached out to Intel as early as 2007 with plans for an interoperable standard which could handle massive amounts of data and 'replace the multitudinous connector types with a single connector (FireWire, USB, Display interface).' ... Based on what we've learned, Apple will introduce the new standard for its systems around Fall 2010 in a line of Macs destined for back-to-school shoppers — a follow-up to the 'Spotlight turns to notebooks' event, perhaps. Following the initial launch, there are plans to roll out a low-power variation in 2011, which could lead to more widespread adoption in handhelds and cellphones. The plans from October 2007 show a roadmap that includes Light Peak being introduced to the iPhone / iPod platform to serve as a gateway for multimedia and networking outputs."
  • by CharlyFoxtrot ( 1607527 ) on Sunday September 27, 2009 @11:02AM (#29556405)

    They will have a hybrid copper/optical wire to power devices: "In addition, Intel said it's working on bundling the optical fiber with copper wire so Light Peak can be used to power devices plugged into the PC, he said."

  • by Anonymous Coward on Sunday September 27, 2009 @11:21AM (#29556577)

    Do you only use it for your mouse and keyboard? If that's the case, then you'll probably be satisfied.

    Now, back in the real world, it becomes the bottleneck even for low-end, high-capacity storage devices built around traditional spinning media. And as we move towards solid-state storage, USB 2.0 fails us horribly: under real-world conditions we see only 30% to 35% of the drives' read/write capability.

    The same goes for connecting high-end visual displays via USB. Once you get above a resolution of 2000 pixels in either direction, USB 2.0 just can't handle it.

    USBNET2, basically IP networking over USB 2.0, never took off because it's just too damn slow.

    There are many applications where we need much, much faster transfer rates than USB 2.0 can support.
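The bottleneck this comment describes can be sketched with some back-of-the-envelope numbers (the ~320 Mbps effective rate and the drive speeds below are illustrative assumptions, not figures from the comment):

```python
# Rough sketch of the USB 2.0 storage bottleneck described above.
# ~320 Mbps is a commonly cited real-world ceiling for USB 2.0 bulk
# transfers; the drive speeds are illustrative 2009-era figures.

USB2_SIGNALING_MBPS = 480   # theoretical signaling rate
USB2_EFFECTIVE_MBPS = 320   # typical best-case bulk throughput

def usable_mb_per_sec(link_mbps: float) -> float:
    """Convert a link rate in megabits/s to megabytes/s."""
    return link_mbps / 8

# A modest 7200 rpm drive sustains ~80 MB/s; an early SSD ~200 MB/s.
for name, drive_mb_s in [("spinning disk", 80), ("SSD", 200)]:
    link_mb_s = usable_mb_per_sec(USB2_EFFECTIVE_MBPS)
    utilization = min(1.0, link_mb_s / drive_mb_s)
    print(f"{name}: drive {drive_mb_s} MB/s, USB 2.0 caps it at "
          f"{link_mb_s:.0f} MB/s ({utilization:.0%} of drive speed)")
```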

  • by lagfest ( 959022 ) on Sunday September 27, 2009 @11:23AM (#29556597)

    Because you are being scammed, $10 is more realistic.

  • by CharlyFoxtrot ( 1607527 ) on Sunday September 27, 2009 @11:25AM (#29556621)

    Why wouldn't Apple choose to use 10 Gigabit Ethernet instead?

    Partly because the first iteration will be 10 Gigabit but the next generation will be 100 Gigabit.

  • by nxtw ( 866177 ) on Sunday September 27, 2009 @11:25AM (#29556625)

    Transferring a 100 MB app to my iPhone takes a noticeable amount of time, for example.

    That's not USB 2.0's fault. The bottleneck is almost certainly the slow/cheap flash memory in the iPhone. Fast flash is expensive.

  • by SimonTheSoundMan ( 1012395 ) on Sunday September 27, 2009 @12:35PM (#29557305) Homepage

    I have never understood why industry standards such as HD-SDI have never made it to the consumer market: a single coax cable terminated with BNCs that can deliver 4K (four times the resolution of 1080p) or higher with 16 channels of audio, all uncompressed, over runs of more than 100 m.

  • by mr_lizard13 ( 882373 ) on Sunday September 27, 2009 @12:37PM (#29557311)
    I doubt Intel broke an EULA, for what they are worth anyway.

    If all this is true, and Apple did ask Intel to develop this initiative, then I'm pretty sure Apple would have been happy to license that version of OS X for development purposes.

    In any event, couldn't that motherboard have been ripped out of an Apple computer?
  • Re:Purpose (Score:5, Informative)

    by CajunArson ( 465943 ) on Sunday September 27, 2009 @12:40PM (#29557343) Journal

    You're wrong. USB is and was for hooking up peripherals like keyboard/mice/printers/low-bandwidth devices to effectively replace the old RS-232 serial and parallel ports of yore. USB was never intended to replace the interface that goes to your monitor, your hard drives*, and your ethernet.

    * Yes, we're all aware of USB storage, but see all the comments above about how even low-end devices today can swamp USB... if USB was so great for this then eSATA never would have come into existence.

    This new standard appears to be point-to-point and with all the knowledge we have now it will hopefully be efficient. Additionally, 10Gbps is the starter speed... Intel was talking about scaling it to 100Gbps without too much difficulty.

  • by petermgreen ( 876956 ) <plugwash @ p> on Sunday September 27, 2009 @12:58PM (#29557501) Homepage

    see 5G iPod,

    recent Macbooks
    The Air never had FireWire, probably because it was always designed as a cut-down ultraslim machine.

    The basic polycarbonate MacBook has always had FireWire 400.

    All of Apple's other current machines have FireWire 800 (which is compatible with 400 via a wiring adaptor).

    The 13-inch unibody didn't initially have FireWire, which many people at the time took as a sign of Apple dropping it. However, either the pundits were wrong or Apple decided the backlash was too much, because soon afterwards the 13-inch unibody was redesignated as a MacBook Pro and had FireWire 800 added.

  • Re:Why not USB3? (Score:3, Informative)

    by level_headed_midwest ( 888889 ) on Sunday September 27, 2009 @01:03PM (#29557549)

    USB3 is pretty marginal for connecting a monitor. A single-link DVI interface carries up to 3.96 Gbps, which a typical 1920x1080 LCD @ 60 Hz nearly saturates. USB3 is rated at 5 Gbps, but if it's anything like USB2, you'll probably see ~2 Gbps of actual throughput plus a huge CPU load. USB2 is horrible as a display interface; it's really only good for small secondary displays showing static 2D images. You have only 480 Mbps of theoretical bandwidth, which is barely enough to drive a 640x480 monitor at typical 24-bit color and 60 Hz. Figure in that USB2 delivers maybe 200 Mbps of real bandwidth, and you see there's a huge bandwidth problem.

    USB2 is okay for 100 Mbps Ethernet and there are a lot of USB2 10/100 Ethernet dongles and docks out there. I have one and it works as well as any PCI-based 10/100 Ethernet interface. However, most computers have gigabit Ethernet connections because 12 MB/sec won't cut it for transferring files any more. USB2 won't even come close to cutting it for a GbE replacement, which is why you don't see any USB GbE dongles, only the 10/100 ones.

    USB does well for connecting relatively low-speed peripherals like mice, keyboards, printers, and small flash memory devices. It's just not a good replacement for high-bandwidth connections, which will continue to have specialized and much faster cables and connectors.
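The display-bandwidth arithmetic in this comment can be checked with a quick sketch (active-pixel rates only; blanking overhead, which real links must also carry, would add roughly 10-20%):

```python
# Back-of-the-envelope display bandwidth, as in the comment above.
# Active pixels only; blanking intervals are ignored.

def display_gbps(width, height, bits_per_pixel=24, refresh_hz=60):
    """Uncompressed bandwidth for a display mode, in Gbit/s."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

print(f"1920x1080@60: {display_gbps(1920, 1080):.2f} Gbps")
print(f" 640x 480@60: {display_gbps(640, 480):.2f} Gbps")

# Compare against the link rates cited in the comment:
# single-link DVI ~3.96 Gbps, USB3 5 Gbps nominal, USB2 0.48 Gbps nominal.
```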

  • Re:Why not USB3? (Score:1, Informative)

    by Anonymous Coward on Sunday September 27, 2009 @01:04PM (#29557555)

    The overhead associated with USB 2 (I haven't read the spec for USB 3, so I can't say anything about that) is too big for networking. Every single packet sent over USB has to be acknowledged by the other end before more packets can be sent. DMA is also non-existent, which means that rather than having the hardware move X amount of data into a specific memory location (like FireWire does), the CPU has to do it. This is slow. It's also why USB 2 devices versus FW400 devices is a fair match despite the nominal speeds: I get the same if not more speed over FW400 from the same hard drive enclosure than when I use USB 2, AND my CPU usage is lower.

    USB was not designed for high-speed interconnects between devices. It was meant to replace the ageing serial port with something that allowed slightly higher speeds and multiple devices on the same bus, with fewer wires and smaller connectors, so you could have your printer and scanner connected at the same time without a switch box for printer ports. It's called the Universal Serial Bus for a reason.
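A toy stop-and-wait model illustrates the acknowledgment overhead this comment describes (the packet size and turnaround latency below are illustrative assumptions, not values from the USB specification):

```python
# Toy stop-and-wait model of the ack-per-packet cost described above.
# Numbers are illustrative, not taken from the USB specification.

def effective_mbps(link_mbps, packet_bytes, ack_latency_us):
    """Throughput when every packet must be acknowledged before the next."""
    transmit_us = packet_bytes * 8 / link_mbps   # time on the wire per packet
    return packet_bytes * 8 / (transmit_us + ack_latency_us)

# 512-byte packets on a 480 Mbps link with 10 microseconds of turnaround
# per ack: a large fraction of the link time is spent waiting.
print(f"{effective_mbps(480, 512, 10):.0f} Mbps effective")
```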

  • Re:Replace? (Score:4, Informative)

    by Alef ( 605149 ) on Sunday September 27, 2009 @01:13PM (#29557641)
    Perhaps, perhaps not. If the transition is made as a step to a new generation of connectors, you will hopefully end up with a generation that has fewer connector types. After all, we have managed to go from

    DE-9 (Serial port) + DB-25 (Parallel port) + DA-15 (Game port) + PS/2 (Keyboard and mouse) + VGA (Screen)

    to

    USB + DVI (+ FireWire for some cameras).
  • Technically, the bandwidth would be there.

    The tricky bit with replacing video with a general purpose interface would be to sort out signal routing inside the computer. There still needs to be a GPU/framebuffer and that GPU needs a high bandwidth path (we are talking a couple of PCIe 1.x lanes worth per display) from the framebuffer to the general purpose interface.

    Not saying this couldn't be done, but it would definitely require cooperation between the GPU vendor and the vendor of the general-purpose interface in question to allow them to communicate over PCIe without involving the CPU.
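The "couple of PCIe 1.x lanes worth per display" estimate can be sanity-checked; this sketch assumes PCIe 1.x's 2.5 GT/s per lane with 8b/10b encoding (2.0 Gbps usable) and ignores protocol overhead:

```python
# Sanity check on "a couple of PCIe 1.x lanes per display".
# PCIe 1.x signals at 2.5 GT/s per lane; 8b/10b encoding leaves
# 2.0 Gbps of usable payload bandwidth per lane.
import math

PCIE1_LANE_GBPS = 2.5 * 8 / 10   # usable payload rate per lane

def lanes_needed(width, height, bpp=24, hz=60):
    """Minimum PCIe 1.x lanes to carry an uncompressed display stream."""
    gbps = width * height * bpp * hz / 1e9
    return math.ceil(gbps / PCIE1_LANE_GBPS)

print(lanes_needed(1920, 1080))  # ~3 Gbps of pixels needs 2 lanes
```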

  • by NuttyBee ( 90438 ) on Sunday September 27, 2009 @02:22PM (#29558197)

    HD-SDI never made it to the consumer market because it is expensive to handle and nobody's TV will decode it.

    As for the rest of your comment:

    "Single coax cable terminated with BNCs that can deliver 4k (four times the resolution of 1080p) or higher with 16 channels of audio, all uncompressed, at a length of over 100m"

    No, what you are referring to is 3Gig (3G-SDI), which is effectively two HD-SDI links' worth of bandwidth, and in my experience 300 feet out is sometimes a touchy place to be. For 3Gig on one cable, you want fiber.

  • by nxtw ( 866177 ) on Sunday September 27, 2009 @02:27PM (#29558257)

    Even if that's correct (which I doubt), do you suppose that the next iteration of pretty much every device might have faster memory in it? Or that it will, once there's an interconnect that can take advantage of it?

    The iPhone uses cheap MLC NAND flash. If Apple wanted faster flash memory, they could have installed more expensive and faster SLC flash. But it will be a while before Apple puts something in the iPhone that will even saturate USB 2.0.

    I estimate the flash write speed on my 16GB iPhone 3G to be around 5 megabytes/sec. The iPhone takes at least twice as long to sync the same music as does my old iPod (5G 60 GB, three and a half years old), both using USB 2.0.
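A quick sketch of why the flash, not the USB 2.0 link, would be the bottleneck at the estimated write speed (the ~40 MB/s USB 2.0 ceiling is a rough real-world assumption, not a measured figure):

```python
# Quick check of the ~5 MB/s flash-write estimate above: at that rate
# the flash, not the USB 2.0 link, limits sync speed.

FLASH_WRITE_MB_S = 5    # commenter's estimate for the iPhone 3G
USB2_CAP_MB_S = 40      # rough real-world USB 2.0 ceiling (assumption)

def sync_minutes(gigabytes, mb_per_s):
    """Minutes to write the given amount (decimal GB) at a sustained rate."""
    return gigabytes * 1000 / mb_per_s / 60

bottleneck = min(FLASH_WRITE_MB_S, USB2_CAP_MB_S)
print(f"Syncing 16 GB: {sync_minutes(16, bottleneck):.0f} min at flash speed, "
      f"{sync_minutes(16, USB2_CAP_MB_S):.0f} min if USB were the limit")
```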

  • by Penguinoflight ( 517245 ) on Sunday September 27, 2009 @02:30PM (#29558287) Homepage Journal
    I think it's important because not even Apple and companies working in cooperation see the advantage of using "genuine" hardware. Why should a customer see an appreciable advantage, or spend the extra money for an Apple branded system when it's clearly just the same hunk of electronics in a different box?
  • Re:Replace? (Score:3, Informative)

    by 10Ghz ( 453478 ) on Sunday September 27, 2009 @02:46PM (#29558385)

    "replace the multitudinous connector types with a single connector" = multitudinous connector types + 1;

    That would be true of most companies. But this is Apple we're talking about. They nearly went out of business back in 1997 because they got rid of standard serial/keyboard/mouse/parallel/SCSI connectors and replaced them with USB (and occasionally Firewire).

    What exactly makes you think that Apple went nearly bankrupt (they didn't) because they dropped legacy ports? Besides, if Apple nearly went bankrupt in 1997, I fail to see how it applies, since it was the iMac that dropped the legacy technology (no floppy, and the only expansion ports it had were USB), and the iMac was released in 1998... And last time I checked, it was pretty popular.

  • Re:Put it on iPods (Score:3, Informative)

    by willy_me ( 212994 ) on Sunday September 27, 2009 @05:02PM (#29559529)

    From the photos, it looks like it is a standard USB connector. The optical part likely connects through the centre of the connector. I imagine the standard 4 copper conductors are still in place. This makes sense as it enables low cost cables and peripherals by simply using the existing USB standard.

    Future computers could use the physical connector as the only interface to the machine while retaining compatibility with existing USB devices. Kind of like how Mini-TOSLINK cables work.

  • by PhunkySchtuff ( 208108 ) <kai@automatica.c[ ]au ['om.' in gap]> on Sunday September 27, 2009 @07:03PM (#29560415) Homepage

    Apple had to drop FireWire? I don't know if you've looked at an Apple computer recently, but every single Apple computer sold today, with the exception of the entry-level white polycarbonate MacBook has FireWire.

    The iPod dock connector is what I believe you're referring to, and while it's proprietary, it's also very well documented for developers, and it carries a lot more than just plain ol' USB. It has, among other things, pins for FireWire (deprecated on iPods), analogue audio and video, and a control channel...

  • Re:Put it on iPods (Score:5, Informative)

    by MCSEBear ( 907831 ) on Sunday September 27, 2009 @07:49PM (#29560751)
    So using a 400 megabit per second Firewire port was less efficient than using a 12 megabit per second USB port? USB 2.0 did not exist yet.

    Say you have a 32 Gig flash based MP3 player. The original USB spec can fill that up in just under six hours! Convenient!

    If you have a larger 160 Gig hard disk based MP3 player, then the original USB port can fill that up in just under one day and six hours! Why would anyone want a faster interface than that?

    In comparison, the original Firewire standard can transfer 32 Gigs in just under eleven minutes. 160 Gigabytes can be transferred in just under one hour.
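The sync-time arithmetic in this comment checks out, assuming decimal gigabytes and the nominal link rates (12 Mbps for full-speed USB 1.1, 400 Mbps for FireWire):

```python
# The transfer-time arithmetic above, using decimal gigabytes and
# nominal link rates. Real-world rates would be lower for both buses.

def transfer_time_hours(gigabytes, link_mbps):
    """Hours to move the given amount of data at a nominal link rate."""
    return gigabytes * 1e9 * 8 / (link_mbps * 1e6) / 3600

for gb in (32, 160):
    for name, mbps in (("USB 1.1", 12), ("FireWire 400", 400)):
        print(f"{gb} GB over {name}: {transfer_time_hours(gb, mbps):.2f} h")
```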
  • Re:Glass fibre (Score:3, Informative)

    by MCSEBear ( 907831 ) on Sunday September 27, 2009 @08:06PM (#29560869)
    Since they are limiting the cable length to 100 meters, you don't need the same properties a telecom would need in long-haul fiber.

    There is a video from Intel's lab with more information.
  • Re:Put it on iPods (Score:3, Informative)

    by Mr Bubble ( 14652 ) on Sunday September 27, 2009 @10:16PM (#29561615)

    The iPod was new, and not yet ubiquitous. Also, they were fighting against Intel rather than with them. With Macs, iPhones, iPods, iTablets, and Intel, they can start a new standard overnight. BTW, when they switched to USB, I understood, but it was soooo much slower than Firewire.
