
Apple Unveils New iPad

adeelarshad82 writes "As expected, Apple announced the new iPad, complete with a Retina Display, quad-core processor, 4G LTE, and an improved camera. The new iPad will run the rumored A5X processor, which according to Apple will provide four times the performance of the Tegra 3. The revamped tablet will also include a 2048-by-1536 display, apparently the highest resolution in any mobile device. And finally, with 4G LTE the new iPad will provide download speeds of up to 73 Mbps; carrier partners include Verizon, Rogers, Bell, Telus, and AT&T."
  • Quad core (Score:5, Informative)

    by WilyCoder ( 736280 ) on Wednesday March 07, 2012 @03:08PM (#39277727)

    Quad-core graphics, not a quad-core CPU...

  • by dingo_kinznerhook ( 1544443 ) on Wednesday March 07, 2012 @03:16PM (#39277895)
    0.07 pounds heavier. I stand corrected.
  • by UnknowingFool ( 672806 ) on Wednesday March 07, 2012 @03:17PM (#39277907)
    Apple also announced a new Apple TV that will have 1080p for the same price as the current generation: $99. I didn't read any other changes.
  • by Picass0 ( 147474 ) on Wednesday March 07, 2012 @03:19PM (#39277949) Homepage Journal

    So get a bluetooth keyboard.

  • Re:yawn (Score:4, Informative)

    by GameboyRMH ( 1153867 ) <gameboyrmh&gmail,com> on Wednesday March 07, 2012 @03:21PM (#39277985) Journal

    Because they don't have much of a choice [slashdot.org]

  • Re:Bandwidth (Score:3, Informative)

    by UnknowingFool ( 672806 ) on Wednesday March 07, 2012 @03:21PM (#39278001)
    Well, if you are at home, you would use your home Wi-Fi, not 3G/4G. And isn't this a negative with all 4G devices?
  • 73mbps != 4G (Score:5, Informative)

    by Anonymous Coward on Wednesday March 07, 2012 @03:23PM (#39278027)

    You will never get that speed on the device. I have a 4G LTE cell phone and it doesn't even get 10% of that speed.

    Hell, it shouldn't even be allowed to be called 4G. The 4G standard specifies 100 Mbps for high-mobility devices (cellphones in cars) and 1 Gbps for low-mobility devices (people walking down the street or in their homes). This is false advertising.

  • Re:73mbps != 4G (Score:5, Informative)

    by GameboyRMH ( 1153867 ) <gameboyrmh&gmail,com> on Wednesday March 07, 2012 @03:30PM (#39278159) Journal

    Hell, it shouldn't even be allowed to be called 4G.

    That's what the standards body thought too, but the telcos started marketing 3.5G as 4G and the standards body did the worst possible thing and caved.

  • by Tr3vin ( 1220548 ) on Wednesday March 07, 2012 @03:31PM (#39278187)
    But in the mobile world, most rendering is done through tile-based deferred rendering. The frame-buffer was already being split up during rendering, so having cores work in parallel isn't that big of a change.
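
    To make the parallelism concrete, here's a toy sketch; the tile size, the stand-in shading function, and the one-worker-per-"core" mapping are all illustrative assumptions, not PowerVR's actual design:

    ```python
    # Toy model of tile-based rendering split across GPU "cores": the frame
    # is cut into independent tiles, and each worker shades its own tiles
    # without touching anyone else's pixels.
    from multiprocessing import Pool

    WIDTH, HEIGHT = 2048, 1536   # the new iPad's native resolution
    TILE = 32                    # hypothetical tile edge in pixels
    CORES = 4                    # the "quad-core" graphics part

    def shade_tile(origin):
        """Stand-in for real per-tile work: fake a flat gradient color."""
        x, y = origin
        return origin, (x * 255 // WIDTH, y * 255 // HEIGHT, 128)

    if __name__ == "__main__":
        origins = [(x, y) for y in range(0, HEIGHT, TILE)
                          for x in range(0, WIDTH, TILE)]
        with Pool(CORES) as pool:            # one worker per GPU "core"
            shaded = pool.map(shade_tile, origins)
        print(len(shaded), "tiles shaded by", CORES, "workers")
    ```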
  • by GameboyRMH ( 1153867 ) <gameboyrmh&gmail,com> on Wednesday March 07, 2012 @03:34PM (#39278249) Journal

    They managed to pack a 5 megapixel camera into that tiny tablet! Wow, we wouldn't have dreamed that was possible back when the iPad 2 came out.

    You're right, I wouldn't, because my N900 had already been packing a 5.2MP camera for over a year at the time.

    But kudos to Apple for being the first to bring such an amazing camera to a mobile device.

  • by Anonymous Coward on Wednesday March 07, 2012 @03:45PM (#39278465)

    Excuse me here, but I thought a retina display implied ~300 dpi. This one falls a fair bit short of the mark, if I'm not mistaken.

    So once again, Apple marketing triumphs over all.
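
    For what it's worth, the density is easy to check, assuming the panel keeps the iPad 2's 9.7-inch diagonal:

    ```python
    # Pixel density of a 2048x1536 panel with a 9.7" diagonal.
    import math

    w_px, h_px = 2048, 1536
    diagonal_inches = 9.7

    diagonal_px = math.hypot(w_px, h_px)   # = 2560 px
    ppi = diagonal_px / diagonal_inches    # ~264 PPI, short of ~300
    print(round(ppi), "PPI")
    ```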

  • Re:yawn (Score:5, Informative)

    by slew ( 2918 ) on Wednesday March 07, 2012 @03:55PM (#39278679)

    Not by much. The inflation rate is pretty much nil these days (CPI when the iPad 2 launched: ~217; CPI today: ~227). At $500, that's only about $20 of inflation... Since the wholesale electronics probably got cheaper, it's probably a wash for Apple's profit.

    On the other hand, median wages have been falling, so relative to typical purchasing power, the price has gone up.

    Of course just like any electronics, the specs always get better for new models, so that isn't anything to sneeze at.
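
    The back-of-the-envelope arithmetic, using the rough CPI figures above:

    ```python
    # Inflation on a $500 tablet between the two CPI readings cited above.
    cpi_ipad2_launch = 217.0   # approximate CPI at iPad 2 launch
    cpi_today = 227.0          # approximate CPI now
    price = 500.0

    adjusted = price * cpi_today / cpi_ipad2_launch
    print(f"inflation-adjusted price: ${adjusted:.0f}")   # ~$523
    print(f"difference: ${adjusted - price:.0f}")         # ~$23, i.e. "about $20"
    ```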

  • Comment removed (Score:4, Informative)

    by account_deleted ( 4530225 ) on Wednesday March 07, 2012 @03:56PM (#39278707)
    Comment removed based on user account deletion
  • by GameboyRMH ( 1153867 ) <gameboyrmh&gmail,com> on Wednesday March 07, 2012 @03:59PM (#39278793) Journal

    Wait until Apple begins to deprecate the OS on your current iPad, you'll find out ;)

  • Re:battery life (Score:5, Informative)

    by Overzeetop ( 214511 ) on Wednesday March 07, 2012 @04:06PM (#39278903) Journal

    Same as current iPad2 (10h, 9h with 4G).

  • by rhook ( 943951 ) on Wednesday March 07, 2012 @04:22PM (#39279191)

    And now you have 150+ employees who get less work done than ever before.

  • Cores: the reason (Score:5, Informative)

    by DrYak ( 748999 ) on Wednesday March 07, 2012 @05:00PM (#39279843) Homepage

    For whatever reason, the PowerVR mobile GPUs are described in number of cores.

    Well, there's a technical reason. To go back to the grandparent:

    You talk about them in terms of shaders, ROPs, TMUs and so on.

    These are organised into modules. The various models in a range are distinguished by the number of such modules. (That's what the "Streaming Multiprocessors" count is in spec tables of GeForce cards.)

    Either by product binning: the factory makes GPUs with 8 such modules, tests how many of them are actually usable, activates between 1 and 7 of them, and sells each die as a different product in the same range - from "GPU Destructor 990 XL-Deluxe Elite" (7 of the 8 modules activated, and needing three 12V connectors) down to "GPU Destructor 120 Light Laptop Edition" (only 1 usable module, but sipping only 30 watts). (And sometimes fewer modules are activated than are actually usable, thanks to supply and demand, which leads users to try to unlock cores and convert one card into the next one up the series simply by flashing a new firmware - a classic with some GeForce series.)

    Or by producing variants: the desktop range is based on an 8-module design, while the laptop range has a 4-module design at a finer process, so it uses a lot less energy.
    (I know some ATI/AMD GPUs are organised this way.)

    In the case of PowerVR, from what I remember, the GPUs were designed to be able to work in parallel (I think: each GPU taking care of a different tile of the deferred tile-based rendering). So it's very likely that the marketing-speak "4-core GPU" means "4 modules", which in fact is "4 PowerVRs working in parallel" (for a speed increase approaching 4x).
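
    As a toy illustration of the binning story above (the product names, tiers, and defect rate here are invented, not real SKUs):

    ```python
    # Simulate binning: build 8-module dies, test them, and sell each die
    # under the largest product tier its working-module count can support.
    import random

    TIERS = {8: "Destructor 990 XL-Deluxe Elite", 6: "Destructor 770",
             4: "Destructor 550", 2: "Destructor 330",
             1: "Destructor 120 Light Laptop Edition"}

    def bin_die(module_defect_rate=0.1):
        working = sum(random.random() > module_defect_rate for _ in range(8))
        for enabled in sorted(TIERS, reverse=True):
            if working >= enabled:
                return TIERS[enabled], enabled   # fuse off any extra modules
        return "scrap", 0                        # no usable modules at all

    for _ in range(5):
        print(bin_die())
    ```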

  • by blueg3 ( 192743 ) on Wednesday March 07, 2012 @05:03PM (#39279899)

    But in general it can't. GPUs are designed differently and don't actually run a large number of different shaders in parallel. They use a combination of multicore processing and data-parallel execution to run the same shader (or a small number of shaders) on a large pool of data in parallel. A lot of it is more similar to the SIMD instructions available on many CPUs than to multiple cores.
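
    A rough sketch of that execution model, with NumPy's vectorized operations standing in for the hardware's data-parallel lanes (the shader itself is made up):

    ```python
    # One "shader" program applied in lockstep to a whole batch of inputs,
    # rather than many independent programs on separate cores.
    import numpy as np

    def shader(positions):
        """Same instructions for every element: remap [-1, 1] to [0, 1]."""
        return np.clip(positions * 0.5 + 0.5, 0.0, 1.0)

    fragments = np.random.uniform(-1.0, 1.0, size=(1024, 3))  # batch of inputs
    colors = shader(fragments)    # all 1024 "lanes" processed together
    print(colors.shape, colors.min() >= 0.0, colors.max() <= 1.0)
    ```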

  • Pay More Do Less (Score:2, Informative)

    by harl ( 84412 ) on Wednesday March 07, 2012 @05:22PM (#39280213)
    It still costs more than a laptop and does less than a laptop.
  • by d4fseeker ( 1896770 ) on Wednesday March 07, 2012 @05:32PM (#39280355)
    No and yes.
    A dual-core CPU as first produced by Intel really was just two CPUs on a single die.
    However, a real multicore CPU has multiple calculation cores that share certain parts (e.g. a common L3 cache) and use an ultra-high-speed inter-core communication system.

    Furthermore, an x86 CPU is huge due to the sheer size of its internal high-clock memory and its instruction set, with additions like 64-bit and Hyper-Threading.
    The ARM CPUs found in your mobile devices are a lot smaller thanks to a different (smaller) instruction set and to being optimized for size, not peak performance.

    A GPU core, on the other hand, is even smaller (only several thousand transistors, compared to several million in an average CPU), as it has one purpose and one purpose alone: mathematical calculations.
    At least that was true until OpenCL and CUDA came along; now GPUs can execute other kinds of commands, albeit with a great performance loss compared to int and float calculations.
  • by shutdown -p now ( 807394 ) on Wednesday March 07, 2012 @05:33PM (#39280363) Journal

    There are already a bunch of Android tablets announced with 1080p screens, starting with the Asus TF700. They are probably not aiming higher than 1080p because Android apps can generally adjust pretty well to screens of different sizes and proportions.
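
    Part of why that adjusting works is Android's density-independent pixel (dp) unit: layouts are specified in dp and converted per device against a 160 dpi baseline. A sketch (the example densities are approximations):

    ```python
    # Convert density-independent pixels (dp) to physical pixels (px).
    # Android's baseline is 160 dpi: 1 dp == 1 px on a 160 dpi screen.
    def dp_to_px(dp, screen_dpi):
        return round(dp * screen_dpi / 160.0)

    # The same 48 dp button on panels of different density:
    for name, dpi in [("~224 dpi 10.1\" 1920x1200 (TF700-class)", 224),
                      ("160 dpi baseline tablet", 160)]:
        print(name, "->", dp_to_px(48, dpi), "px")
    ```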

  • by BasilBrush ( 643681 ) on Wednesday March 07, 2012 @05:35PM (#39280403)

    Right. So when you didn't think AAPL was growing, it was a "big yawn". Now that you know it is, it's a "bubble". How childish are you?

    This despite the fact that the post of mine you replied to was about iPad growth, not AAPL growth. Here, take a look at the numbers.
    http://frncs.co/apple/ [frncs.co]

    Or look at it another way. Only 2 years on the market and Apple already sells more iPads than any PC manufacturer sells PCs.

    Bubbles are irrational rises in stock price based on sentiment. AAPL's rise is based on unprecedented sales growth.

  • by vitaflo ( 20507 ) on Wednesday March 07, 2012 @06:03PM (#39280789) Homepage

    You're not getting any more screen real estate on the iPad 3; you're just getting twice the resolution. There's a difference. The browser, for instance, will still tell websites that the device width of the iPad 3 is 1024x768. So the same website will have the same layout and width on the iPad 2 and the iPad 3, but the iPad 3 will look less pixelated.
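
    In other words, layout stays in CSS pixels and the doubling happens underneath. Roughly (devicePixelRatio is the real browser property; the rest is a sketch):

    ```python
    # The browser lays out pages in CSS pixels; the panel draws each CSS
    # pixel with a 2x2 block of physical pixels (devicePixelRatio == 2).
    css_width, css_height = 1024, 768     # what websites are told
    device_pixel_ratio = 2                # iPad 3 vs. iPad 2

    physical = (css_width * device_pixel_ratio,
                css_height * device_pixel_ratio)
    print(physical)   # (2048, 1536): same layout, sharper rendering
    ```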

  • by White Flame ( 1074973 ) on Wednesday March 07, 2012 @06:23PM (#39281071)

    "it just really doesn't work well with most people, and the software (which is *absolutely* and *inextricably* part of the technology needed for this to work) is not up to the task".

    Says who? Why do you think this is the case? And no, the software has absolutely nothing to do with it. I'm looking right now at my 22" >200dpi displays running 4pt fonts (not OS scaled) in my code editors, having salvaged IBM's monitors that used to be made for medical imaging, and yes the pixels are discernable. 99% of the onscreen text is still made with 1-pixel wide strokes and is clearly legible (at normal ~10 pt non-OS-scaled sizes) to regular people over my shoulder as I show them things. Comparable pixels are also discernable on the N800 I used to use, which is also in a similar DPI range iirc.

    Lots of high-end technical users want to do the same and run things as small & sharp as are perceptible to them WITHOUT any scaling. All's that's needed are high-DPI monitors, no OS changes at all. Works fine in Windows, Linux, and whatever else you want. Again, this is a prosumer market desire, not something to foist onto the average person.

  • by Anonymous Coward on Wednesday March 07, 2012 @10:18PM (#39283231)

    There is a Crossfire/SLI mode like this but the most efficient and common one is to allow GPUs to render alternating frames.

    Alternate frame rendering is common, but it's not "efficient". It's a hack used to permit multi-GPU cooperation when the GPUs aren't really designed for it. Having everything work on the same frame in parallel is much better in practice.

    Reasons: AFR overlaps the processing of frames, so each individual frame still has the same amount of latency (increasing frame rate isn't the only important thing; latency is always important).

    Also, 3D APIs aren't designed for AFR, so applications usually don't have good control over how the driver decides to manage it. The API model is to submit work for one frame at a time, close it, then start a new frame. This can result in zero performance gain, since some applications serialize themselves on true frame completion (often in order to do post-processing on a frame after it's done rendering).

    Finally, AFR has long been plagued by timing jitter. Ideally, if you have 2 GPUs that can each produce 1 frame every 20ms, you want GPU 1 to output at 0ms, 20ms, 40ms, etc., while GPU 2 outputs at 10ms, 30ms, 50ms, etc., so that the sequence seen by the eye is 0, 10, 20, 30, 40, 50 -- smooth motion. Instead, it's common for AFR systems to drift and start doing bad things like 0, 5, 20, 25, 40, 45, etc. This uneven frame-to-frame timing (jitter) is visibly unpleasant.
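
    A tiny simulation of that pacing problem (all timings invented): two GPUs at 20 ms per frame, ideally offset by 10 ms, versus a drifted schedule offset by only 5 ms:

    ```python
    # Frame-to-frame gaps under ideal vs. drifted alternate-frame rendering.
    def timestamps(offset_ms, frames=6, period_ms=20):
        gpu1 = [i * period_ms for i in range(frames)]
        gpu2 = [t + offset_ms for t in gpu1]
        return sorted(gpu1 + gpu2)

    def gaps(ts):
        return [b - a for a, b in zip(ts, ts[1:])]

    print("ideal:  ", gaps(timestamps(10)))  # [10, 10, 10, ...] smooth
    print("drifted:", gaps(timestamps(5)))   # [5, 15, 5, 15, ...] jitter
    ```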

    These problems cause some review sites (AnandTech) to recommend against multi-GPU unless you absolutely need it to drive a crazy number of pixels. An equally powerful single GPU (if you can get it) is usually a better choice (if you can afford it), because its performance will be usable across more games and there is no chance of it suffering from jitter.

    The iPad GPUs are PowerVR tile based renderers, so adding more "cores" improves the amount of parallelism in rendering the tiles of a single frame rather than implementing an AFR scheme.
