Nvidia Blames Apple For Bug That Exposes Browsing In Chrome's Incognito (venturebeat.com) 165
An anonymous reader points out this story at VentureBeat about a bug in Chrome's incognito mode that might be a cause for concern for some Apple users. From the story: "If you use Google Chrome's incognito mode to hide what you browse (ahem, porn), this might pique your interest. University of Toronto engineering student Evan Andersen discovered a bug that affects Nvidia graphics cards, exposing content that you thought would be for your eyes only. And because this only happens on Macs, Nvidia is pointing the finger at Apple."
You're doing it wrong (Score:3, Funny)
>> I didn’t expect the pornography I had been looking at hours previously to be splashed on the screen
I think you're either doing it wrong or you're not looking at the right stuff. (Hours? Really?)
Re:You're doing it wrong (Score:5, Funny)
>> I didn’t expect the pornography I had been looking at hours previously to be splashed on the screen
I think you're either doing it wrong or you're not looking at the right stuff. (Hours? Really?)
Not everyone's a minute man, Johnny Boy.
Re:You're doing it wrong (Score:4)
>> I didn’t expect the pornography I had been looking at hours previously to be splashed on the screen
I think you're either doing it wrong or you're not looking at the right stuff. (Hours? Really?)
"Hours previously" could simply mean "last night" (~8 hrs ago when you wake up).
It's your own fault Apple (Score:2, Insightful)
You insist on having your own slow ass OpenGL implementation for our cards, I guess you fucked up on security too.
Re: (Score:2, Informative)
You insist on having your own slow ass OpenGL implementation for our cards, I guess you fucked up on security too.
Patches from your proprietary GL implementation donated to the OpenGL Open Source project welcome, nVidia... don't bitch that it's slow when you're able to fix the slow yourselves.
Re: (Score:3, Insightful)
You insist on having your own slow ass OpenGL implementation for our cards, I guess you fucked up on security too.
Patches from your proprietary GL implementation donated to the OpenGL Open Source project welcome, nVidia... don't bitch that it's slow when you're able to fix the slow yourselves.
You don't understand. *Apple* insists on having their own OpenGL implementation for the GPUs they use (so they have identical GL API support on Intel, AMD and Nvidia). They don't use Nvidia's proprietary driver code, nor open source code, and since they don't care about performance (because of Metal), their implementation is slow-ass...
Now get off my lawn ;^)
Re: (Score:2)
Fighting is simple: buy non-DRM content only.
Which feature films are non-DRM? And do they cover a wide variety of genres?
Re: (Score:1)
> Which feature films are non-DRM?
Maybe we could have lists of non-DRM movies (probably just Creative-Commons right now) and perhaps DRM ones so we can avoid them.
> And do they cover a wide variety of genres?
Even if we establish a DRM-free area in just one genre (e.g. sci-fi), that would still be worthwhile.
This is prosumerism: if we cannot buy them, we can make them.
Want to watch video game documentaries all day? (Score:2)
Which feature films are non-DRM?
Maybe we could have lists of non-DRM movies (probably just Creative-Commons right now)
The Creative Commons movies I can think of are Blender tech demos [blender.org] such as "Big Buck Bunny" and "Sintel". These are shorts, not feature-length.
Even if we establish a DRM-free area in just one genre (e.g. sci-fi), that would still be worthwhile.
I was trying to allude to FSF's guide to DRM-free video [defectivebydesign.org], which links to GOG.com's movie section [gog.com]. And last time I checked, GOG.com's movie section was full of video game documentaries and little else.
This is prosumerism, if we cannot buy them, we can make them.
I have a couple questions that would need to be answered before that can become practical: Who pays for their production? And who would pay the damages if, say, it turns o
Re: (Score:2)
Apple are welcome to base their drivers around Nouveau and Mesa. :)
Re: It's your own fault Apple (Score:5, Interesting)
iOS saves screenshots of applications for the task selector thingy and also for "fast" application switching, where the screenshot is used for the zooming effect and as a placeholder while the real application is still being (re)loaded. There is a separate screenshot for each orientation. It is possible that you launch or switch to the browser or some other application and iOS will display a possibly very old screenshot of your private porn browsing session or some other private stuff that you had closed and purged from the logs ages ago. During the application switch effect the old screenshot is visible only momentarily, but the same images can also be viewed from the task selector.
1. Device at orientation A: open browser, enter private mode and browse for some pron.
2. Switch to the home screen (screenshot is saved) and change to orientation B
3. Go back to browser and close all pron tabs
4. Switch to the home screen (screenshot is saved but this one is for orientation B)
5. Change back to orientation A and enter the task selector or go back to the application. The old private browsing screenshot should be visible.
Re: (Score:2)
I've got a similar problem happening with one of my applications with a mobile GPU (980m). Resizing a texture causes all sorts of odd contents to appear in the mip-map levels of that texture.
Re: (Score:2)
I take it you've never worked with secured material or systems. What you suggest would be an extreme violation of the rules. If the system is cleared for TS then it is never to be provided to someone with a lower clearance. Heck, dealing with merely Secret material requires special handling. For example, Secret must be protected in a safe. At least some years ago the common practice was to use removable hard drives: the office is secured during work, and at the end of the day all of the hard drives were secured in the safe.
Re: (Score:2)
Depends on the system. Some systems are specifically designed to handle multiple levels of material and control access based on the authorizations of the logged in user. The code has been modified to tag all information in the system with its security level and prevent data transfer from a high security window to a lower security window by pretty much any means other than manual transcription.
As I recall they were intended for use on ships, where space limitations makes having entirely separate systems less
Re: (Score:2)
This is not merely about having embarrassing porn exposed - someone may use a Mac for top secret work, close the application, and hand it over to someone with lower clearance.
Sincerely, the guy in charge of Hillary Clinton's email server.
Except it's not. (Score:5, Insightful)
This isn't just on Apple's OS. While I have nothing like Mr. Andersen's writeup to prove it, I've seen this kind of bug happen on Windows.
Re: Except it's not. (Score:5, Informative)
Re: (Score:2)
This is a huge side channel that can be exploited to communicate between "isolated" processes. You could, e.g., have some malicious JavaScript working on an off-screen WebGL context inside a sandbox, and it could use this to communicate with malware running elsewhere on the same host. This is pretty bad IMHO.
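As a concrete illustration of the parent's point, here is a rough, hypothetical probe of my own devising (not Andersen's published demo). It assumes GLFW and GLEW and a desktop GL 3.0+ context, asks the driver for texture storage without supplying any pixel data, and reads the storage back. The GL spec leaves those contents undefined, so whether anything interesting shows up depends entirely on whether the driver scrubs recycled VRAM.

/* Hypothetical uninitialized-VRAM probe (sketch; GLFW/GLEW assumed). */
#include <stdio.h>
#include <stdlib.h>
#include <GL/glew.h>
#include <GLFW/glfw3.h>

int main(void) {
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);      /* no need to show a window */
    GLFWwindow *win = glfwCreateWindow(64, 64, "probe", NULL, NULL);
    if (!win) return 1;
    glfwMakeContextCurrent(win);
    glewInit();

    const int W = 1024, H = 1024;

    /* Ask for texture storage with no initial data; the driver decides
       whether the backing VRAM is zeroed or handed over as-is. */
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, W, H, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    /* Attach it to a framebuffer so the "contents" can be read back. */
    GLuint fbo;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);

    unsigned char *pixels = calloc((size_t)W * H * 4, 1);
    glReadPixels(0, 0, W, H, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    size_t nonzero = 0;
    for (size_t i = 0; i < (size_t)W * H * 4; i++)
        if (pixels[i]) nonzero++;
    printf("non-zero bytes in \"uninitialized\" texture: %zu\n", nonzero);

    free(pixels);
    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}

On a driver that pre-clears allocations the count stays at zero; on one that recycles dirty memory you may see fragments of whatever was rendered earlier.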
Re: (Score:2)
There was an article some time ago on how you could get an application to scan the USB devices for keyboards and other devices, figure out the location of the device driver in memory and the character buffers used by interrupts, then pass this information onto a CUDA or OpenCL application and do all the snooping on the GPU. The application containing the original USB scanner could then terminate.
Re: (Score:2)
I don't know if it still does, but the open-source radeon driver used to do something similar as well... when you logged in to X. It all started when they committed the fix for the screen briefly displaying jumbled garbage when logging in. Yeah, after that it didn't display garbled garbage on the screen, but instead showed a screen from an old browsing session, which was also cached somewhere on disk and so persisted even between full power-downs.
Hell Chromebooks did it as well when context switching between
Re: (Score:2)
I've seen it on GNU/Linux with Nvidia cards and their non-free driver for several years. This is not new and it's not just Chrome.
That is because OpenGL does not require NVidia to do the right thing. Most other drivers do, but NVidia doesn't because it might impact performance, and they are not forced to do it. As far as I can parse their blame on Apple, they are basically saying Apple is not requiring them to blank textures either, so NVidia doesn't. It is Apple's fault for not forcing NVidia.
Re: Except it's not. (Score:5, Informative)
The OS has very little control here, since the memory is not handled as memory, but as graphics resources. What's passed around isn't memory pages but texture buffers etc. These are managed by the graphics driver, and the OS expects the driver to do the right thing. I don't even think it's possible for the OS to handle this properly without there being clear API protocols that give the OS enough knowledge about what resources are passed around, and when they should be zero-initialized.
Re: (Score:2)
Except that Apple insists on writing most of the driver themselves, otherwise you'd be right. Nvidia doesn't get to write the driver for Macs; that's why there are never any direct graphics driver updates from any GPU provider for Apple hardware, it all goes through Apple. Apple gets the blame because they're the ones doing the development.
Re: (Score:3)
Me too, and on tablets as well, in "Edge", no less. Usually it's very short-lived. To me the lesson is: applications need to erase all memory before closing a "private" session if the OS doesn't guarantee it.
Re:Except it's not. (Score:4, Interesting)
Unlikely, because Windows does enforce clearing of newly allocated memory, including on the GPU. The drivers would fail WHQL certification if they didn't. They probably didn't bother on Mac OS, either because of an oversight or to get a little more performance.
It might be possible within specific apps if they mismanage GPU memory, but certainly not across apps as described in TFA. Well, unless there is some unknown bug, but Nvidia are saying there isn't and it is tested for WHQL certification.
Gonna need to see some more evidence than an anecdote I'm afraid. All available evidence says that Windows is unaffected.
Re: (Score:2)
Unlikely, because Windows does enforce clearing of newly allocated memory, including on the GPU. The drivers would fail WHQL certification if they didn't.
I see pieces of other applications while starting or quitting applications all the time through my nVidia driver, and sometimes those things are old. nVidia is definitely not scrupulous about clearing video memory on Windows.
Re: (Score:2)
Well, since it's anecdote vs anecdote, I'm on an nVidia based system running windows and have never seen it happen. So proof?
Nobody has provided any proof in this discussion, so it seems like demanding it is setting a relatively high bar. It's the kind of thing that's difficult to demonstrate if you're not recording all of your video output. I see it happen after programs have crashed.
Re: (Score:2)
Thing is, if GP is correct, it does happen. If it never happens to you, that's not good evidence that it never happens. A single data point can prove the existence of something.
Re: (Score:2)
Windows does enforce clearing of newly allocated memory, including on the GPU
It doesn't. Really. It only clears the paged memory when pages change ownership or are first initialized. A set of pinned pages is retained by the graphics driver for as long as the driver wishes, and it's completely up to the driver to clear those when the logical ownership of the resource stored on any given page is assigned to a new process. Furthermore, it might not even be that a single page in the GPU memory belongs to one process only. It might store textures or other buffers from two or more process
Re: (Score:2)
And it isn't just Chrome either. In fact it has nothing to do with Chrome. It applies to any application you may have that might display content/information that you don't want to be randomly visible at a later time.
Re: (Score:2)
I've experienced the issue in multiple iOS revisions and devices as well
Simple explanation (Score:3, Insightful)
So, your program allocates some memory. Should it initialize the memory to make sure it's all a bunch of zeros? Apparently, Nvidia doesn't think so.
So, a program running on your OS requests some memory. Should the OS initialize the memory before handing it to the application? Apparently, Apple doesn't think so.
Either answer is right.
Re: (Score:2, Informative)
It needs to be cleared by the OS 100%. The OS can't expect/assume this is done elsewhere, or stuff like this happens.
Re: Simple explanation (Score:1)
The application doesn't know the physical addresses. If its memory got paged out, it probably got put back at a different location. Only the OS can do it. The application doesn't have the information or the access.
Re: (Score:2)
(For as far as I know it was taught a lot earlier, but I started in the 80s.)
Re: (Score:2)
It is nice if the OS does the clearing, but the application programmer should not rely blindly on it.
For the application programmer, it depends on how much a data leak in the application would hurt.
-If the OS specifies that memory is cleared, relying on it it is IMHO OK for low security applications.
-For high security applications, always clear the memory. Don't trust that the OS does it.
Re: (Score:3)
The OS has the responsibility to ensure that there is no unintentional sharing of data between processes.
In general, it's impossible for the OS to guarantee that, since drivers are nominally accessible by multiple processes at once. Say a serial port driver has a bug and doesn't wipe the buffer between closing and reopening of the device: the OS can't help; it's a driver bug that leaks data between processes. Same goes for graphics drivers.
Here's what happens: GPU memory is allocated in chunks of wildly varying sizes. To maximize the amount of usable GPU memory, the objects (buffers) owned by various cli
Re:Simple explanation (Score:4, Informative)
So, your program allocates some memory. Should it initialize the memory to make sure it's all a bunch of zeros? Apparently, Nvidia doesn't think so. So, a program running on your OS requests some memory. Should the OS initialize the memory before handing it to the application? Apparently, Apple doesn't think so. Either answer is right.
Not really. An application will typically allocate and release memory all the time, being forced to clear it every time is massive overkill and a performance problem. The driver exposes the GPU memory, the OS allocates it to applications just like with RAM. It's the only one that knows when memory switches application context and must be cleared. So there's really only one sane solution.
Re:Simple explanation (Score:5, Insightful)
The usual solution is basically:
This works well as long as the CPU is in charge, ensuring that any dirty data must have originated in some other part of the app (by reusing a pool region). Where it starts to get hairy is when you have a GPU that has access to all of RAM and uses a separate page table with separate COW flags, etc.
I'm not certain what went wrong in this particular case. However, I do remember a really annoying change in about 10.6 or 10.7 where Apple stopped using a vertical blanking interrupt to control various aspects of the GPU's operation and maybe some other parts of the OS. This improved battery life, IIRC, but the result is that you'll often see the GPU draw a frame of video before the previous contents of VRAM have gotten wiped. I would not be at all surprised if that was what happened here.
As for whose responsibility it is to clear the memory, my gut says that if Chrome wants to guarantee that its video buffers are cleared, Chrome is responsible for doing it. Otherwise, it should assume that VRAM is a shared resource, and anything it puts in VRAM can potentially be accessed by any other app at any time for any reason. With that said, I'm open to other opinions on the matter.
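For the "Chrome should do it itself" position, here is a minimal sketch of what that could look like, assuming a current desktop GL 3.0+ context (the function name and structure are mine, not anything from Chrome): before releasing a texture that held sensitive content, attach it to a framebuffer, clear it on the GPU, and only then delete it.

#include <GL/glew.h>

/* Hypothetical scrub-before-release helper (not Chrome code): overwrite a
   texture's storage on the GPU before handing it back to the driver. */
void scrub_and_delete_texture(GLuint tex) {
    GLuint fbo;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);

    glDisable(GL_SCISSOR_TEST);      /* glClear honors the scissor box */
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClear(GL_COLOR_BUFFER_BIT);    /* clears the whole attachment on the GPU */
    glFinish();                      /* make sure the clear lands before deletion */

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glDeleteFramebuffers(1, &fbo);
    glDeleteTextures(1, &tex);
}

An incognito teardown path could run something like this over every texture it created; the cost is one GPU-side clear per buffer, paid once at close.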
Re:Simple explanation (Score:4, Informative)
It's not just about a moment of graphical corruption; that's an annoyance. But a process being able to access the RAM leftovers from a previous process is begging for memory-based attacks. Even though it's on the GPU, it's a vulnerability. What's to say that GPU wasn't just displaying banking info? The OS should not assume the application is friendly and will blank the VRAM itself. That security is on the OS.
Re: (Score:2)
In principle, I agree with you. In practice, though, this sort of bug is really easy to make when designing
Re: (Score:2)
Copy on write does not work that way. It copies the originally mapped page, which is a single pre-zeroed physical page that is kept around specifically for that purpose. That copy operation completes (thus wiping the victim physical page) before the OS returns control to the process.
Re: (Score:3)
Clearing allocated memory before handing it to applications is required by POSIX and generally a really, really good idea from a security point of view. The performance hit is minimal (modern CPUs and RAM can write gigabytes per second easily) and can be mitigated, e.g. by using a pre-allocated pool for small allocations.
All modern operating systems clear allocated memory. It's basic security to stop one app stealing data from another, or even worse the kernel. You could do it the other way around and try t
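A minimal sketch of the pool mitigation mentioned above, under my own assumptions (fixed-size chunks, no thread safety): grab one large zeroed region up front and re-zero each chunk when it is returned, so callers always receive zeroed memory without a per-allocation trip to the OS.

#include <stddef.h>
#include <stdlib.h>
#include <string.h>

#define POOL_SIZE  (1u << 20)    /* 1 MiB backing region */
#define CHUNK_SIZE 256u          /* fixed-size chunks for simplicity */

static unsigned char *pool;
static unsigned char *free_list[POOL_SIZE / CHUNK_SIZE];
static size_t free_top;

void pool_init(void) {
    pool = calloc(POOL_SIZE, 1);             /* arrives already zeroed */
    for (size_t i = 0; i < POOL_SIZE / CHUNK_SIZE; i++)
        free_list[free_top++] = pool + i * CHUNK_SIZE;
}

void *pool_alloc(void) {
    return free_top ? free_list[--free_top] : NULL;   /* always zeroed */
}

void pool_free(void *p) {
    memset(p, 0, CHUNK_SIZE);                /* scrub on release, not on reuse */
    free_list[free_top++] = p;
}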
Re: (Score:2)
This of course is true, but what is handled by the graphics driver is not memory, but OpenGL primitives like buffers. The OS has absolutely no bearing on any of that, it's up to the driver to do the right thing. Just as a graphics driver can pass unclean buffer primitives around, a serial driver bug might not clear a communications buffer, or a fancy keyboard driver with OLED displays in the keys might leak the data from your porn setup where the function keys have pics of your fave pornstars on them :)
Basi
Re: (Score:2)
So, your program allocates some memory. Should it initialize the memory to make sure it's all a bunch of zeros? Apparently, Nvidia doesn't think so. So, a program running on your OS requests some memory. Should the OS initialize the memory before handing it to the application? Apparently, Apple doesn't think so. Either answer is right.
Not really. An application will typically allocate and release memory all the time, being forced to clear it every time is massive overkill and a performance problem. The driver exposes the GPU memory, the OS allocates it to applications just like with RAM. It's the only one that knows when memory switches application context and must be cleared. So there's really only one sane solution.
No. The driver knows as well. There is a concept called OpenGL contexts, and they can be configured to share texture data with each other. The problem is that the driver leaks texture data between contexts that shouldn't be sharing texture data. They know perfectly well those contexts should not be sharing texture data.
Re: (Score:2)
There is a reason why C has memory allocation functions like malloc, calloc, realloc, alloca ...
and there is a reason why C++ says: CTORs are only called if the class defines a CTOR.
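A tiny demonstration of the distinction being drawn here (behavior depends on the allocator; this sketch assumes a typical glibc-style heap): the OS zeroes pages the first time a process gets them, but memory recycled within a process by malloc after free is not scrubbed, while calloc always hands back zeros.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    char *secret = malloc(64);
    strcpy(secret + 32, "hunter2");    /* stay clear of allocator metadata */
    free(secret);                      /* freed, but not scrubbed */

    char *reused = malloc(64);         /* often the same heap block comes back */
    printf("malloc sees: %.8s\n", reused + 32);   /* frequently still "hunter2" */

    char *clean = calloc(64, 1);       /* guaranteed zero-filled */
    printf("calloc sees: %d\n", clean[0]);

    free(reused);
    free(clean);
    return 0;
}

The same logic carries over to GPU allocations: a buffer handed back by the driver may be recycled storage, so anything sensitive has to be scrubbed by whoever knows it is being recycled.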
Re: (Score:2)
All modern OSs will initialize the memory because there is a clear security issue with allowing one application access to the old contents of a random block of memory. It could contain passwords or who knows what else.
On the other hand, GPU memory is primarily used for rendering graphics. The security implications are less severe if information leaks. Has there ever been any guarantee information won't leak? So why do users assume that it won't? It is likely NOT cleared for speed reasons. Everyone wants a f
Re: (Score:2)
Arguably, when you request memory from a modern OS, basic security says it shouldn't be filled with random stuff from other programs.
This has been true since at least the 90s.
Multi-processing environments have been solving this for years, like they're supposed to.
Re: (Score:2)
If you take a strict view, memory should be cleared as soon as it's released, not necessarily cleared at allocation.
However clearing unused memory (RAM or Disk) is also impacting performance. The performance penalty can be easier to hide when freeing memory than when allocating it - but you need a nice background thread to take care of that.
LOL. I think, given Apple's graphics performance overall, they don't really care about performance, and optimizing THIS would make fuck-all difference to the SHIT performance of their graphics.
Not cool (Score:2)
Does that mean I have to throw away my porn iPad and go back to my porn ChromeBook?
I hate that. Just moving the bookmarks will take forever.
Re: (Score:3)
Does that mean I have to throw away my porn iPad and go back to my porn ChromeBook?
I hate that. Just moving the bookmarks will take forever.
Joke Fail.
You're using Chrome on both, so bookmarks are synced through your Google account.
Re: (Score:2)
Erm, is actually anyone using that feature?
I for my part don't. I like it that different devices have different bookmarks. The bookmarks I really want to share are on delicious anyway.
Re: (Score:2)
You'd really put porn bookmarks in the cloud?
I don't use Chrome sign-on ever, even for regular browsing.
Re: (Score:2)
I don't use Chrome sign-on ever, even for regular browsing.
If you're that paranoid, you shouldn't be using Chrome to start with.
Sharing malware, no thanks (Score:2)
I'm not paranoid, it's based on an unpleasant incident.
Two years ago, many of my friends complained that they were receiving spam from one of my Outlook.com email addresses. It was weird because it was not the sign-in address for my Outlook.com account; the spam was sent using one of my aliases that I used only with a Google account for non-important stuff (Chrome, Youtube, Google search preferences and such but no Gmail) on one specific machine.
I didn't know how this happened, so I turned off that laptop (
Re: (Score:2)
Who said anything about hiding? It's containment. The best way to browse porn websites is probably a VM but it's not as convenient as a mobile device.
Easy Fix for the Paranoid: Cold Reboot (Score:4, Interesting)
I've done some GLSL programming and it's not unreasonable for clearing a GPU buffer to take 1/20 to 1/10 as long as the actual operation on that buffer. How many Nvidia users (read: gamers) would prefer to take a 5% performance hit to prevent occasional glitches like this?
This has absolutely nothing to do with Nvidia's drivers. It is a glitch in Diablo III and maybe something Chrome could address for the paranoid out there. Meanwhile, if you're really that worried about someone seeing a glimpse of your porn hours earlier, just turn your computer off/on before allowing anyone to use it next. Problem solved.
Clear on exit (Score:1)
"You only have to clear the buffer once on exit."
One of the cases I've heard of this is during a crash. In that case, you may have no clean exit in which to clear the buffer.
Re: (Score:2)
That would mean that the application has to keep every texture, framebuffer and memory block it ever reserved on the GPU allocated. If anything, the GPU driver is going to have to maintain a "must-be-wiped" list of memory blocks that are cleared when the application is closed or when they are reused.
Re: Easy Fix for the Paranoid: Cold Reboot (Score:4, Interesting)
There isn't a single OS that doesn't do this. You wrote a bunch of crap that has nothing to do with what the GP wrote.
Re: (Score:1)
Huh, all the real operating systems made since the nineties have had resource tracking and cleanup on exit. The memory, GUI resources, sockets, etc. are always freed by the OS upon exit of the application. The same should apply to any OpenGL stuff as well, or does the Apple system really run out of GPU memory if an application crashes or forgets to free the resources it had allocated?
Re: (Score:2)
Your first paragraph is completely wrong. You've been moderated accordingly and Dog-Cow told you as much, but I will provide an explanation.
The application is mostly not responsible for cleaning up after itself, except for some shared resources (for example, POSIX shared memory).
In your "test", you don't actually get a zombie process. Ironically, a zombie process would mean that resources have been cleaned.
A zombie process is a process that has/was terminated, and only the in-kernel process structure con
Re:Easy Fix for the Paranoid: Cold Reboot (Score:4, Informative)
The thing is, for security the operating system should scrub memory before supplying it to an application. Otherwise you get all kinds of data leakage. The virtual memory system does this when an application requests more pages. SPARC CPUs generate an interrupt when they run out of "clean" register windows. NTFS ensures that sectors which are allocated but not yet written in a file read as zeroes (FAT32 on Windows 95 didn't; you'd read back whatever was there on the disk). By the same token, the OS should scrub GPU resources before supplying them to an application. You don't need to do this on every allocation, only when the allocation comes from RAM that was not previously assigned to that application.
Re: (Score:2)
Clearing the buffer on app context request or at context release is a one time event.
And no, it doesn't take long, it is in fact the quickest way to touch every pixel. Anyone who's telling you it takes too long is using the wrong API to do it.
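For what it's worth, a sketch of the two approaches being contrasted, assuming a current desktop GL context and a texture that already has W x H RGBA storage (both functions are illustrative, not from any particular codebase): uploading a zeroed buffer streams every byte across the bus from the CPU, while a framebuffer clear touches every texel entirely on the GPU.

#include <stdlib.h>
#include <GL/glew.h>

/* The slow way: push w*h*4 zero bytes from the CPU into the texture. */
void wipe_by_upload(GLuint tex, int w, int h) {
    unsigned char *zeros = calloc((size_t)w * h * 4, 1);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                    GL_RGBA, GL_UNSIGNED_BYTE, zeros);
    free(zeros);
}

/* The fast way: let the GPU clear its own memory via a framebuffer clear. */
void wipe_by_clear(GLuint tex) {
    GLuint fbo;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glDeleteFramebuffers(1, &fbo);
}

On GL 4.4 and newer, glClearTexImage does the same job in a single call, but that version isn't available everywhere (macOS tops out at 4.1), so the framebuffer route is the portable one.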
Blame Chrome (Score:5, Interesting)
Chrome advertises its Incognito mode as leaving no traces behind. Therefore, it should be responsible for wiping its framebuffer, just as it clears caches, cookies and history. It's like writing a file shredder that doesn't actually overwrite files, then blaming the OS and hard drive manufacturer for the oversight.
It might be nice if framebuffers and such were zeroed on release, but like overwriting files, it's a time/energy/security tradeoff. Besides, the screen isn't really protected anyway; IIRC applications on most OSes can capture the screen without even admin privileges. After apps are sandboxed into seeing only their own windows we can talk about securing the framebuffer.
This, this, this! (Score:5, Insightful)
Chrome advertises its Incognito mode as leaving no traces behind. Therefore, it should be responsible for wiping its framebuffer, just as it clears caches, cookies and history. It's like writing a file shredder that doesn't actually overwrite files, then blaming the OS and hard drive manufacturer for the oversight.
This, this, this!
If it's incognito, it should not trust anyone else to ensure the privacy of the user's data, not even the OS. We already know that it's possible to use CPU cache bugs as a covert channel to snoop on other processes running on your computer; if the application claims to maintain security, it needs to zero the memory itself.
As an aside, a GPU is a better machine for zeroing pages than the main CPU, and won't pipeline stall or time stall the main CPU by doing it, and GPUs are traditionally really good at manipulating large amounts of memory. So one has to wonder: why doesn't nVidia expose a primitive that Chrome can then use to zero the pages of a frame buffer, before or after it is used?
Re: (Score:1)
Yeah, it seems to me incognito mode should do this as it's shutting down a tab, 'to remove all traces'. That said, apps should probably zero out before they start as well, since they are the ones that look unkempt when they display legacy data.
Re: Blame Chrome (Score:4, Insightful)
There are also limits to what a program promises versus what it can do. File shredding is a prime example: modern SSDs do not even write to the same physical location every time you write to the same file. Battery-backed controllers fool the OS into thinking a certain action was completed while it really wasn't committed to disk yet. If you pull a disk between the shredding event and the cache flush, you could easily read things. Heck, if your magnetic drive marks a portion of itself as "bad blocks", the data on those blocks doesn't get overwritten; SSDs have cells that can physically go "read-only", and with the right tools you can read the data in the "bad blocks" or "read-only" cells.
Re: (Score:2)
SSDs have a very good secure erase mode, but it is very low level. I had to do it once, when I forgot the password on my Samsung portable SSD. Basically the drive sends a concurrent pulse to all cells and then drains them (that's what I understood). It took a very short time, and since it happens to the entire drive, and the initial data was encrypted anyway, I don't think any data would be recoverable after that point.
But this is not an advertised feature, and I had to speak with the customer service to
Re: (Score:2)
It wouldn't clear any blocks that are so worn out that they've gone read-only, would it? There are states in SSDs where the cells have individually become ROMs.
Re: (Score:2)
Chrome advertises its Incognito mode as leaving no traces behind. Therefore, it should be responsible for wiping its framebuffer, just as it clears caches, cookies and history. It's like writing a file shredder that doesn't actually overwrite files, then blaming the OS and hard drive manufacturer for the oversight.
Copy/paste from Chrome's incognito mode. The emphasis is theirs.
Pages that you view in incognito tabs won't stick around in your browser's history, cookie store or search history after you've closed all of your incognito tabs. Any files that you download or bookmarks that you create will be kept. Learn more about incognito browsing [google.com]
Going incognito doesn't hide your browsing from your employer, your internet service provider or the websites that you visit.
So they don't advertise that it leaves no traces behind. In fact it's quite obvious that it does leave things behind.
Re: (Score:2)
Re: Blame Chrome (Score:2)
If the OS (or OpenGL driver) changes an application's memory allocation from physical pages A-F to pages U-Z, it doesn't tell the application or give the app a chance to clear the memory first. The app *cannot* guarantee that its memory is cleared before being handed to another process.
Re: (Score:2)
Exactly. Applications CANNOT be expected to zero memory. The OS absolutely MUST be required to do that, to be considered in any way secure.
They've been doing it with system RAM for a long time. That practice needs to be standard for VRAM as well.
incognito starts remembering history (Score:3, Interesting)
Re: (Score:2)
Nice try. I've been caught by those "watch this GIF long enough and something amazing will happen" jokes before. You won't fool me again!
It doesn't just happen on Apple (Score:2)
I've got an older GTX 760 running on an HP Z820. I run ubuntu on this thing and use nvidia-352 drivers. When I log out of gnome3 and log back in through lightdm, I see the same exact symptoms. I can see what was previously displayed on my framebuffer, including firefox and chromium windows.
Skim reading (Score:2)
*opens link*
... pornography ... splashed on the screen ...
*closes window*
Not just Macs (Score:1)
If you enable 3D mode in your VirtualBox VMs, you will see the screen buffer contents when you reboot them. I tried this with RHEL7 guests on a Linux host; the host has an NVIDIA card.
Also, with the same NVIDIA setup, if I boot from RHEL into Ubuntu, I can see the RHEL screen buffers in Ubuntu when logging into the desktop.
NVIDIA isn't clearing the buffers properly.
Re: (Score:2)
This is Nvidia being Nvidia. They fucked up in their drivers, AGAIN, and are buying time by pointing blame and spewing nonsense until a new driver comes out next week that randomizes frame buffer addresses for different applications. Then they will crow about how they fixed someone else's problem, because they are such nice people.
This story is 60% bullshit, with 40% slashdot dupe added in, as we already saw this one on Windows earlier in the week.
Only part of the problem. (Score:3)
There are two real issues here.
The first is that malicious programs could open up, grab screen buffers, and get access to stuff that had been on the screen to use for their nefarious purposes.
This is bad, and unless we get decent support to isolate the frame buffers (and other graphics memory) between apps at either the driver or hardware layer, it's not going away anytime soon. Don't want this? A power cycle (all the way off, not just hibernate) between application launches would do it.
The second is sloppy programming on the part of non-malicious applications. That's what is being talked about in the article. Diablo apparently asked for a frame buffer and then presented it, as is, to the user without putting what it wanted in place, trusting it to be in a particular state. Which it wasn't.
You want a black screen to show to the user? Then write zeros into your buffer before you show it to the user. Decent compilers/languages will tell you if you've tried to read from uninitialized variables, and you should never trust that anything you've asked for dynamically is in a safe state, unless you've explicitly requested that it be cleared before being handed to you. Why should a resource from the graphics card be treated any differently?
NVidia is right about one thing here - most of the time, nearly all of the time, the thing you do with that buffer you're given is to write your stuff into it, completely overwriting it, and it would slow things down if they had to guarantee that it was cleared before handing it out to you. If your program doesn't care enough to do so itself, that's not really their fault.
It would be nice if, on program exit, all GPU resources used by that app were flushed, but again, that would involve the OS needing to be told of all the GPU resource allocations and deallocations so it could clean up properly, and that too would probably slow things down. Not a lot, but enough to be annoying when your game stutters.
It's *everyone's* fault (Score:2)
As far as I'm concerned, it's *everyone's* fault. What we have here are a bunch of companies that are playing an immature pass the buck game.
Chrome's incognito is supposed to be secure. Wouldn't any reasonable person expect a wipe of used VRAM to be included as part of the cleanup process when an incognito window is closed? I know I would. But they don't, because they expect it to be handled by the driver.
NVidia's driver should be wiping memory that has been released by the calling app. It's *their* driver
Shouldn't Chrome should be doing this? (Score:2)
Isn't it very likely that users would have 'regular' Chrome running almost all the time and periodically open up incognito tabs to do banking or just browse pr0n? Once finished, they would close the incognito tabs/windows but would most likely keep Chrome itself running for a good while longer.
Another use case is working in MS Word on two documents at once. One is top secret, the other is not, you finish with the top secret one and close it, but you keep working on the other document, keeping Word open.
In
Reproducer (Score:2)
Re: (Score:1)
So are you implying that this bug is due to wanting to keep images in RAM for advertising purposes?
Re:Chrome? (Score:5, Insightful)
No, his reason is that sweet sweet +5 insightful. We don't need your facts around here.
Re:Chrome? (Score:5, Informative)
Somehow, the idea that people would trust incognito mode in a browser made by a company whose profits mainly come from targeted advertising strikes me as really hilarious.
Why? They are two different and not incompatible processes. The company performs analytics and collects information about you to store on its servers. The incognito mode is designed to ensure a trace of the browsing session is not left on your PC.
There is a very big difference between the form of data collection here as well as the result of it. Mother is not going to know I search for dirty things based on Google's data collection.
Re: (Score:2)
I keep hearing slashdot talk about all the evil shit that Chrome does behind the scenes. So let's see some sniffer logs or some evidence of Chrome doing something it's not supposed to. I'll wait.
Re: (Score:1)
Nope, she's a rary!
Re: (Score:2)
More than that, Nvidia has a web driver available for OS X for Quadro-series cards, but it works just fine with GeForce series cards (from anyone, and not just Apple) as well. Download it and test - if it still happens, then Nvidia is just as full of shit as always.