
Latest OS X 10.10.2 beta kills Google-disclosed vulnerabilities dead

Google's Project Zero research program has disclosed and released proof-of-concept code for a series of 0day — previously unknown — vulnerabilities found in Apple's OS X operating system for the Mac. These exploits are all fixed in OS X Yosemite 10.10.2, now in beta. Here's a report on the vulnerabilities from Ars Technica:

In the past two days, Project Zero has disclosed OS X vulnerabilities here, here, and here. At first glance, none of them appear to be highly critical, since all three appear to require the attacker to already have some access to a targeted machine. What's more, the first vulnerability, the one involving the "networkd 'effective_audit_token' XPC," may already have been mitigated in OS X Yosemite, but if so the Google advisory doesn't make this explicit and Apple doesn't publicly discuss security matters with reporters.

These vulnerabilities were reported to Apple in October 2014 and made public as part of Google Project Zero's 90-day disclosure policy. (You can argue the merits of that policy in the comments below.)

None of these exploits can be used remotely, which means they'd need to be combined with remote exploits or with physical access to the hardware to be put to any practical use.

The report for the first vulnerability, issue 130, which could result in privilege escalation, contains the following comment:

See https://code.google.com/p/google-security-research/issues/detail?id=121 for a discussion of mitigations applied in Yosemite.

That issue, 121, includes the following:

Apple added some hardening to libxpc in Yosemite - xpc_data_get_bytes now has the following check: [list of checks]
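The advisory excerpt stops short of listing the checks themselves, so they stay elided here. Still, the general shape of that kind of hardening is simple: validate the caller-supplied offset and length against the real size of the data object before copying anything, rather than trusting the caller. Below is a minimal, purely illustrative sketch in C; the struct and function names are hypothetical and are not libxpc internals.

    /* Hypothetical sketch of bounds-check hardening; not Apple's actual code. */
    #include <stddef.h>
    #include <string.h>

    struct data_obj {
        const unsigned char *bytes; /* backing buffer */
        size_t size;                /* number of valid bytes */
    };

    /* Copies `length` bytes starting at `off` from `d` into `out`.
       Returns the number of bytes copied, or 0 if the range is invalid. */
    static size_t data_get_bytes_checked(const struct data_obj *d, void *out,
                                         size_t off, size_t length)
    {
        /* Check `off` first, then compare `length` against the remaining
           space, so the test cannot be defeated by overflow in `off + length`. */
        if (off > d->size || length > d->size - off)
            return 0;

        memcpy(out, d->bytes + off, length);
        return length;
    }

Checks like this matter in XPC services because messages cross privilege boundaries: a less-privileged client should not be able to make a privileged helper read or copy outside its own buffers.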

That vulnerability, 121, is marked as fixed and closed as of January 8.

Status: Fixed. Closed: Jan 8

This could indicate the 130 vulnerability is also no longer an issue for people running Yosemite.

What's more, based on the latest build of OS X 10.10.2, seeded yesterday to developers, Apple has already fixed all of the vulnerabilities listed above. That means the fixes will be available to everyone running Yosemite as soon as 10.10.2 goes into general availability.

Nick Arnott contributed to this article.

Updated with reference to vulnerability 121.

Rene Ritchie
Contributor

Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.

35 Comments
  • You are being misleading here, making it seem like Project Zero dropped a surprise bomb on Apple just the other day. Three sentences after the end of your quote, the article notes that Google privately reported the issues to Apple back in October, and that Project Zero notifies developers that the issue may be made public once 90 days have elapsed since notification. This is precisely what happened. Once a white hat discovers a 0-day exploit, that person knows that everybody's security depends only on nobody else having discovered the same hole. This is a foolish thing to trust, and it gets more foolish with each passing day. As a sometime sysadmin, I would much rather know what long-time holes are out there, so I can take my own precautions, than rely on that diminishing-over-time chance. Frankly, three months is far more time than any company that cares about security should allow *known 0-day exploits* to remain unpatched. Let's hope Apple and the OS X team remember there are actual humans on the other end of these attacks.
  • I wasn't implying anything about Google dropping bombs or not. Like I said, discovery is great. Disclosure is always going to be debated. Same thing happened with Microsoft last Patch Tuesday. What happens when 90 days legitimately, technically, isn't enough? It only has to go wrong once for it to be bad.
  • "What happens when 90 days legitimately, technically, isn't enough?" What happens when 1000 days legitimately, technically, isn't enough? Apple has bigger issues to worry about if it can't deploy a fix for a zero-day security exploit within 48 hours much less 90 days. Either way stakeholders in charge of administering these platforms NEED to know that their assets are vulnerable to attack so they can take measures to disable the service in question, patch it, or look for alternative solutions. Security by obscurity is no security, and hiding dirt under a rug doesn't mean the room is clean. Open and public disclosure of security vulnerability is the only responsible and ethical thing to do for both users and investors in the platform. It's also an incentive for these companies to take security very seriously.
  • I agree it's a strong incentive, but I disagree that open and public disclosure is responsible or ethical, especially when most users don't know what to do with that information, if they even receive it. Relatively few users read computing web sites. Private dialogue with the vendor is much more ethical and safe for the public at large. If it appears the vendor isn't taking it seriously, then it's time for public disclosure. I think we understand the oxymoron that is "security by obscurity" but it's a bit preposterous to assert that all security exploits can be fixed, tested, patched, and distributed within 48 hours.
  • > I disagree that open and public disclosure is responsible or ethical,
    Almost all security experts disagree with you. In fact, Google is not the one with the shortest disclosure time.
  • > but I disagree that open and public disclosure is responsible or ethical, especially
    > when most users don't know what to do with that information, if they even receive
    > it.
    I disagree. That's like saying doctors shouldn't tell patients they have an incurable disease because patients wouldn't know what to do with it. It's not for you to decide that. Users have a right to know that the platform they use is vulnerable to attack, and thus unsafe to use. How they attain that information and what they choose to do with it is irrelevant, and frankly, not our business. What's important, however, is that such information affords users the opportunity to take proactive action if they choose to (e.g. backing up important files, moving important files and information from the system to a safer location, consulting with a geekier friend or expert, moving to a safer platform, etc.). This is an opportunity that would have been denied them if the known vulnerability had remained undisclosed and was eventually exploited by malicious actors. That's why anything less than public disclosure is irresponsible and unethical.
  • No argument there re: ethics -- but private dialogue with the vendor happened before Halloween. 48 hours may be absurd -- but 3 months is not. At some point, the vendor has to be held responsible.
  • You are approaching it from the wrong angle. The relevant question is "Given that I (as the white hat) am not necessarily the smartest human alive, is 90 days long enough that there's a good probability black hats have also discovered this vulnerability?" If so, it is irresponsible *not* to disclose the issue, as you are willingly leaving people open to a security exploit they need to know about, and may be able to guard against with your information.
  • And yes, whether intentional or not, calling them "Google-disclosed vulnerabilities...marked as fixed and closed" while leaving out the crucial fact that Project Zero reported the issues privately to Apple 90 days ago for remediation does leave that implication. In fact, the timeline suggests we may have Project Zero's private notification to thank for these issues being closed at all, but the article scolds them instead of bringing up that point, or even reporting the events that might lead the reader to stumble across that conclusion him/herself.
  • Updated to remove "scolding" and add report dates.
  • Thanks. To be honest, if I was Project Zero, I would not have disclosed the issues publicly, either - assuming Apple told me that they were addressed in this soon-to-drop release. In that case, it would have been proper to wait until after the release. Of course, if Apple had *not* communicated back to me any release date, then I probably would have felt obligated to disclose after the 90 day grace period, to inform other users.
  • Project Zero has always seemed more of a way to harm competitors' reputations than to really help anyone. I was looking through the reports: not a single report on Android, Chrome, or any Google products and their vulnerabilities. ALL on Microsoft, Linux and Apple products, nothing on Google, and we KNOW they have quite a few. Convenient, wouldn't you say? A tech version of a propaganda machine in action. You can see a recent list here: https://code.google.com/p/google-security-research/issues/list?can=1&num...
  • Android and Chrome have their own lists. I'm not sure why you want Google to issue a 90-day notice to Google for bugs. That sounds like a stupid thing Apple or Sony would do.
  • No one discloses their own vulnerabilities: not Google, not Microsoft, and not Apple.
  • Yes they do. Google holds big events and pays those who find the exploits so they can patch them. One of the perks of an open-source OS. Sent from the iMore App
  • I think you're confusing rewarding vulnerabilities found with disclosing the vulnerabilities they found in their projects.
  • Propaganda? I don't think so. Considering Google uses OS X and Windows for their heavy lifting, it benefits them to have any vulnerabilities fixed. Posted via the iMore App for Android
  • I'm glad all of you keep track of this stuff. I just get tired - head over it. Regardless of the nature of the hole, I wish the fix didn't have to wait for a beta cycle to be completed. Posted via the iMore App for Android
  • I'm of two minds on that. I always want fast security updates, but I don't want them so fast they create different problems. Apple did their first patch push a few weeks ago. I'm in favor of that, but at the same time, it's a little nerve-wracking.
  • Agreed. I suppose there's no perfect scenario. Posted via the iMore App for Android
  • "Also, again, the first exploit, which could result in privilege escalation, was marked as fixed and closed by Project Zero on January 8." umm. I think you guys need to pay closer attention to what you are linking, as all issues that are linked are marked as open by Google..
  • 130 links back to 121 for mitigations, which is marked as fixed as of Jan 8.
    See https://code.google.com/p/google-security-research/issues/detail?id=121 for a discussion of mitigations applied in Yosemite. Status: Fixed Owner: cev...@google.com Closed: Jan 8
    Updated the post to clarify.
  • Whose workstation is in the photo?
  • CERT discloses vulnerabilities after 45 days, the IETF discloses vulnerabilities after 30 days. Nobody complained until Google started to disclose vulnerabilities with Project Zero.
  • I think it was because Google released so many in a short amount of time...only the past two weeks. It "seemed" like Google was picking on Microsoft by some.
  • Not to mention they have had dialog with companies they disclose these issues with, but choose to release them to the wide web at 90 days regardless of the effort that is happening within the company.
    They beat Microsoft by 3 days with a major Windows patch, they beat Apple by n weeks with minor patches... Knowing full well these things were patched and a release date was imminent. The logic does not add up.
  • What do you think CERT, Secunia and the others do? I can assure you that they won't stop disclosing a vulnerability because someone will patch it in n days
  • If you've read their disclosures, you can see that they work with the vendors of the software to prevent that type of thing from happening. In their book, disclosure of these flaws publicly before they're patched is a last case scenario for getting the vendor (or others related to the case) to respond.
  • I have read them, but can you point to where they say it is a last case scenario? http://www.cert.org/vulnerability-analysis/vul-disclosure.cfm
  • Can you not infer? While it may not spell out "last case scenario" for you, the emphasis from reading that page, as well as the IETF's, is that their goal is to work towards fixing problems, and publicly disclosing them is NOT a solution in cases which lack the black-and-white contrast so many people think they see or understand.
    They will not disclose a bug without first notifying the vendor, as PZ did with Windows (and OS X, here); they are willing to float deadlines beyond 45 days for appropriate situations; and they openly admit that not all issues can be solved in the time given and that action from those responsible is their primary concern in repairing these problems. Their goal is to fix the problems, not to shame others needlessly.
  • I don't know what you are reading. "Threats that require "hard" changes (changes to standards, changes to core operating system components) will cause us to extend our publication schedule." That says that they will not disclose the vulnerabilities if there are important reasons, and that not disclosing them is the last case. Really, what are you reading? "They will not disclose a bug without first notifying the vendor, as PZ did with Windows" -- what are you talking about?
  • You're really, really hung up on the "last case scenario" wording. OK, sorry, dude, I didn't mean ~literally~ the last thing they would do in response to a discovered issue, but I don't think arguing semantics makes Google look any better in comparison. By "last case scenario" I meant they are willing to look at situations outside of a black-or-white context and make adjustments to prevent the disclosure of sensitive information, or reschedule it to match a timeline that benefits the user. Had PZ done the same with Windows, they would have realized a patch was about to be released 3 days later, and withheld the disclosure until after that patch was made public instead of blindsiding Microsoft while riding a high horse.
  • You're assuming that CERT or IETF would have waited, but all that is written on their site contradicts you. Well, you believe Google is bad; believe what you want, it won't change the facts. Have a good day.
  • Apple had 90 days to fix these bugs. It probably took 90 minutes of work to analyze the problem, develop a fix, and test it. For 3 months to pass is inexcusable if you ask me. The only fault that I call out on Google is for them to call this "Project Zero" when these aren't zero-day exploits.
  • These security flaws are fascinating. It's kinda ironic though how there are more OSX problems listed than Windows. Posted via iMore App