Re: "Apple plans to scan US iPhones for child abuse imagery"

26
Android is open source, so it's dependent on the phone's manufacturer for updates, manufacturers such as Samsung, LG, ASUS, Xiaomi and Nokia. Google does have its own phone, the Pixel, on which it can run anything, and it's tied to the Google Play Store just as the iPhone is tied to Apple's App Store. Apple's iOS is not open source; Apple alone can control and make changes to its operating system. Apple uses its closed operating system and its claim of protecting user privacy as selling tools, and this case shatters that.

Apple, Google, Dropbox, Amazon... can make any statements they want about your personal data and its "security" on their servers. As I said before, I have no expectation of privacy if my data is sitting on a server that is not mine. They can say anything they want, but when law enforcement comes along with a warrant that has a non-disclosure order attached, as we learned a few months ago, any tech company has to turn the data over and is barred from telling the user. This case is different in that Apple is acting as an agent of the police, fishing in our data and turning it over to law enforcement.

I still wonder what the feds are holding over Apple that made Apple, the largest corporation in the US, agree to this process. When 5G gets to the level of current LTE, I'll go back to Android; apparently the FBI uses Samsung, not Apple, smartphones.
"Everyone is entitled to their own opinion, but not their own facts." - Daniel Patrick Moynihan

Re: "Apple plans to scan US iPhones for child abuse imagery"

27
You never hear the LEOs or the feds complaining about not being able to crack Android phones; it is always the Apple products they whine about. We have heard rumors the Israelis have cracked the iPhone, but no hard, confirmed proof. Android being open source leaves it open to being cracked. It's like this: if Apple is your house, it is securely locked and very hard to break into, and even when you do get in, everything is hidden behind a wall whose only way through is a very long encryption key that only the user knows. Android being open source leaves the doors locked with the key under the front door mat, because anybody can read the source code for the encryption and figure out its weaknesses, even with a user-defined encryption key. BTW, I have played and worked with computers since 1981 and have been Microsoft Certified and Cisco Network Certified. FYI, iOS and macOS are modified Unix systems; Android is based on the open-source Linux kernel and other open-source software. Not knocking Linux, as I have run it in the past.
Facts do not cease to exist because they are ignored.-Huxley
"We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can't have both." ~ Louis Brandeis,

Re: "Apple plans to scan US iPhones for child abuse imagery"

28
There are Androids that are more secure than others; some are very secure. iOS 15.0 will be a major update containing more than the "NeuralMatch" scanning software, and I don't want to skip updates. If the story hadn't been leaked by the Financial Times, would Apple have revealed it?

The FBI got into the San Bernardino terrorist's iPhone after Apple declined to assist. Reports at the time were that it was done by an Israeli company; it now appears that was NSO's Pegasus program. If the feds have a license for the software, which is almost certain, are they always getting warrants before they use it?
Apple's Tim Cook has consistently beaten the drum of "privacy first".
https://www.bbc.com/news/technology-58124495

Reminds me of post-9/11, when some Republicans were saying that to ensure our safety against terrorism we'd have to give up some rights, and we got the Patriot Act. Governments and corporations will always try to encroach on our rights, and they will use a lot of "justifications". Being socially concerned about an issue doesn't mean the solution is compromising privacy rights. Apple is free to run whatever scans it wants on iCloud, which it owns; just don't put that on my iPhone. It's the principle that I'm concerned with, and it's more important to some people than others.
"Everyone is entitled to their own opinion, but not their own facts." - Daniel Patrick Moynihan

Re: "Apple plans to scan US iPhones for child abuse imagery"

29
To clarify:
PSA: Apple can't run CSAM checks on devices with iCloud Photos turned off

What you need to know

Apple's new on-device CSAM checks are only run on photos that are to be uploaded to iCloud Photos.
Photos aren't checked on devices with iCloud Photos disabled.
Apple also confirmed that it cannot check photos that are inside iCloud backups.

Apple announced new on-device CSAM detection techniques yesterday as part of its new Child Safety measures, and there has been a lot of confusion over what the feature can and cannot do. Contrary to what some people believe, Apple cannot check images when users have iCloud Photos disabled.

Apple's confirmation of the new CSAM change did attempt to make this clear, but perhaps didn't do as good a job of it as it could. With millions upon millions of iPhone users around the world, it's to be expected that some could be confused.

Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.

The key part there is the iCloud Photos bit because CSAM checks will only be carried out on devices that have that feature enabled. Any device with it disabled will not have its images checked. That's also a fact that MacRumors had confirmed, too.

Something else that's been confirmed — Apple can't delve into iCloud backups and check the images that are stored there, either. That means the only time Apple will run CSAM checks on photos is when it's getting ready to upload them to iCloud Photos.

So yeah, there you go.
https://www.imore.com/psa-apple-cant-ru ... tos-turned

Just don't upload photos to iCloud and you're safe.
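Since the "threshold secret sharing" paragraph quoted above is the part most people stumble over, here is a toy sketch of the general idea in Python. To be clear about assumptions: this is plain textbook Shamir secret sharing, not Apple's actual protocol, and every name in it is invented for illustration. The idea is that each matched photo contributes one share of a per-account key, and the server can only rebuild that key, and so read the safety vouchers, once it holds at least the threshold number of shares.

import random

# Toy parameters: any 30 shares let the server rebuild the key.
PRIME = 2**127 - 1   # field size for the demo; real systems differ
THRESHOLD = 30

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    # Degree threshold-1 polynomial with the secret as its constant term.
    # random.randrange is fine for a demo; real crypto would use `secrets`.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    total = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total

account_key = 123456789  # stands in for the key sealing the vouchers
shares = make_shares(account_key, THRESHOLD, count=100)
print(recover(shares[:30]) == account_key)  # True: at the threshold
print(recover(shares[:29]) == account_key)  # almost surely False: below it

Below the threshold, the collected shares reveal essentially nothing about the key, which is what lets Apple claim it cannot interpret vouchers for accounts under the limit.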
Facts do not cease to exist because they are ignored.-Huxley
"We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can't have both." ~ Louis Brandeis,

Re: "Apple plans to scan US iPhones for child abuse imagery"

30
TrueTexan wrote: Sat Aug 07, 2021 11:28 am To clarify:
PSA: Apple can't run CSAM checks on devices with iCloud Photos turned off

What you need to know

Apple's new on-device CSAM checks are only run on photos that are to be uploaded to iCloud Photos.
Photos aren't checked on devices with iCloud Photos disabled.
Apple also confirmed that it cannot check photos that are inside iCloud backups.

Apple announced new on-device CSAM detection techniques yesterday as part of its new Child Safety measures, and there has been a lot of confusion over what the feature can and cannot do. Contrary to what some people believe, Apple cannot check images when users have iCloud Photos disabled.

Apple's confirmation of the new CSAM change did attempt to make this clear, but perhaps didn't do as good a job of it as it could. With millions upon millions of iPhone users around the world, it's to be expected that some could be confused.

Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.

The key part there is the iCloud Photos bit because CSAM checks will only be carried out on devices that have that feature enabled. Any device with it disabled will not have its images checked. That's also a fact that MacRumors had confirmed, too.

Something else that's been confirmed — Apple can't delve into iCloud backups and check the images that are stored there, either. That means the only time Apple will run CSAM checks on photos is when it's getting ready to upload them to iCloud Photos.

So yeah, there you go.
https://www.imore.com/psa-apple-cant-ru ... tos-turned

Just don't upload photos to iCloud and you're safe.

That's what I stated above: "NeuralMatch"/CSAM scanning only starts working on our iPhones (post iOS 15.0) when we start to upload to iCloud. That is according to Apple, who didn't release that info before London's Financial Times leaked it. There are plenty of tech-savvy companies out there that will test this scanning system and let us know.

Apple is the largest corporation in the world, and I question their motives in this case. Apple could sweep their iCloud for child abuse porn without ever downloading any software to our iPhones, so why aren't they doing that? If they're concerned about breaching users' trust, they already have. They've become an agent of the state in the US, as they and Google have in China.

Apple is like all the other corporations, out to protect their own interests, not ours.
Last edited by highdesert on Sat Aug 07, 2021 6:02 pm, edited 1 time in total.
"Everyone is entitled to their own opinion, but not their own facts." - Daniel Patrick Moynihan

Re: "Apple plans to scan US iPhones for child abuse imagery"

33
How does this software program tell the difference between children and adults? Will humans be scrutinizing photos?

Bet they masturbate to people's (grownups') nude photos/videos of themselves having sex. How creepy.

Will they arrest parents for taking photos of their children in a bathtub full of bubble bath? There's just so much wrong with this. Glad I have an Android.
“The only thing necessary for the triumph of evil is for good men to do nothing,”

Re: "Apple plans to scan US iPhones for child abuse imagery"

34
Here is an article that answers a lot of questions about what Apple is doing and how it is supposed to be secure for most of us who don't play in the cesspool of child porn. As they say in the article, the other major clouds are already doing scans. Worth a read to see what is done and what to expect in the future. Just remember the old saying: if you don't want your wife, mother or grandmother to see it, don't put it on the internet. I remember the days of the Usenet newsgroups; they had stuff that would make the people on the dark web blush.

https://tidbits.com/2021/08/07/faq-abo ... -children/
Facts do not cease to exist because they are ignored.-Huxley
"We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can't have both." ~ Louis Brandeis,

Re: "Apple plans to scan US iPhones for child abuse imagery"

35
I don't work for Apple, but supporting Apple tech pays my bills. This change makes me sad. W/ device-side scanning (spying), Apple can no longer say "what happens on your iPhone stays on your iPhone" ... they can no longer ring that privacy bell. Yes, the Apple apologists will say if you aren't doing anything wrong, you have nothing to worry about ... they will start with an easy win in the name of child protection. Given the current climate and the competition toward claiming the woke crown, I fear politics and popularity will play an increasing role. Hey Siri ... scan this individual's images and texts for dangerous firearms and high-capacity magazines. I can't get away from this crap and off to a goat farm in the hills fast enough.

https://www.eff.org/deeplinks/2021/08/a ... ivate-life

Re: "Apple plans to scan US iPhones for child abuse imagery"

36
As long as it stays on your phone, there is no problem. But the second you send it to iCloud, it gets scanned. Remember, it isn't just Apple doing this; Google, Dropbox, Amazon and other cloud providers are also doing the scanning, and not just of iPhones but of Android phones too.
Facts do not cease to exist because they are ignored.-Huxley
"We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can't have both." ~ Louis Brandeis,

Re: "Apple plans to scan US iPhones for child abuse imagery"

37
Just did a little search and found out Verizon has been doing the same thing since 2013 using the same scanning technique.
When Congress passed the PROTECT Our Children Act of 2008 mandating that service providers report suspected child pornography in the content that their customers surf and store
https://arstechnica.com/information-te ... its-cloud/

https://www.nbcnews.com/technolog/your ... -1c8881731
Facts do not cease to exist because they are ignored.-Huxley
"We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can't have both." ~ Louis Brandeis,

Re: "Apple plans to scan US iPhones for child abuse imagery"

38
As I said, I'm a realist: I have no expectation of privacy for any personal files I might upload to an internet cloud, no matter what the provider says. Those are the providers' servers, and they have liability for what could be found on them, so they can scan the hell out of anything on their clouds; I have no issue with it. My problem is with Apple putting that scanning software on my iPhone. They slickly sell "privacy" as a huge advantage over other mobile phones, but for me the trust factor is dead. There are plenty of smart techies out there, and if something similar were on other mobile phones, they'd let us know.

They put out a statement, but it's their usual slick sales work.
https://www.apple.com/child-safety/
"Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations," the company writes. "Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices."
However, security researchers, while supportive of the efforts, are concerned that Apple is enabling governments worldwide to effectively have access to user data, which could go beyond what Apple is currently planning, as is the case with all backdoors. While the system is purported to detect child sex abuse, it could be adapted to scan for other text and imagery without user knowledge.

"It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops." - Ross Anderson of the UoC.

Apple - which likes to make everyone believe it's at the forefront of user privacy - creating this backdoor for the US government would also push governments to make their own demands from other tech companies. While this is being done in the US right now, it is opening the way for other governments to rightfully make similar and more targeted demands from the tech companies.

Security researchers around the globe have been writing about why this is effectively the end of privacy at Apple since every Apple user is now a criminal unless proven otherwise.
"You can wrap that surveillance in any number of layers of cryptography to try and make it palatable, the end result is the same," Sarah Jamie Lewis, executive director at Open Privacy, wrote.

"Everyone on that platform is treated as a potential criminal, subject to continual algorithmic surveillance without warrant or cause."

The Electronic Frontier Foundation has released a full-page argument calling this move a "backdoor to your private life."

"Child exploitation is a serious problem, and Apple isn't the first tech company to bend its privacy-protective stance in an attempt to combat it," the digital rights firm wrote, adding that backdoor is always a backdoor regardless of how well-designed it may be.

"But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor." - EFF
The new features are also concerning even without the government meddling and could prove life-threatening for queer kids. Kendra Albert of Harvard's Cyberlaw Clinic tweeted that these algorithms are going to overflag LGBTQ+ content, including transition photos. "Good luck texting your friends a picture of you if you have 'female presenting nipples,'" Albert tweeted.
Green also reminded everyone that while the idea of Apple being a privacy-forward company has brought them a lot of good press and consumer trust, it's the same company that dropped plans to encrypt iCloud backups because of the FBI.

Apple has shared full details of these new changes in this document. While Apple may be well-intentioned, the iPhone maker is not only breaking promises of security and privacy but is also leaving users to rely on their governments not misusing this access to their personal data - something that doesn't have a good track record.

As EFF says, what Apple is doing isn't just a slippery slope, it's "a fully built system just waiting for external pressure to make the slightest change."
https://wccftech.com/apple-backdoor-to-iphones/
"Everyone is entitled to their own opinion, but not their own facts." - Daniel Patrick Moynihan

Re: "Apple plans to scan US iPhones for child abuse imagery"

39
ZenArcade wrote: Sun Aug 08, 2021 12:58 pm I don't work for Apple, but supporting Apple tech pays my bills. This change makes me sad. W/ device-side scanning (spying), Apple can no longer say "what happens on your iPhone stays on your iPhone" ... they can no longer ring that privacy bell. Yes, the Apple apologists will say if you aren't doing anything wrong, you have nothing to worry about ... they will start with an easy win in the name of child protection. Given the current climate and the competition toward claiming the woke crown, I fear politics and popularity will play an increasing role. Hey Siri ... scan this individual's images and texts for dangerous firearms and high-capacity magazines. I can't get away from this crap and off to a goat farm in the hills fast enough.

https://www.eff.org/deeplinks/2021/08/a ... ivate-life

That's exactly the line that Dick Cheney and the Republicans used for the Patriot Act and their National Security Letters, which didn't require a warrant from a court. They were telling everyone that if they didn't do anything wrong, they didn't have anything to worry about.

So we pay hundreds of dollars for mobile phones and the manufacturer and our service carrier use them to spy on us as agents of the government?
"Everyone is entitled to their own opinion, but not their own facts." - Daniel Patrick Moynihan

Re: "Apple plans to scan US iPhones for child abuse imagery"

40
Apple can no longer say "what happens on your iPhone stays on your iPhone".
Apple can still say that, because as long as you don't send it to Apple's iCloud, it stays on your phone and isn't scanned. But the second it goes to iCloud, it is scanned. You can also send it to another provider's cloud and not be scanned by Apple, but the other providers will scan it for you.

When the DOJ and law enforcement whined and demanded Apple give them a backdoor to access the iPhone, Apple said not just no, but hell no. Did you hear that from any of the other phone manufacturers? No. Why? Because they already had the backdoors.

Apple is the last major cloud provider to scan for kiddie porn on the cloud, and they are very selective about what they scan on their cloud servers. They don't go looking at any of your pictures, just at a data hash that matches one of the hashes they were given for known kiddie-porn pictures, and that is very restricted data. Your bubble-bath pictures of your young child are safe to send, because the odds they would match some kiddie porn are less than one in a billion.
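To make the hash-matching point concrete, here's a minimal sketch. A couple of loud caveats: Apple's real system uses a perceptual "NeuralHash" that matches visual features, and the database is stored on the device in blinded, unreadable form; the ordinary cryptographic hash and all the names below are stand-ins invented for illustration.

import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; the real system hashes visual
    # features so re-encoded copies of the same picture still match.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical entry; in reality NCMEC supplies the hash list.
KNOWN_HASHES = {image_hash(b"example-known-image")}

def check_upload(image_bytes: bytes) -> bool:
    """True only if the photo's hash matches a known entry."""
    return image_hash(image_bytes) in KNOWN_HASHES

print(check_upload(b"example-known-image"))  # True: it is on the list
print(check_upload(b"bubble-bath-photo"))    # False: never flagged

The comparison is hash against hash; nobody looks at your pictures unless a hash matches the supplied list.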

The other thing you can do is not upgrade to iOS 15 and stay with 14.7.

Apple does more to protect your security and privacy online than any of the other cloud or phone providers.
Facts do not cease to exist because they are ignored.-Huxley
"We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can't have both." ~ Louis Brandeis,

Re: "Apple plans to scan US iPhones for child abuse imagery"

41
If Apple can scan photos, what else will they scan? On a child's iPhone they already scan iMessages. They could potentially scan our e-mails, texts... Sorry, I don't put blind faith in corporations, governments or religion; when they say "trust me" I become extremely skeptical. Apple might support a lot of liberal causes, but they do it because it appeals to those who buy their products. Whoever advised Tim Cook on this one gave him bad advice.

There is an open letter to Apple circulating on the internet.
https://appleprivacyletter.com
"Everyone is entitled to their own opinion, but not their own facts." - Daniel Patrick Moynihan

Re: "Apple plans to scan US iPhones for child abuse imagery"

42
The iMessage scanning is an opt-in on the family iCloud plan. You identify who is under age 13, and then their messages and photos are scanned as they are sent or received on their iPhone. If the hash says a photo is sexually suggestive, a warning is sent to that phone, and if the recipient still chooses to receive the message or photo, a message about the transaction is sent to the parent. That is not being done yet except on iOS 15 beta phones for beta users like me. Apple still says they can't read any iMessage, as they are encrypted end to end; what their computers read with AI is a hash code.
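For the shape of that opt-in flow, here is a rough sketch under the assumption that the description above is accurate; the age cutoff, the warning and the parent notification come from Apple's published description, while the function and field names are invented for the example.

from dataclasses import dataclass

@dataclass
class ChildAccount:
    age: int                    # set when the family plan identifies a child
    parent_notifications: bool  # the opt-in described above

def handle_photo(account: ChildAccount, flagged_explicit: bool,
                 child_views_anyway: bool) -> list:
    """Return the actions the device would take, per the described flow."""
    actions = []
    if not flagged_explicit:
        return actions          # ordinary photos pass through untouched
    actions.append("blur the photo and warn the child")
    if child_views_anyway and account.age < 13 and account.parent_notifications:
        actions.append("notify the parent account")
    return actions

print(handle_photo(ChildAccount(age=10, parent_notifications=True),
                   flagged_explicit=True, child_views_anyway=True))
# ['blur the photo and warn the child', 'notify the parent account']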
Facts do not cease to exist because they are ignored.-Huxley
"We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can't have both." ~ Louis Brandeis,

Re: "Apple plans to scan US iPhones for child abuse imagery"

43
TrueTexan wrote: Sun Aug 08, 2021 3:51 pm
Apple can no longer say "what happens on your iPhone stays on your iPhone".
Apple can still say that, because as long as you don't send it to Apple's iCloud, it stays on your phone and isn't scanned. But the second it goes to iCloud, it is scanned. You can also send it to another provider's cloud and not be scanned by Apple, but the other providers will scan it for you.

When the DOJ and law enforcement whined and demanded Apple give them a backdoor to access the iPhone, Apple said not just no, but hell no. Did you hear that from any of the other phone manufacturers? No. Why? Because they already had the backdoors.

Apple is the last major cloud provider to scan for kiddie porn on the cloud, and they are very selective about what they scan on their cloud servers. They don't go looking at any of your pictures, just at a data hash that matches one of the hashes they were given for known kiddie-porn pictures, and that is very restricted data. Your bubble-bath pictures of your young child are safe to send, because the odds they would match some kiddie porn are less than one in a billion.

The other thing you can do is not upgrade to iOS 15 and stay with 14.7.

Apple does more to protect your security and privacy online than any of the other cloud or phone providers.
I think you are far too trusting of Apple, Inc. During my management and support of their products over the last two-plus decades, I've witnessed and felt the impact of numerous "mistakes" they've made ... just in the last couple of weeks I've had hundreds of devices end up in a broken state due to an error they made on the backend. For iOS/iPadOS 15 they indicate this will only affect iCloud-enabled devices ... my history w/ them makes me skeptical. They've also stated this tech will make it into macOS 12 Monterey.

They are a marketing company first and foremost, w/ the reality of how their products actually work often differing from said marketing spiel. As mentioned, I expect and have no issue w/ cloud-based content scanning (their cloud, their rules), but on-device AI and scanning is an entirely different matter. Not updating to the latest is not really an option ... they've indicated iOS 14 will be supported w/ security updates for a bit, without defining how long down-level support will last (n-2?). This new tack assumes guilt until proven otherwise. Apple is now an active partner in policing, and their privacy posture is laughable.

If it hasn't already been posted, a good discussion making the rounds on YouTube:

Re: "Apple plans to scan US iPhones for child abuse imagery"

44
The picture is hashed on the way out to the cloud and is compared at the cloud. Not on the phone.

I have used Apple products since 3 BM, that's three years before the Mac. I have also used Microsoft products going back to Windows 3.2, including Windows NT in an ISP environment where we rebooted our servers every Friday and just hoped and prayed they would stay up and running till Monday morning. I have been Microsoft certified and Cisco certified.

I will take Apple products and their OS for personal computer use any day over Microsoft. Yes, Apple does screw up every now and then. But in this case, look at it from their point of view: if they did nothing and kiddie porn were found on their servers, they could be held liable, and lawsuits or felony charges could be brought against them. They could do like the other cloud providers and let the pictures be scanned just in the cloud, where they could be seen by other people. The way Apple is doing this, the pictures are hashed as they are uploaded and then compared, keeping the end-to-end encryption intact, unlike some other providers. Apple is one of the last to do the scans because they just didn't want a person having to review multiple pictures that have nothing to do with the subject. They wanted to provide as much security to their users as possible.
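A bare-bones sketch of that "hashed on the way out, compared at the other end" flow, building on the matching example earlier in the thread. The toy XOR cipher and every name here are invented; the only point is that the comparing side can check the hash without ever decrypting the photo.

import hashlib
import secrets

BLOCKLIST = {hashlib.sha256(b"known-bad-image").hexdigest()}  # hypothetical

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for real encryption, just to show the photo itself
    # never has to be opened for the comparison to happen.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def client_upload(photo: bytes, key: bytes) -> dict:
    return {
        "ciphertext": xor_encrypt(photo, key),      # photo stays sealed
        "hash": hashlib.sha256(photo).hexdigest(),  # computed client-side
    }

def server_check(package: dict) -> bool:
    """Compare the attached hash; never decrypt the photo."""
    return package["hash"] in BLOCKLIST

key = secrets.token_bytes(32)
print(server_check(client_upload(b"family-vacation.jpg", key)))  # False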
Facts do not cease to exist because they are ignored.-Huxley
"We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can't have both." ~ Louis Brandeis,

Re: "Apple plans to scan US iPhones for child abuse imagery"

47
Good video, Zen; it confirms exactly what we've been talking about, and it comes from an expert, a PhD candidate at Carnegie Mellon University, a top engineering and computer science institution.

Apple plays up being a "progressive company" that is "socially responsible", but it's just slick marketing. Steve Jobs was primarily a salesman, and a great one, and Tim Cook has kept it going. In this case they're using an emotionally sensitive subject to try to rally everyone behind them while compromising our privacy, just like W, Cheney and the Republicans did with the emotional issue of terrorism. It's a subtle threat: if you're not supporting our program to fight child porn, then you're supporting child pornographers, just as with the Patriot Act, where if you didn't support the bill to fight terrorism, you supported terrorists.

Remember Republicans saying that to ensure our security we all had to give up some rights? That was total bullshit, and so is this from Apple.

Apple's plans drew criticism over the weekend, with the Electronic Frontier Foundation labelling the features a backdoor.

"If you've spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system," the EFF wrote.

"Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor."

EFF warned that once the CSAM system was in place, changing the system to search for other sorts of content would be the next step.

"That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change," it said.

"The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers."

The EFF added that with iMessage to begin scanning images sent and received, the communications platform was no longer end-to-end encrypted.

"Apple and its proponents may argue that scanning before or after a message is encrypted or decrypted keeps the 'end-to-end' promise intact, but that would be semantic maneuvering to cover up a tectonic shift in the company's stance toward strong encryption," the foundation said.
https://www.zdnet.com/article/apple-chi ... raws-fire/
Last edited by highdesert on Mon Aug 09, 2021 12:49 pm, edited 1 time in total.
"Everyone is entitled to their own opinion, but not their own facts." - Daniel Patrick Moynihan

Re: "Apple plans to scan US iPhones for child abuse imagery"

49
Not updating to 15.0 is not an option; there are other things included in the update besides their "spyware", and so far they aren't making that update optional. I don't buy the BS that Apple, Google and the others put out about securing cloud data. If you put stuff on their clouds, you take a risk; I don't have anything on their clouds.


What is happening now with Apple in China is a preview of the future. Sorry, it's long.
GUIYANG, China — On the outskirts of this city in a poor, mountainous province in southwestern China, men in hard hats recently put the finishing touches on a white building a quarter-mile long with few windows and a tall surrounding wall. There was little sign of its purpose, apart from the flags of Apple and China flying out front, side by side.

Inside, Apple was preparing to store the personal data of its Chinese customers on computer servers run by a state-owned Chinese firm.

Tim Cook, Apple’s chief executive, has said the data is safe. But at the data center in Guiyang, which Apple hoped would be completed by next month, and another in the Inner Mongolia region, Apple has largely ceded control to the Chinese government.

Chinese state employees physically manage the computers. Apple abandoned the encryption technology it used elsewhere after China would not allow it. And the digital keys that unlock information on those computers are stored in the data centers they’re meant to secure.

Internal Apple documents reviewed by The New York Times, interviews with 17 current and former Apple employees and four security experts, and new filings made in a court case in the United States last week provide rare insight into the compromises Mr. Cook has made to do business in China. They offer an extensive inside look — many aspects of which have never been reported before — at how Apple has given in to escalating demands from the Chinese authorities.

Two decades ago, as Apple’s operations chief, Mr. Cook spearheaded the company’s entrance into China, a move that helped make Apple the most valuable company in the world and made him the heir apparent to Steve Jobs. Apple now assembles nearly all of its products and earns a fifth of its revenue in the China region. But just as Mr. Cook figured out how to make China work for Apple, China is making Apple work for the Chinese government.

Mr. Cook often talks about Apple’s commitment to civil liberties and privacy. But to stay on the right side of Chinese regulators, his company has put the data of its Chinese customers at risk and has aided government censorship in the Chinese version of its App Store. After Chinese employees complained, it even dropped the “Designed by Apple in California” slogan from the backs of iPhones.

China’s leader, Xi Jinping, is increasing his demands on Western companies, and Mr. Cook has resisted those demands on a number of occasions. But he ultimately approved the plans to store customer data on Chinese servers and to aggressively censor apps, according to interviews with current and former Apple employees.

“Apple has become a cog in the censorship machine that presents a government-controlled version of the internet,” said Nicholas Bequelin, Asia director for Amnesty International, the human rights group. “If you look at the behavior of the Chinese government, you don’t see any resistance from Apple — no history of standing up for the principles that Apple claims to be so attached to.”

While both the Trump and Biden administrations have taken a tougher line toward China, Apple’s courtship of the Chinese government shows a disconnect between politicians in Washington and America’s wealthiest company.

Mr. Cook has been on a charm offensive in China, making frequent, statesmanlike visits and meeting with top leaders. On one trip in 2019, he toured the Forbidden City, met with a start-up and posted about the trip on the Chinese social platform Weibo.

Behind the scenes, Apple has constructed a bureaucracy that has become a powerful tool in China’s vast censorship operation. It proactively censors its Chinese App Store, relying on software and employees to flag and block apps that Apple managers worry could run afoul of Chinese officials, according to interviews and court documents.

A Times analysis found that tens of thousands of apps have disappeared from Apple’s Chinese App Store over the past several years, more than previously known, including foreign news outlets, gay dating services and encrypted messaging apps. It also blocked tools for organizing pro-democracy protests and skirting internet restrictions, as well as apps about the Dalai Lama.
In 2014, Apple hired Doug Guthrie, the departing dean of the George Washington University business school, to help the company navigate China, a country he had spent decades studying.

One of his first research projects was Apple’s Chinese supply chain, which involved millions of workers, thousands of plants and hundreds of suppliers. The Chinese government made that operation possible by spending billions of dollars to pave roads, recruit workers, and construct factories, power plants and employee housing.

Mr. Guthrie concluded that no other country could offer the scale, skills, infrastructure and government assistance that Apple required. Chinese workers assemble nearly every iPhone, iPad and Mac. Apple brings in $55 billion a year from the region, far more than any other American company makes in China.

“This business model only really fits and works in China,” Mr. Guthrie said in an interview. “But then you’re married to China.”

The Chinese government was starting to pass laws that gave the country greater leverage over Apple, and Mr. Guthrie said he believed Mr. Xi would soon start seeking concessions. Apple, he realized, had no Plan B.

“For Chinese authorities, this is no longer about, ‘How much money are you pouring into China?’ This is about, ‘What are you giving back?’” Mr. Guthrie said.

Mr. Guthrie delivered his warning to Mr. Cook’s top deputies, including Phil Schiller, a longtime marketing chief; Eddy Cue, head of internet software and services; Lisa Jackson, the company’s government affairs chief; and Jeff Williams, its operations chief, who is widely viewed as Mr. Cook’s right-hand man.

As Mr. Guthrie was delivering his warnings, Apple set about keeping the Chinese government happy. Part of that effort was new research and development centers in China. But those R&D centers complicated Apple’s image as a California company. At a summit for its new Chinese engineers and designers, Apple showed a video that ended with a phrase that Apple had been inscribing on the backs of iPhones for years: “Designed by Apple in California.”

The Chinese employees were angered, according to Mr. Guthrie and another person in the room. If the products were designed in California, they shouted, then what were they doing in China?

“The statement was deeply offensive to them,” said Mr. Guthrie, who left Apple in 2019 to return to his home in Michigan. “They were just furious.”

The next iPhone didn’t include the phrase.
In November 2016, China approved a law requiring that all “personal information and important data” that is collected in China be kept in China.

It was bad news for Apple, which had staked its reputation on keeping customers’ data safe. While Apple regularly responded to court orders for access to customer data, Mr. Cook had rebuffed the F.B.I. after it demanded Apple’s help breaking into an iPhone belonging to a terrorist involved in the killing of 14 people in San Bernardino, Calif. Now the Chinese government had an even broader request.

Other companies faced a similar dilemma in China, but Apple was uniquely exposed because of its high profile and acute dependence on the country.

Apple’s iCloud service allows customers to store some of their most sensitive data — things like personal contacts, photos and emails — in the company’s data centers. The service can back up everything stored on an iPhone or Mac computer, and can reveal the current location of a user’s Apple devices. Most of that data for Chinese customers was stored on servers outside China.

Apple’s China team warned Mr. Cook that China could shut down iCloud in the country if it did not comply with the new cybersecurity law. So Mr. Cook agreed to move the personal data of his Chinese customers to the servers of a Chinese state-owned company. That led to a project known inside Apple as “Golden Gate.”

Apple encrypts customers’ private data in its iCloud service. But for most of that information, Apple also has the digital keys to unlock that encryption.

The location of the keys to the data of Chinese customers was a sticking point in talks between Apple and Chinese officials, two people close to the deliberations said. Apple wanted to keep them in the United States; the Chinese officials wanted them in China.

The cybersecurity law went into effect in June 2017. In an initial agreement between Apple and Chinese officials, the location of the keys was left intentionally vague, one person said.

But eight months later, the encryption keys were headed to China. That surprised at least two Apple executives who worked on the initial negotiations and who said the move could jeopardize customers’ data. It is unclear what led to the change.

Documents reviewed by The Times do not show that the Chinese government has gained access to the data. They only indicate that Apple has made compromises that make it easier for the government to do so.
With the keys in China, the government has two avenues to the data, security experts said: demand it — or take it without asking.

The Chinese government regularly demands data from Chinese companies, often for law-enforcement investigations. Chinese law requires the companies to comply.

U.S. law has long prohibited American companies from turning over data to Chinese law enforcement. But Apple and the Chinese government have made an unusual arrangement to get around American laws.

In China, Apple has ceded legal ownership of its customers’ data to Guizhou-Cloud Big Data, or GCBD, a company owned by the government of Guizhou Province, whose capital is Guiyang. Apple recently required its Chinese customers to accept new iCloud terms and conditions that list GCBD as the service provider and Apple as “an additional party.” Apple told customers the change was to “improve iCloud services in China mainland and comply with Chinese regulations.”

The terms and conditions included a new provision that does not appear in other countries: “Apple and GCBD will have access to all data that you store on this service” and can share that data “between each other under applicable law.”

Under the new setup, Chinese authorities ask GCBD — not Apple — for Apple customers’ data, Apple said. Apple believes that gives it a legal shield from American law, according to a person who helped create the arrangement. GCBD declined to answer questions about its Apple partnership.

In the three years before China’s cybersecurity law went into effect, Apple never provided the contents of a user’s iCloud account to the Chinese authorities and challenged 42 Chinese government requests for such data, according to statistics released by the company. Apple said it challenged those requests because they were illegal under U.S. law.

In the three years after the law kicked in, Apple said it provided the contents of an undisclosed number of iCloud accounts to the government in nine cases and challenged just three government requests.
Apple still appears to provide far more data to U.S. law enforcement. Over that same period, from 2013 through June 2020, Apple said it turned over the contents of iCloud accounts to U.S. authorities in 10,781 separate cases.

Chinese officials say their cybersecurity law is intended to protect Chinese residents’ data from foreign governments. People close to Apple suggested that the Chinese authorities often don’t need Apple’s data, and thus demand it less often, because they already surveil their citizens in myriad other ways.

But the iCloud data in China is vulnerable to the Chinese government because Apple made a series of compromises to meet the authorities’ demands, according to dozens of pages of internal Apple documents on the planned design and security of the Chinese iCloud system, which were reviewed for The Times by an Apple engineer and four independent security researchers.

The documents show that GCBD employees would have physical control over the servers, while Apple employees would largely monitor the operation from outside the country. The security experts said that arrangement alone represented a threat that no engineer could solve.

“Chinese intelligence has physical control over your hardware — that’s basically a threat level you can’t let it get to,” said Matthew D. Green, a cryptography professor at Johns Hopkins University.

Apple said it designed the iCloud security “in such a way that only Apple has control of the encryption keys.”

The documents also show that Apple is using different encryption technology in China than elsewhere in the world, contradicting what Mr. Cook suggested in a 2018 interview.

The digital keys that can decrypt iCloud data are usually stored on specialized devices, called hardware security modules, that are made by Thales, a French technology company. But China would not approve the use of the Thales devices, according to two employees. So Apple created new devices to store the keys in China.

The documents, from early 2020, indicated that Apple had planned to base the new devices on an older version of iOS, the software underpinning iPhones, which is among the most targeted systems by hackers. Apple also planned to use low-cost hardware originally designed for the Apple TV. That alarmed the security researchers.

But Apple said that the documents included outdated information and that its Chinese data centers “feature our very latest and most sophisticated protections,” which would eventually be used in other countries.

The Chinese government must approve any encryption technology that Apple uses in China, according to two current Apple employees.

“The Chinese are serial iPhone breakers,” said Ross J. Anderson, a University of Cambridge cybersecurity researcher who reviewed the documents. “I’m convinced that they will have the ability to break into the servers.”

Apple has tried to isolate the Chinese servers from the rest of its iCloud network, according to the documents. The Chinese network would be “established, managed, and monitored separately from all other networks, with no means of traversing to other networks out of country.” Two Apple engineers said the measure was to prevent security breaches in China from spreading to the rest of Apple’s data centers.

Apple said that it sequestered the Chinese data centers because they are, in effect, owned by the Chinese government, and Apple keeps all third parties disconnected from its internal network.

In Cupertino, Calif., Apple engineers have been racing to finish designs for the new Chinese iCloud. In a presentation to some engineers last year, according to slides viewed by The Times, managers made clear that the stakes were high.

“There will be immense pressure to get it done. We agreed to this timeline three years ago,” said one slide. “Important people put their reputations on the line. iCloud needs influential friends in China.”

The documents showed that Apple’s deadline to start storing data in the new Chinese data centers was June 2021.
Six months later, Mr. Guo submitted his app again, with changes to elude Apple’s software. Trieu Pham, an app reviewer in Cupertino, was assigned the app. He didn’t find anything that violated Apple’s rules. On Aug. 2, he approved it.

Three weeks later, Trystan Kosmynka, Apple’s app review chief, sent an email to several managers at 2:32 a.m. The subject line was “Hot: Guo.” The Chinese government had spotted Mr. Guo’s new app, and Mr. Kosmynka wanted to know how it had gotten published.

“This app and any Guo Wengui app cannot be on the China store,” he wrote, according to the emails filed in the court case. “Can we put the necessary pieces in place to prevent that ASAP.”

Apple pulled the app and began investigating. A resulting report said the app was published because the “China hide process was not followed,” according to court documents. It said that Mr. Pham, the app reviewer, should have sent the app to Apple’s Chinese language specialists, who had been trained on which topics to block in the Chinese App Store, including Mr. Guo.

When Apple managers questioned Mr. Pham, he told them the app didn’t violate any policies. The managers responded that the app criticized the Chinese government, Mr. Pham said in court documents, and that this was enough for rejection.

Six months later, Apple fired Mr. Pham. In response, he sued the company, accusing it of pushing him out to appease the Chinese government.

Apple said it removed Mr. Guo’s app in China because it had determined it was illegal there. Apple said it fired Mr. Pham because of poor performance.

Mr. Guo’s media outlets have a history of peddling misinformation. The exact nature of the apps in the 2018 case was unclear, though court documents said they discussed Chinese Communist Party corruption.

Phillip Shoemaker, who ran Apple’s App Store from 2009 to 2016, said in an interview that Apple lawyers in China gave his team a list of topics that couldn’t appear in apps in the country, including Tiananmen Square and independence for Tibet and Taiwan. He said Apple’s policy was matter-of-fact: If the lawyers believed a topic was off-limits in China, then Apple would remove it there.

On Chinese iPhones, Apple forbids apps about the Dalai Lama while hosting those from the Chinese paramilitary group accused of detaining and abusing Uyghurs, an ethnic minority group in China.

The company has also helped China spread its view of the world. Chinese iPhones censor the emoji of the Taiwanese flag, and their maps suggest Taiwan is part of China. For a time, simply typing the word “Taiwan” could make an iPhone crash, according to Patrick Wardle, a former hacker at the National Security Agency.

Sometimes, Mr. Shoemaker said, he was awakened in the middle of the night with demands from the Chinese government to remove an app. If the app appeared to mention the banned topics, he would remove it, but he would send more complicated cases to senior executives, including Mr. Cue and Mr. Schiller.

Apple resisted an order from the Chinese government in 2012 to remove The Times’s [New York Times newspaper] apps. But five years later, it ultimately did. Mr. Cook approved the decision, according to two people with knowledge of the matter who spoke on the condition of anonymity.

Apple recently began disclosing how often governments demand that it remove apps. In the two years ending June 2020, the most recent data available, Apple said it approved 91 percent of the Chinese government’s app-takedown requests, removing 1,217 apps.

In every other country combined over that period, Apple approved 40 percent of requests, removing 253 apps. Apple said that most of the apps it removed for the Chinese government were related to gambling or pornography or were operating without a government license, such as loan services and livestreaming apps.

Yet a Times analysis of Chinese app data suggests those disclosures represent a fraction of the apps that Apple has blocked in China. Since 2017, roughly 55,000 active apps have disappeared from Apple’s App Store in China, according to a Times analysis of data compiled by Sensor Tower, an app data firm. Most of those apps have remained available in other countries.
More than 35,000 of those apps were games, which in China must get approval from regulators. The remaining 20,000 cut across a wide range of categories. Apps that mapped users’ runs, edited selfies or taught sexual positions were removed. So were apps that allowed users to message privately, share documents and browse websites the Chinese government had blocked. More than 600 news apps also disappeared.

Apple disputed those figures, saying that some developers remove their own apps from China. Apple said that since 2017, it had taken down 70 news apps in response to Chinese government demands.

The discrepancy between Apple’s disclosures and the Times analysis is in part because Apple is removing apps before China’s internet censors even complain. Apple does not disclose such takedowns in its statistics.

Mr. Shoemaker said he and his team rationalized removing apps by framing them as simply enforcing a country’s laws. Similar steps were taken in places like Saudi Arabia and Russia, he said. “At the same time, we didn’t want to get hauled up in front of the Senate to talk about why we’re quote ‘censoring apps in China,’” he said. “It was a tightrope we had to walk.”
https://www.nytimes.com/2021/05/17/tech ... -data.html

The goal of Apple is making money for their shareholders.
"Everyone is entitled to their own opinion, but not their own facts." - Daniel Patrick Moynihan

Re: "Apple plans to scan US iPhones for child abuse imagery"

50
The goal of Apple is making money for their shareholders.
That's the goal of any corporation or business. As for China, Russia or any other authoritarian state, you play by their rules or you don't play. The other cloud players all do the same thing because of the market size. Now Apple is moving much of its manufacturing out of China to India, just as Google is doing.
Facts do not cease to exist because they are ignored.-Huxley
"We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can't have both." ~ Louis Brandeis,
