{"id":25179,"date":"2021-08-08T15:50:10","date_gmt":"2021-08-08T13:50:10","guid":{"rendered":"https:\/\/www.apfelpatient.de\/?p=25179"},"modified":"2021-08-08T15:51:06","modified_gmt":"2021-08-08T13:51:06","slug":"icloud-photos-imessage-what-you-need-to-know-about-apples-child-protection","status":"publish","type":"post","link":"https:\/\/www.apfelpatient.de\/en\/allgemein\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen","title":{"rendered":"iCloud Photos &amp; iMessage: What you need to know about Apple&#039;s child protection"},"content":{"rendered":"<p class=\"has-drop-cap\"><strong>Apple&#039;s recent announcement about child protection in relation to the rating of iCloud photos and iMessage notifications has caused a lot of discussion around the world. In addition to security researchers and other observers, normal users like you and me are of course also debating. But there is something that we all need to be aware of. Despite Apple&#039;s new feature - which is limited to the USA for now - nothing changes in Apple&#039;s data protection guidelines. Below we will examine the most important questions together and try to provide more clarity.<\/strong><\/p>\n\n\n\n<p>Apple released a new set of tools on August 5, 2021 <a href=\"https:\/\/www.apfelpatient.de\/en\/news\/imessage-siri-icloud-photos-apple-expands-child-protection-features\">announced<\/a>to help protect children online and reduce the spread of Child Sexual Abuse Material (CSAM). 
These include new features in iMessage, Siri, and Search, as well as a mechanism that scans iCloud Photos for known CSAM images.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\" id=\"h-csam-scan-reaktionen-fallen-unterschiedlich-aus\">CSAM scan: reactions vary<\/h3>\n\n\n\n\n\n\n\n<p>In addition to fans, data protection and online security experts and other observers also have mixed opinions on Apple\u2019s <a href=\"https:\/\/www.apple.com\/child-safety\/\" target=\"_blank\" rel=\"noreferrer noopener\">announcement<\/a>. But interestingly, many people underestimate how widespread the practice of scanning image databases for CSAM really is. Apple is certainly not the first company on the market to do this - it has just attracted more attention. Moreover, Apple is by no means giving up on privacy protection. So, let&#039;s get started.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Apple&#039;s privacy features<\/h3>\n\n\n\n<p>The company&#039;s suite of parental controls includes the aforementioned iCloud photo scanning, as well as updated tools and resources in Siri and Search. It also includes a feature designed to flag inappropriate images sent to or from minors via iMessage. As Apple noted in its original announcement, all of the features were designed with privacy in mind. Both the iMessage feature and the iCloud photo scan, for example, leverage on-device intelligence.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">\u201cScanner\u201d compares mathematical hashes<\/h3>\n\n\n\n<p>Additionally, the new iCloud Photos &quot;scanner&quot; doesn&#039;t actually scan or analyze the images on the user&#039;s iPhone. Instead, it compares mathematical hashes of known CSAM with hashes of the images stored in iCloud. If a collection of known CSAM images has been stored in iCloud, the account is flagged and subjected to a manual investigation by Apple. 
Only then, if the match is actually positive, is the account suspended and a report sent to the National Center for Missing &amp; Exploited Children (NCMEC). There are elements in the system that ensure that the error rate is negligible - one in a trillion, in fact. This is due to a &quot;detection threshold,&quot; which Apple does not want to explain in more detail - which is, of course, understandable and a good thing.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"403\" src=\"https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-08-um-15.45.14-1024x403.png\" alt=\"\" class=\"wp-image-25181\" srcset=\"https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-08-um-15.45.14-1024x403.png 1024w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-08-um-15.45.14-300x118.png 300w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-08-um-15.45.14-768x302.png 768w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-08-um-15.45.14-1536x604.png 1536w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-08-um-15.45.14-750x295.png 750w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-08-um-15.45.14-1140x449.png 1140w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-08-um-15.45.14.png 1596w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><figcaption>Image: Apple<\/figcaption><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">New iMessage system is even more privacy-friendly<\/h3>\n\n\n\n\n\n\n\n<p>In addition, we should not forget that the matching is only done in conjunction with iCloud Photos. This means that if a user disables iCloud Photos in the system settings, the CSAM matching is disabled as well. 
The iMessage system is even more privacy-friendly. It only applies to accounts belonging to children and is therefore opt-in, not opt-out. In addition, it does not generate reports that are sent to external parties - only the children&#039;s parents are notified that an inappropriate message has been received or sent. The bottom line is that the new features in iCloud Photos and iMessage work completely independently of each other and therefore actually - technically - have nothing to do with each other. Their only commonality is protecting children.<\/p>\n\n\n\n<ul class=\"wp-block-list\"><li>The iCloud Photos check hashes images without examining their context, compares the hashes against known CSAM collections, and creates a report that is sent to the NCMEC, which maintains the hash database.<\/li><li>The child-account iMessage feature uses on-device machine learning, does not compare images to CSAM databases, and does not send reports to Apple\u2014only to the parent&#039;s Family Sharing manager account.<\/li><\/ul>\n\n\n\n<p>So let&#039;s answer a few important questions on the topic. 
Let&#039;s go through the most common questions that have been raised online so far.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"555\" src=\"https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-05-um-21.46.36-1024x555.png\" alt=\"Apple parental controls\" class=\"wp-image-25144\" srcset=\"https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-05-um-21.46.36-1024x555.png 1024w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-05-um-21.46.36-300x163.png 300w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-05-um-21.46.36-768x416.png 768w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-05-um-21.46.36-1536x832.png 1536w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-05-um-21.46.36-750x406.png 750w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-05-um-21.46.36-1140x618.png 1140w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-05-um-21.46.36.png 1600w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><figcaption>Image: Apple<\/figcaption><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">So will Apple scan all my photos?<\/h3>\n\n\n\n<p>Apple doesn&#039;t scan your photos. It checks the numerical value associated with each photo against a database of known illegal content to see if they match. So the system doesn&#039;t see the image, it sees the neural hash. It also only checks images that have been uploaded to iCloud. 
That means the system can&#039;t detect images with hashes that aren&#039;t in the database.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"607\" src=\"https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-06-um-16.51.00-1024x607.png\" alt=\"iCloud Photos CSAM scan\" class=\"wp-image-25155\" srcset=\"https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-06-um-16.51.00-1024x607.png 1024w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-06-um-16.51.00-300x178.png 300w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-06-um-16.51.00-768x456.png 768w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-06-um-16.51.00-750x445.png 750w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-06-um-16.51.00-1140x676.png 1140w, https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/Bildschirmfoto-2021-08-06-um-16.51.00.png 1514w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><figcaption>Image: Apple<\/figcaption><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">Can I deactivate the comparison?<\/h3>\n\n\n\n<p>As mentioned above, Apple has <a href=\"https:\/\/www.apfelpatient.de\/en\/news\/icloud-photos-csam-check-can-be-deactivated\">confirmed<\/a> to several US media outlets that the system can only detect CSAM in iCloud Photos - so if you turn off iCloud Photos, no matching will take place.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">iMessage: Can I still send nude photos to my partner?<\/h3>\n\n\n\n\n\n\n\n<p>If you are of legal age, Apple will not flag you. As mentioned above, this mechanism only applies to child accounts and must first be activated - i.e. opt-in. 
This means that Apple will not stop you from sending nude photos to your partner.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Can I still take pictures of my children in the bathtub?<\/h3>\n\n\n\n<p>Here, too, some have misunderstood Apple&#039;s new features. As I said, CSAM detection cannot recognize the context of individual images. That is, the mechanism cannot distinguish a rose from a genital organ. (I know, weird comparison!) It recognizes neither one nor the other. It simply compares the image hashes of the photos stored in iCloud with a database of known CSAM maintained by the NCMEC. In other words, if your children&#039;s images are not hashed in some way in the NCMEC database, then they will not be flagged by the iCloud photo scanning mechanism.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Help \u2013 I am falsely accused of possessing CSAM<\/h3>\n\n\n\n<p>Apple&#039;s system is designed in such a way that false positives occur at a rate of one in a trillion (1:1,000,000,000,000). For comparison: the chance of being struck by lightning is 1:15,000, and even winning the lottery jackpot is more likely than a false positive. If this unlikely event occurs, a manual review is initiated at Apple, at which point, at the latest, it would become clear that the flag was an error. In short: Neither pictures of your children in the bathtub nor naked pictures of yourself will produce such a false positive.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Which countries will receive these features?<\/h3>\n\n\n\n<p>Apple&#039;s parental controls will initially only be rolled out in the US, but Apple has confirmed that it will consider offering these features in other countries after a legal assessment. This suggests that Apple is at least considering a rollout beyond the US.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Which devices are affected?<\/h3>\n\n\n\n<p>The new measures will appear in iOS 15, iPadOS 15, watchOS 8 and macOS 12 Monterey, i.e. on iPhone, iPad, Apple Watch and Mac. 
The exception is CSAM detection, which will only be available on iPhone and iPad.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">When will these new measures come into force?<\/h3>\n\n\n\n<p>Apple says all features will be released \u201clater this year,\u201d meaning in 2021. The features could therefore go live with the launch of the new software generation or be activated in a later point release.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Data protection concerns: They are of course justified<\/h3>\n\n\n\n<p>Despite Apple&#039;s explanations and extensive disclosures, many users have privacy concerns. Many cryptography and security experts are also concerned. For example, some experts fear that this type of mechanism could open the door to abuse by authoritarian governments. Although Apple&#039;s system is only designed to detect CSAM, they fear that repressive governments could force Apple to revise it to also detect dissenting or anti-government messages. The prominent digital rights nonprofit Electronic Frontier Foundation, for instance, notes that a similar system originally designed to detect CSAM was reworked to create a database of &quot;terrorist&quot; content. Apple has since commented on this, explaining the following - a quote from one of my previous <a href=\"https:\/\/www.apfelpatient.de\/en\/news\/csam-detection-apple-could-expand-the-system-on-a-country-basis\">articles<\/a>:<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\"><p>Apple&#039;s new CSAM detection system will be limited to the United States at launch. 
To avoid the risk that some governments might try to abuse the system, Apple confirmed that the company will consider a possible global expansion of the system after a legal assessment on a country-by-country basis.<\/p><\/blockquote>\n\n\n\n<p>It is therefore not to be expected that the feature will ever be available worldwide; certain markets are likely to remain excluded.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Apple&#039;s CSAM system is not new<\/h3>\n\n\n\n\n\n\n\n<p>Whenever Apple announces something new, it is discussed worldwide. Unlike many companies, the bitten apple receives a lot of attention. This latest case was no exception. This leads some to believe that the company&#039;s new child protection measures are somehow unique. In reality, they are not. As The Verge reported in 2014, Google has been scanning Gmail users&#039; inboxes for known CSAM since 2008. Some of these scans have even led to the arrest of people sending CSAM. Google is also working on flagging CSAM and removing it from search results. But that&#039;s not all: Microsoft originally developed the system that the NCMEC uses. PhotoDNA, donated by Microsoft, compares image hashes and detects CSAM even when an image has been altered. And Facebook?<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">CSAM scanning is now common practice<\/h3>\n\n\n\n<p>Well, Facebook started using it in 2011, while Twitter adopted it in 2013. It&#039;s also used by Dropbox and similar services. So is Apple a latecomer? In a way, yes. Cupertino has already admitted that it scans for CSAM. In 2019, the company updated its privacy policy to say that it scans for &quot;potentially illegal content, including child sexual exploitation material.&quot; Apple has now simply refined this approach. 
So while Apple positions itself as a uniquely privacy-respecting company, this type of CSAM scanning is commonplace among internet companies\u2014and for good reason.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Conclusion<\/h3>\n\n\n\n<p>While privacy is a fundamental human right, any technology that curbs the spread of CSAM is also inherently good. Apple has managed to develop a system that works toward the latter goal without significantly compromising the privacy of the average person. (Photo by Unsplash \/ Carles Rabada)<\/p>\n\n\n\n<ul class=\"wp-block-list\"><li><a href=\"https:\/\/www.apfelpatient.de\/en\/tips-tricks\/ios-15-live-text-copy-paste-text-from-photos-how-to-do-it\">iOS 15 Live Text: Copy &amp; paste text from photos \u2013 here\u2019s how<\/a><\/li><\/ul>\n\n\n\n<h6 class=\"wp-block-heading translation-block\" id=\"h-kennt-ihr-schon-unsere-amazon-storefront-dort-findet-ihr-eine-handverlesene-auswahl-von-diversen-produkten-f-r-euer-iphone-und-co-viel-spa-beim-st-bern\"><em>Have you already visited our Amazon Storefront? 
There you&#039;ll find a hand-picked selection of various products for your iPhone and other devices \u2013 <span class=\"has-inline-color has-vivid-red-color\"><a href=\"https:\/\/www.amazon.de\/shop\/apfelpatientofficial\" class=\"ek-link\" data-wpel-link=\"exclude\" rel=\"follow noopener\" target=\"_self\"><span style=\"text-decoration: underline\" class=\"ek-underline\">enjoy browsing<\/span><\/a> !<\/span><\/em><\/h6>\n\n\n\n<h6 class=\"wp-block-heading translation-block\" id=\"h-der-beitrag-enthalt-partnerlinks\">This post contains <a data-type=\"URL\" data-id=\"https:\/\/www.apfelpatient.de\/partnerprogramm\" href=\"https:\/\/www.apfelpatient.de\/en\/partner-program\" data-wpel-link=\"internal\" target=\"_self\">affiliate links<\/a>.<\/h6>","protected":false},"excerpt":{"rendered":"<p>CSAM: Apple&#039;s parental control announcement regarding iCloud Photos and iMessage ratings has sparked a lot of discussion.<\/p>","protected":false},"author":2,"featured_media":25178,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"jnews-multi-image_gallery":[],"jnews_single_post":{"subtitle":"","format":"standard","video":"","gallery":"","source_name":"","source_url":"","via_name":"","via_url":"","override":[{"single_blog_custom":"","sidebar":"","second_sidebar":"","share_position":"","share_float_style":"","post_date_format":"","post_date_format_custom":"","post_reading_time_wpm":"","zoom_button_out_step":"1","zoom_button_in_step":"1","number_popup_post":"1"}],"image_override":[{"single_post_thumbnail_size":"","single_post_gallery_size":""}],"trending_post_position":"","trending_post_label":"","sponsored_post_label":"","sponsored_post_name":"","sponsored_post_url":"","sponsored_post_logo":"","sponsored_post_desc":""},"jnews_primary_category":{"id":"2"},"footnotes":""},"categories":[2],"tags":[34,3,360,33,573,4,638,21,620],"class_list":["post-25179","post","type-post","status-publish","format-standard","has-post-thumb
nail","hentry","category-allgemein","tag-apple-dienste","tag-ios","tag-ios-15","tag-ipados","tag-ipados-15","tag-macos","tag-macos-monterey","tag-watchos","tag-watchos-8"],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v27.5 (Yoast SEO v27.5) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>iCloud-Fotos &amp; iMessage: Das musst du zu Apples Kinderschutz wissen<\/title>\n<meta name=\"description\" content=\"CSAM: Apples Ank\u00fcndigung zum Kinderschutz in Bezug auf die Bewertung von iCloud-Fotos und iMessage hat f\u00fcr eine Menge Diskussionen gesorgt.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.apfelpatient.de\/en\/generally\/icloud-photos-imessage-what-you-need-to-know-about-apples-child-protection\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"iCloud-Fotos &amp; iMessage: Das musst du zu Apples Kinderschutz wissen\" \/>\n<meta property=\"og:description\" content=\"CSAM: Apples Ank\u00fcndigung zum Kinderschutz in Bezug auf die Bewertung von iCloud-Fotos und iMessage hat f\u00fcr eine Menge Diskussionen gesorgt.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.apfelpatient.de\/en\/generally\/icloud-photos-imessage-what-you-need-to-know-about-apples-child-protection\" \/>\n<meta property=\"og:site_name\" content=\"Apfelpatient\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/apfelpatientOfficial\" \/>\n<meta property=\"article:author\" content=\"https:\/\/www.facebook.com\/apfelpatientOfficial\" \/>\n<meta property=\"article:published_time\" content=\"2021-08-08T13:50:10+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2021-08-08T13:51:06+00:00\" \/>\n<meta property=\"og:image\" 
content=\"https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/carles-rabada-ktWur2xM1hs-unsplash.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1600\" \/>\n\t<meta property=\"og:image:height\" content=\"899\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Milan\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:description\" content=\"CSAM: Apples Ank\u00fcndigung zum Kinderschutz in Bezug auf die Bewertung von iCloud-Fotos und iMessage hat f\u00fcr eine Menge Diskussionen gesorgt.\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Milan\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"8 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"NewsArticle\",\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/allgemein\\\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/allgemein\\\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen\"},\"author\":{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/en\\\/#organization\",\"name\":\"Apfelpatient\",\"url\":\"https:\\\/\\\/www.apfelpatient.de\\\/en\\\/\"},\"headline\":\"iCloud-Fotos &#038; iMessage: Das musst du zu Apples Kinderschutz 
wissen\",\"datePublished\":\"2021-08-08T13:50:10+00:00\",\"dateModified\":\"2021-08-08T13:51:06+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/allgemein\\\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen\"},\"wordCount\":1705,\"publisher\":{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/en\\\/#organization\",\"name\":\"Apfelpatient\",\"url\":\"https:\\\/\\\/www.apfelpatient.de\\\/en\\\/\"},\"image\":{\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/allgemein\\\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.apfelpatient.de\\\/wp-content\\\/uploads\\\/2021\\\/08\\\/carles-rabada-ktWur2xM1hs-unsplash.jpg\",\"keywords\":[\"Apple Dienste\",\"iOS\",\"iOS 15\",\"iPadOS\",\"iPadOS 15\",\"macOS\",\"macOS 12 Monterey\",\"watchOS\",\"watchOS 8\"],\"articleSection\":\"Allgemein\",\"inLanguage\":\"en-US\",\"copyrightYear\":\"2021\",\"copyrightHolder\":{\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/en\\\/#organization\"}},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/allgemein\\\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen\",\"url\":\"https:\\\/\\\/www.apfelpatient.de\\\/allgemein\\\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen\",\"name\":\"iCloud-Fotos & iMessage: Das musst du zu Apples Kinderschutz 
wissen\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/allgemein\\\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/allgemein\\\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.apfelpatient.de\\\/wp-content\\\/uploads\\\/2021\\\/08\\\/carles-rabada-ktWur2xM1hs-unsplash.jpg\",\"datePublished\":\"2021-08-08T13:50:10+00:00\",\"dateModified\":\"2021-08-08T13:51:06+00:00\",\"description\":\"CSAM: Apples Ank\u00fcndigung zum Kinderschutz in Bezug auf die Bewertung von iCloud-Fotos und iMessage hat f\u00fcr eine Menge Diskussionen gesorgt.\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/allgemein\\\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/www.apfelpatient.de\\\/allgemein\\\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/allgemein\\\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen#primaryimage\",\"url\":\"https:\\\/\\\/www.apfelpatient.de\\\/wp-content\\\/uploads\\\/2021\\\/08\\\/carles-rabada-ktWur2xM1hs-unsplash.jpg\",\"contentUrl\":\"https:\\\/\\\/www.apfelpatient.de\\\/wp-content\\\/uploads\\\/2021\\\/08\\\/carles-rabada-ktWur2xM1hs-unsplash.jpg\",\"width\":1600,\"height\":899,\"caption\":\"Photo by Unsplash \\\/ Carles 
Rabada\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/allgemein\\\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Startseite\",\"item\":\"https:\\\/\\\/www.apfelpatient.de\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"iCloud-Fotos &#038; iMessage: Das musst du zu Apples Kinderschutz wissen\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/#website\",\"url\":\"https:\\\/\\\/www.apfelpatient.de\\\/\",\"name\":\"Apfelpatient\",\"description\":\"Alles rund um Apple!\",\"publisher\":{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/en\\\/#organization\",\"name\":\"Apfelpatient\",\"url\":\"https:\\\/\\\/www.apfelpatient.de\\\/en\\\/\"},\"alternateName\":\"Apfelpatient\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/www.apfelpatient.de\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":[\"Person\",\"Organization\"],\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/#\\\/schema\\\/person\\\/7dc56ad6c8e9824b76d2c08d4c140e13\",\"name\":\"Milan\",\"logo\":{\"@id\":\"https:\\\/\\\/www.apfelpatient.de\\\/#\\\/schema\\\/person\\\/image\\\/\"},\"description\":\"Hallo und herzlich willkommen auf meinem Technik-Blog! Als gro\u00dfer Apple-Fan berichte ich hier \u00fcber alles, was mit Apple zu tun hat: von den neuesten News und spannenden Ger\u00fcchten \u00fcber hilfreiche Tipps und Tricks bis hin zu ausf\u00fchrlichen Produkttests. 
Wenn du genauso technikbegeistert bist wie ich, bist du hier genau richtig!\",\"sameAs\":[\"https:\\\/\\\/www.facebook.com\\\/apfelpatientOfficial\",\"https:\\\/\\\/www.linkedin.com\\\/in\\\/milan-jovicic-42aa0231b\"],\"url\":\"https:\\\/\\\/www.apfelpatient.de\\\/en\\\/author\\\/apfeladmin\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"iCloud Photos &amp; iMessage: What you need to know about Apple&#039;s child protection","description":"CSAM: Apple&#039;s parental controls announcement regarding iCloud Photos and iMessage ratings has sparked a lot of discussion.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.apfelpatient.de\/en\/generally\/icloud-photos-imessage-what-you-need-to-know-about-apples-child-protection","og_locale":"en_US","og_type":"article","og_title":"iCloud-Fotos & iMessage: Das musst du zu Apples Kinderschutz wissen","og_description":"CSAM: Apples Ank\u00fcndigung zum Kinderschutz in Bezug auf die Bewertung von iCloud-Fotos und iMessage hat f\u00fcr eine Menge Diskussionen gesorgt.","og_url":"https:\/\/www.apfelpatient.de\/en\/generally\/icloud-photos-imessage-what-you-need-to-know-about-apples-child-protection","og_site_name":"Apfelpatient","article_publisher":"https:\/\/www.facebook.com\/apfelpatientOfficial","article_author":"https:\/\/www.facebook.com\/apfelpatientOfficial","article_published_time":"2021-08-08T13:50:10+00:00","article_modified_time":"2021-08-08T13:51:06+00:00","og_image":[{"width":1600,"height":899,"url":"https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/carles-rabada-ktWur2xM1hs-unsplash.jpg","type":"image\/jpeg"}],"author":"Milan","twitter_card":"summary_large_image","twitter_description":"CSAM: Apples Ank\u00fcndigung zum Kinderschutz in Bezug auf die Bewertung von iCloud-Fotos und iMessage hat f\u00fcr eine Menge Diskussionen 
gesorgt.","twitter_misc":{"Written by":"Milan","Est. reading time":"8 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"NewsArticle","@id":"https:\/\/www.apfelpatient.de\/allgemein\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen#article","isPartOf":{"@id":"https:\/\/www.apfelpatient.de\/allgemein\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen"},"author":{"@type":"Organization","@id":"https:\/\/www.apfelpatient.de\/en\/#organization","name":"Apfelpatient","url":"https:\/\/www.apfelpatient.de\/en\/"},"headline":"iCloud-Fotos &#038; iMessage: Das musst du zu Apples Kinderschutz wissen","datePublished":"2021-08-08T13:50:10+00:00","dateModified":"2021-08-08T13:51:06+00:00","mainEntityOfPage":{"@id":"https:\/\/www.apfelpatient.de\/allgemein\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen"},"wordCount":1705,"publisher":{"@type":"Organization","@id":"https:\/\/www.apfelpatient.de\/en\/#organization","name":"Apfelpatient","url":"https:\/\/www.apfelpatient.de\/en\/"},"image":{"@id":"https:\/\/www.apfelpatient.de\/allgemein\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen#primaryimage"},"thumbnailUrl":"https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/carles-rabada-ktWur2xM1hs-unsplash.jpg","keywords":["Apple Dienste","iOS","iOS 15","iPadOS","iPadOS 15","macOS","macOS 12 Monterey","watchOS","watchOS 8"],"articleSection":"Allgemein","inLanguage":"en-US","copyrightYear":"2021","copyrightHolder":{"@id":"https:\/\/www.apfelpatient.de\/en\/#organization"}},{"@type":"WebPage","@id":"https:\/\/www.apfelpatient.de\/allgemein\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen","url":"https:\/\/www.apfelpatient.de\/allgemein\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen","name":"iCloud Photos &amp; iMessage: What you need to know about Apple&#039;s child 
protection","isPartOf":{"@id":"https:\/\/www.apfelpatient.de\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.apfelpatient.de\/allgemein\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen#primaryimage"},"image":{"@id":"https:\/\/www.apfelpatient.de\/allgemein\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen#primaryimage"},"thumbnailUrl":"https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/carles-rabada-ktWur2xM1hs-unsplash.jpg","datePublished":"2021-08-08T13:50:10+00:00","dateModified":"2021-08-08T13:51:06+00:00","description":"CSAM: Apple&#039;s parental controls announcement regarding iCloud Photos and iMessage ratings has sparked a lot of discussion.","breadcrumb":{"@id":"https:\/\/www.apfelpatient.de\/allgemein\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.apfelpatient.de\/allgemein\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.apfelpatient.de\/allgemein\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen#primaryimage","url":"https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/carles-rabada-ktWur2xM1hs-unsplash.jpg","contentUrl":"https:\/\/www.apfelpatient.de\/wp-content\/uploads\/2021\/08\/carles-rabada-ktWur2xM1hs-unsplash.jpg","width":1600,"height":899,"caption":"Photo by Unsplash \/ Carles Rabada"},{"@type":"BreadcrumbList","@id":"https:\/\/www.apfelpatient.de\/allgemein\/icloud-fotos-imessage-das-musst-du-zu-apples-kinderschutz-wissen#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Startseite","item":"https:\/\/www.apfelpatient.de\/"},{"@type":"ListItem","position":2,"name":"iCloud-Fotos &#038; iMessage: Das musst du zu Apples Kinderschutz 
wissen"}]},{"@type":"WebSite","@id":"https:\/\/www.apfelpatient.de\/#website","url":"https:\/\/www.apfelpatient.de\/","name":"apple patient","description":"Everything about Apple!","publisher":{"@type":"Organization","@id":"https:\/\/www.apfelpatient.de\/en\/#organization","name":"Apfelpatient","url":"https:\/\/www.apfelpatient.de\/en\/"},"alternateName":"Apfelpatient","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.apfelpatient.de\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":["Person","Organization"],"@id":"https:\/\/www.apfelpatient.de\/#\/schema\/person\/7dc56ad6c8e9824b76d2c08d4c140e13","name":"Milan","logo":{"@id":"https:\/\/www.apfelpatient.de\/#\/schema\/person\/image\/"},"description":"Hello and welcome to my technology blog! As a big Apple fan, I report on everything to do with Apple: from the latest news and exciting rumors to helpful tips and tricks and detailed product tests. 
If you are as enthusiastic about technology as I am, you have come to the right place!","sameAs":["https:\/\/www.facebook.com\/apfelpatientOfficial","https:\/\/www.linkedin.com\/in\/milan-jovicic-42aa0231b"],"url":"https:\/\/www.apfelpatient.de\/en\/author\/apfeladmin"}]}},"_links":{"self":[{"href":"https:\/\/www.apfelpatient.de\/en\/wp-json\/wp\/v2\/posts\/25179","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.apfelpatient.de\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.apfelpatient.de\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.apfelpatient.de\/en\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.apfelpatient.de\/en\/wp-json\/wp\/v2\/comments?post=25179"}],"version-history":[{"count":0,"href":"https:\/\/www.apfelpatient.de\/en\/wp-json\/wp\/v2\/posts\/25179\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.apfelpatient.de\/en\/wp-json\/wp\/v2\/media\/25178"}],"wp:attachment":[{"href":"https:\/\/www.apfelpatient.de\/en\/wp-json\/wp\/v2\/media?parent=25179"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.apfelpatient.de\/en\/wp-json\/wp\/v2\/categories?post=25179"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.apfelpatient.de\/en\/wp-json\/wp\/v2\/tags?post=25179"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}