Unquestionably, data privacy is of enormous concern to many internet users, which is why the push for stricter privacy protections and encryption on online platforms has grown alongside the rise of social media. Yet while a majority of online users worry about how their data is used and seen by the tech giants, the victims of online crimes often advocate for more intrusion, not less. A key example is the fight against online pedophilia.
The Rise of Online Pedophilia
The rising threat of online pedophilia cannot be overlooked. The age of online communication has made conversations with strangers feel normal. With children owning their own tablets and laptops, and therefore having more freedom over how they spend their time online, pedophiles have turned to social media platforms and gaming websites to lure children into sharing explicit material with them. Europe has surpassed the United States this year as the region with the most pedophilic activity online. The issue has become even more urgent as lockdown procedures have caused children to spend more time online, giving offenders more access to them. These offenders often pose as children, or as an adult confidant, and trap children by threatening to share their explicit images if they do not obey. Obeying almost always means sharing even more explicit pictures, and sometimes videos. Such material then ends up in dark-web “communities”, a corner of the internet notorious for all kinds of illegal activity.
Although a key player in the fight for online anonymity and privacy, the dark web is more often than not used for illegal purposes. Though it is infamous for hosting platforms dedicated to drug and organ sales, a study shows that over 80% of dark web visits relate to pedophilia, even though pedophilic websites account for only 2% of websites on the dark web. This disproportion suggests that those acting on these urges are both skilled and determined in turning the internet into a hub for themselves and their “communities”.
Mention of the dark web often makes the issue sound technical and remote, yet the material that ends up there is often trafficked through mainstream services such as Facebook Messenger, Microsoft Bing, Amazon Video Services and Dropbox. This is why victims call for more scanning and control over the data exchanged on these platforms, as identifying illegal material, such as imagery of abuse, could save lives. As technology advances, such material becomes more accessible, and hence more dangerous.
Technological Accessibility
Child sexual abuse and pedophilia are nothing new. These acts predate the technological boom, when illegal material was exchanged through physical videotapes. The advancement of technology has given pedophiles more anonymity and security, and the ability to conduct criminal activity from the comfort of their own homes, resulting in a surge of online pedophilic activity. The absence of geographical boundaries also makes it harder for law enforcement to intervene. Both new material and the resharing of existing illegal material are major issues. Disturbingly, those who were abused and filmed years ago can still face the consequences of their online material, as offenders often seek out the victims in real life. Sextortion cases, in which children are tricked and then blackmailed, often end in suicide, as the victims carry the weight of their abuse for as long as the material is shared and viewed.
With the amount of reported content growing 50% in the last year, reaching 45 million online photos and videos of children being abused, the role tech giants play in combatting online pedophilia becomes ever more significant. While law enforcement agencies are underfunded and understaffed, the responsibility falls on tech companies to do their best to find and report illegal material. Facebook accounted for 85% of all such reports last year, while other companies, such as Apple, reported dramatically fewer images: 3,000 photos in total, and zero videos. Dropbox made 25,000 reports covering 250,000 photos, as one report often bundles the image and video exchanges made by a single user. Microsoft, for its part, bears a huge share of the burden, as its Bing search engine surfaces images of children being sexually abused when certain terms are searched. The numbers indicate that the world’s most popular platforms are swarming with illegal content. The reports come from automated scans “that only recognise previously flagged material”: uploaded images are compared against fingerprints of previously identified illegal material, and if there is a match, the company is alerted. Though they bear the responsibility, most argue that tech companies are “turning a blind eye”. One issue is that the industry lacks a common standard for identifying abusive content; another is that little is done to identify new material.
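To make that limitation concrete, here is a minimal sketch of such a scan, written in Python as an assumption of how the flow works rather than any company’s actual pipeline: every uploaded file is fingerprinted and checked against a database of fingerprints of previously flagged material, so only known files ever trigger an alert. Real systems rely on perceptual fingerprints such as PhotoDNA (described below) rather than the exact cryptographic hash used here, and the names KNOWN_FLAGGED_HASHES, scan_upload and alert_moderators are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of previously flagged material.
# In practice companies receive hash lists from clearinghouses, never the
# images themselves.
KNOWN_FLAGGED_HASHES = {
    "9f2b7c1d...",  # placeholder entry
}

def fingerprint(file_bytes: bytes) -> str:
    """Compute a fingerprint of the uploaded file (here, a plain SHA-256 hash)."""
    return hashlib.sha256(file_bytes).hexdigest()

def alert_moderators() -> None:
    """Stand-in for escalating a match to human review and reporting."""
    print("Match against known material: escalating for review and report.")

def scan_upload(file_bytes: bytes) -> bool:
    """Return True, and raise an alert, only if the file matches known material."""
    if fingerprint(file_bytes) in KNOWN_FLAGGED_HASHES:
        alert_moderators()
        return True
    # Genuinely new, never-before-flagged material passes through undetected.
    return False
```

The sketch makes the two criticisms visible: a scan like this can only recognise material that has already been flagged somewhere, and an exact hash changes completely when an image is cropped or re-encoded, which is why perceptual fingerprinting of the kind described in the next section matters.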
Tech companies are legally required to report images of child abuse, but only once they find them, and finding and flagging such images and videos is not these companies’ forte. As studies show, criminals use cutting-edge technology to avoid the police, and the tech companies are not sufficiently equipped to deal with the issue. Law enforcement agencies find the steps taken by tech companies to be inconsistent, largely unilateral and insufficiently aggressive. Amazon does not even look for such imagery, and Apple fails to scan its cloud storage. The encryption of messaging apps, which Apple already applies and which Facebook has recently announced it intends to adopt, is troubling because exchanged material can no longer be scanned. Dropbox, Google and Microsoft do scan stored material, but only when it is shared. Offenders evade suspicion and detection by sharing log-in details instead of the files themselves, so their crimes go undetected.
The companies explain their lack of sufficient measures through their privacy policies. Amazon says that “privacy of customer data is critical to earning our customers’ trust”, and Facebook makes the same argument in response to the backlash over the encryption of its Messenger app. Though this reassures regular users, who know their information is out of sight, it gives online pedophile communities more room to commit their crimes. Lacking though these measures are, the companies have still produced some useful tools that offer hope on the matter.
What methods are in place right now?
In 2009, Microsoft and Hany Farid created an image-detection technology called PhotoDNA. The system works by converting every image into a square, black-and-white version for consistency. It then breaks the image into a grid and assigns each cell a number according to the visual features it contains; together, these numbers form the image’s fingerprint. The fingerprint is then compared against those of known illegal images, and if two fingerprints match, the system is alerted. Almost none of the photos detected last year would have been found without PhotoDNA. Though Microsoft uses PhotoDNA extensively for its own services, it is not available to other platforms.
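The actual PhotoDNA algorithm is proprietary, so the following is only a toy sketch of a grid-based perceptual fingerprint in the spirit of the description above, assuming the Pillow imaging library; the grid size and match threshold are illustrative values, not PhotoDNA’s.

```python
from PIL import Image  # Pillow

GRID = 16  # the normalised image is divided into a 16x16 grid (illustrative)

def fingerprint(path: str) -> list[int]:
    """Toy perceptual fingerprint: squash the image into a small black-and-white
    square, then record one number (average brightness) per grid cell."""
    img = Image.open(path).convert("L").resize((GRID * 8, GRID * 8))
    pixels = img.load()
    cell = 8
    fp = []
    for gy in range(GRID):
        for gx in range(GRID):
            total = sum(
                pixels[gx * cell + x, gy * cell + y]
                for y in range(cell)
                for x in range(cell)
            )
            fp.append(total // (cell * cell))
    return fp

def is_match(fp_a: list[int], fp_b: list[int], threshold: int = 2000) -> bool:
    """Fingerprints of visually similar images stay close even after resizing
    or re-encoding; a small overall distance counts as a match."""
    return sum(abs(a - b) for a, b in zip(fp_a, fp_b)) <= threshold
```

Scanning then amounts to computing the fingerprint of each uploaded image and checking is_match against every entry in a database of known fingerprints, which is far more robust to cropping and re-compression than an exact hash.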
Google, on the other hand, has its own video-detection technology that it makes available to other companies, and so does Facebook. The problem? The fingerprints generated by these systems are incompatible with one another, so the companies cannot pool their efforts for the greater good. One piece of good news is that in 2017 the tech industry approved a process for sharing video fingerprints so that all companies could cooperate in the fight.
Another system created by Microsoft is called Project Artemis. Instead of detecting images, it analyses conversation patterns in online chatrooms to determine whether a child is being groomed by the other user. Conversations receive a numbered rating that gauges the likelihood of grooming, and if a chat reaches a certain score, a human moderator is alerted to review it. When a threat is identified, it is reported to law enforcement. Microsoft has been using this technology on Xbox and Skype, platforms where offenders are very likely to operate. Thankfully, the tool is available for free, through the nonprofit organisation Thorn, to other companies that provide online chat functions.
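Microsoft has not published how Project Artemis scores conversations, so the sketch below only illustrates the general rating-and-threshold idea in Python; the patterns, weights and ALERT_THRESHOLD are invented for illustration and bear no relation to the real system.

```python
import re

# Hypothetical grooming-risk signals and weights; the real system is trained
# on far richer conversational features and is not public.
RISK_PATTERNS = {
    r"\bhow old are you\b": 1,
    r"\bdon'?t tell (your )?(mom|dad|parents)\b": 3,
    r"\bsend (me )?a (pic|photo|picture)\b": 3,
    r"\bour (little )?secret\b": 2,
}
ALERT_THRESHOLD = 5  # illustrative cut-off

def risk_score(messages: list[str]) -> int:
    """Give the conversation a numeric rating based on matched signals."""
    score = 0
    for msg in messages:
        for pattern, weight in RISK_PATTERNS.items():
            if re.search(pattern, msg.lower()):
                score += weight
    return score

def needs_human_review(messages: list[str]) -> bool:
    """If the rating crosses the threshold, escalate to a human moderator,
    who decides whether to report the chat to law enforcement."""
    return risk_score(messages) >= ALERT_THRESHOLD
```

The key design point is that the automated score never reports anyone directly; it only decides which conversations a human moderator looks at.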
For these tools to be used effectively in the fight against online pedophilia, consumers need to consent to their data being scanned in this way, raising a dilemma between the fight for data privacy and the fight against online pedophilia.
The Fight Against Data Privacy
Law enforcement officers posing as children online to catch offenders is clearly not enough to combat this growing issue. Full collaboration from the tech companies is urgently needed to tackle online pedophilia, and regulated access to the material exchanged on their platforms is the easiest and fastest way to achieve it. The companies need to scan and regulate the content of the data exchanged online in order to identify and save children, as well as to stop the circulation of illegal and disturbing images and videos.
The tools mentioned above can only function if messages are not encrypted. The news that Facebook would encrypt its Messenger app was met with relief by ordinary users, but given the app’s popularity among offenders, is it really something to celebrate? The dilemma between the fight for data privacy and the fight against it in order to tackle online pedophilia is a tough one, and the moral questions posed by either side are worthy of far more effort and investment from these tech giants.