On Friday, Google announced that it has open-sourced its machine learning tool Magritte. According to information sent to Gizmodo, the tool detects objects in images or videos and automatically blurs them when they appear on screen. Google notes that the type of object doesn’t matter: the blur can be applied to license plates or tattoos, for example.
Google also said the code is useful for video journalists who want to anonymize the subjects they interview with “high accuracy.” Magritte is in and of itself a very interesting tool that could be used well outside the realm of digital privacy. We don’t have to say it, but of course it could also be used to censor NSFW content around the web (it’s porn, folks, it’s always porn). The tool joins a host of other privacy-focused tools that Google developers have published on the web.
Besides Magritte, Google also touted another so-called privacy-enhancing technology (PET) called the Fully Homomorphic Encryption (FHE) Transpiler, a term that sounds like something straight out of a Star Trek script. The code lets developers work on encrypted data without being able to access personal user information. Google made the FHE Transpiler open source last year, and it has since been used by the company Duality to perform data analysis on normally restricted datasets. Duality claimed the data could be processed “even on unsecured systems” as it “complies with all different data protection laws at the same time.”
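For readers unfamiliar with the idea, the core trick behind homomorphic encryption is that certain operations on ciphertexts correspond to operations on the hidden plaintexts, so a third party can compute on data it cannot read. Google’s FHE Transpiler is far more sophisticated, but the principle can be sketched with a toy, deliberately insecure Paillier-style scheme (tiny fixed primes, chosen here purely for illustration), where multiplying two ciphertexts yields an encryption of the sum of the plaintexts:

```python
import math
import random

# Toy Paillier keypair with tiny fixed primes.
# Illustrative only -- real keys use primes hundreds of digits long.
p, q = 17, 19
n = p * q                      # public modulus
n2 = n * n
g = n + 1                      # standard choice of generator
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1), private
mu = pow(lam, -1, n)           # private: inverse of lambda mod n

def L(x):
    return (x - 1) // n

def encrypt(m):
    # A random r coprime to n blinds the ciphertext, so
    # encrypting the same message twice gives different results.
    while True:
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Homomorphic property: multiplying ciphertexts adds plaintexts.
a, b = 12, 30
c_sum = (encrypt(a) * encrypt(b)) % n2
print(decrypt(c_sum))  # 42 -- the sum, computed without ever decrypting a or b
```

The party holding only the public values can compute `c_sum` without learning `a` or `b`; only the key holder can decrypt the result. Fully homomorphic schemes extend this idea from addition to arbitrary computation.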
That’s a big claim, of course, but the approach does promise compliance with certain regulations in some cases. The European Union’s General Data Protection Regulation, for example, forces researchers to implement some level of data security for personally identifiable information, which can be anything from a person’s name to their email address, phone number, or government ID. The US, meanwhile, has a jumble of state and federal privacy laws, none of which has stopped many companies from buying or selling personal data of all stripes. Really, most companies great and small (along with the military and law enforcement, for that matter) have not been forced to anonymize much, if any, of the data they work with.
Google’s open-source FHE Transpiler seems like a good tool to give researchers the opportunity to glean useful data while keeping users’ private information private, though there won’t be much fuss so long as there is still no overarching data protection law in the US.
In its announcement, Google praised the benefits of its PET projects and its protected computing initiative. The company went on to say, “We believe that every internet user in the world deserves world-class privacy, and we will continue to work with organizations to achieve that goal.” The company also mentioned it is working on end-to-end encryption for Gmail, which would be a great development for one of the world’s largest email platforms.
That, of course, ignores Google’s own role in the privacy issues we see today. The company recently paid $392 million to settle a dispute with 40 attorneys general after it allegedly misled users about when it was tracking their location data.
Source: Google Opens Up Tool to Make Your Privates Private (https://gizmodo.com/google-privacy-ai-magritte-1849923870)