Data privacy could become one of the defining issues of our time, given the growing surveillance capabilities of large enterprises and governments. China is a telling example of what can happen when data privacy and anonymity on the internet are completely eroded. On the other hand, there is also great potential benefit in evaluating the vast data sources we have and using them for good, for example to stop terror attacks and other criminal activity before they happen.
Palantir is a US-based data-mining company that specializes in helping government agencies evaluate large amounts of data. In the past it has come under criticism for its work with intelligence services and for its lack of transparency. One of Palantir’s main tasks is preemptively stopping criminal offenses. The name ‘Palantir’ is a homage to the Lord of the Rings books by J.R.R. Tolkien, in which the so-called ‘Palantíri’ are crystal balls that can be used to communicate over long distances.
Palantir was founded in 2004 by Alex Karp, Peter Thiel, Joe Lonsdale, Stephen Cohen and Nathan Gettings, with early funding from In-Q-Tel, the CIA’s venture capital arm. Since then, Palantir has grown steadily: it now has more than 2,500 employees and operates from 25 international locations. Its headquarters were moved from Silicon Valley to Denver.
In a recent interview, CEO Alex Karp said the following about the company’s goals:
“The core mission of our company always was to make the West, especially America, the strongest in the world, the strongest it’s ever been, for the sake of global peace and prosperity, and we feel like this year we really showed what that would mean”
In 2020 many things are different. People work and study from home, wear face masks and face restrictions on their fundamental rights. These measures were taken to bring the global pandemic under control. More than 800,000 people have died as a result of Covid-19, the disease caused by SARS-CoV-2 (as of August 25, 2020).
“Let’s build an app for it” is the simple answer to many problems. So it is no surprise that many people asked for an app to fight Covid-19. In Germany, the “Corona Warn App” was published on June 16, 2020. Controversial questions discussed in public included:
How does such an app work?
What are the benefits?
Will the number of false positives become too high?
How secure is it?
Does this restrict privacy?
Today, the app has been available for two months and has been downloaded more than 17.5 million times. This seems like a good time to answer the questions above.
In the course of attending the lecture “Secure Systems” I became aware of a blog post by Geoff Huston on how the Domain Name System (DNS) handles “no such domain name” (NXDOMAIN) responses and which attack vectors could result from this. His analysis showed how little effort is needed to perform a Denial of Service (DoS) attack against arbitrary authoritative name servers. After a presentation on this subject I decided to delve deeper into the topic, and I came across the fuss about the new DNS over HTTPS (DoH) protocol earlier this year. The findings during my research inspired me to write a blog post of my own. As with any technology, there are two sides to the coin; it always depends on which perspective you take and what agenda you may pursue. For that reason, this blog post is not intended as a critique of the DoH protocol itself, which can be a valuable addition to the internet and has helpful uses. Instead, my focus is on how DoH is currently being implemented, viewed in the overall context. I will not go into the technical details of the DoH protocol and instead refer to the corresponding RFC 8484, which contains all this information.
It is widely known that tech companies like Apple or Google and their partners collect and analyse an increasing amount of information. This includes information about the users themselves, their interactions and their communication. It happens for seemingly good motives, such as:
Recommendation services: e.g. word suggestions on a smartphone keyboard
Customizing a product or service for the user
Creating and targeting personalised advertising
Further development of their product or service
Purely monetary motives: selling customer data (sometimes without the customer’s knowledge)
In data collection like this, clients’ or users’ privacy is often at risk. Privacy here includes confidentiality and secrecy. Confidentiality means that no party other than the recipient of a sent message can read it. In the special case of data collection, proper confidentiality would mean that no third party, and ideally no one but the individual, not even the analysing company, can read the individual’s information. Secrecy means that individual information is kept secret and known only to the user.
Databases may not be directly accessible to other users or potential attackers, but they usually are to the company collecting the data. Despite anonymization or pseudonymization, records can often be associated with one product, installation, session and/or user. This way, fairly definite conclusions about a single individual can be drawn, even though the underlying information was anonymized or never explicitly collected. Thus, individual users remain identifiable and traceable, and their privacy is violated.
The approach of Differential Privacy aims specifically at solving this issue: protecting privacy and making information non-attributable to individuals. It tries to give users plausible deniability for the data they send or provide. The following article gives an overview of differential privacy and its effects on data collection.
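The idea of individual deniability can be illustrated with the classic randomized-response mechanism, a simple instance of (local) differential privacy. The sketch below is illustrative only; the function names and the chosen probability are assumptions, not part of any specific company’s implementation. Each user flips a coin before answering a sensitive yes/no question, so no single reported answer proves anything about that user, yet the analyst can still estimate the population-wide rate.

```python
import random

def randomized_response(truth: bool, p_honest: float = 0.5) -> bool:
    # With probability p_honest report the true answer,
    # otherwise report a uniformly random answer.
    # A single reported value is therefore deniable.
    if random.random() < p_honest:
        return truth
    return random.random() < 0.5

def estimate_true_rate(responses, p_honest: float = 0.5) -> float:
    # Invert the noise: observed = p_honest * true + (1 - p_honest) * 0.5
    observed = sum(responses) / len(responses)
    return (observed - (1 - p_honest) * 0.5) / p_honest

# Simulate 100,000 users, 30% of whom truthfully hold the sensitive attribute.
random.seed(42)
true_answers = [random.random() < 0.30 for _ in range(100_000)]
reported = [randomized_response(a) for a in true_answers]
print(round(estimate_true_rate(reported), 3))  # close to 0.30
```

The aggregate estimate stays useful while any individual user can claim their reported answer was just the coin flip; this trade-off between individual noise and aggregate accuracy is the core of the differential-privacy idea discussed below.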