======================================================================
EDRi-gram
fortnightly newsletter about digital civil rights in Europe
EDRi-gram 17.8, 24 April 2019
Read online: https://edri.org/edri-gram/17-8/
Contents
1. Will Serbia adjust its data protection framework to GDPR?
2. EU Parliament deletes the worst threats in the TERREG
3. Strategic litigation against civil rights violations in police laws
4. Protecting personal data world wide: Convention 108+
5. Facebook Custom Audience illegal without explicit user consent
6. What the YouTube and Facebook statistics aren't telling us
7. Recommended Action
8. Recommended Reading
9. Agenda
10. About
1. Will Serbia adjust its data protection framework to GDPR?
After a process that took more than five years, the National Assembly
of Serbia finally adopted a new Law on Personal Data Protection in
November 2018. The law closely follows the EU’s General Data Protection
Regulation (GDPR), with some parts almost literally translated into
Serbian. That was expected, given Serbia’s EU membership candidacy.
However, it seems the new legislation will be very difficult to
implement in practice - and thereby to actually make a difference - as
numerous flaws were overlooked when the law was drafted and enacted.
Privacy culture is not well developed in Serbia, and the majority of
people are therefore unaware of how the state and the private sector
collect and handle their personal data. The recent affair with
new high-tech surveillance cameras in Serbia’s capital city Belgrade,
which were supplied by Huawei and have facial and vehicle license plate
recognition capabilities, shows that little thought is invested in how
intrusive technologies might impact citizens’ privacy and everyday
lives. The highest-ranking state officials for internal affairs, the
Minister of Interior and the Director of Police, have announced in the
media that these cameras are yet to be installed in Belgrade, while a
use case study on Huawei’s official website claimed that the cameras
were already operational. Soon after EDRi member SHARE Foundation, a
Serbian non-profit organisation dedicated to protecting and improving
human rights in the digital environment, published an article with
information found in Huawei’s “Safeguard Serbia” use case, the study
miraculously disappeared from the company website. However, an archived
version of the page is still available.
Considering that the adaptation period provided in the law is only nine
months after its entry into force (compared to two years under the
GDPR), the general feeling is that both the public and the private
sector will have many difficulties in adjusting their practices to the
provisions of the new law.
In recent years, we have witnessed many cases of personal data breaches
and abuse, the largest one undoubtedly being the case of the now
defunct Privatization Agency, when more than five million people,
almost the entire adult population of Serbia, had their personal data,
such as names and unique master citizen numbers, exposed on the
internet. The agency was ultimately shut down by the government, and
no-one was held accountable, as the legal proceedings were not
completed in time.
Although the Serbian law contains key elements of the GDPR, such as
principles relating to processing of personal data and data subjects’
rights, its text is very complicated to understand and interpret, even
for lawyers. One of the main reasons for this is the fact that the law
contains provisions related to matters in the scope of EU Directive
2016/680, the so-called “Police Directive”, which deals with processing
of personal data by competent authorities for the purposes of the
prevention, investigation, detection or prosecution of criminal offences
or the execution of criminal penalties, and with the free movement of
such data. The law also fails to cover video surveillance, a
particularly important aspect of personal data processing. The
Commissioner for
Information of Public Importance and Personal Data Protection, Serbia’s
Data Protection Authority, and civil society organisations have pointed
out these and other flaws on several occasions, but the Ministry of
Justice ignored these comments.
In addition to filing a complaint with the Commissioner, citizens are
also allowed under the law to seek court protection of their rights,
creating a “parallel system” of protection which can lead to legal
uncertainty and uneven practice in the protection of citizens’ rights.
Regarding data subjects’ rights, the final text of the law includes an
article limiting these rights, but omits the requirement that they can
only be restricted by law. In practice, this would mean that state
institutions or private companies processing citizens' personal data
may arbitrarily restrict their rights as data subjects.
To make matters even more complicated, the Serbian National Assembly
still hasn’t appointed the new Commissioner, the head of the key
institution for personal data protection reform. The term of the
previous Commissioner ended in December 2018, and the public is still in
the dark as to who will be appointed and when. There are also fears,
among civil society and experts on the topic, that the new Commissioner
might not be up to the task in terms of expertise and political
independence.
New and improved data protection legislation, adapted for the world of
mass data collection and processing via artificial intelligence
technologies, is a key component of a successful digital transformation
of society. In Serbia, however, it is usually regarded as a mere
procedural step towards joining the EU. A personal data protection framework which meets
high standards set in the GDPR in practice is of great importance for
the digital economy, particularly for Serbia’s growing IT sector. If all
entities processing personal data can demonstrate that they are indeed
GDPR-compliant in their everyday practices, and not just “on paper”,
there will be more opportunities for investments in Serbia’s digital
economy and for Serbian companies to compete in the European digital market.
It will take a lot of effort to improve the standards of data protection
in Serbia, especially with a data protection law which will be difficult
to implement in practice. Therefore, it is of utmost importance that the
National Assembly appoints a person with enough expertise and
professional integrity as the new Commissioner, so that the process of
preparing both the private and public sector for the new regulations can
be expedited. When the application of the new Law on Personal Data
Protection starts in August 2019, it should be regarded as just the
beginning of a new relationship with citizens’ data, one that will
require a lot of hard work. Otherwise, the law will remain just a
piece of paper with no practical effect.
This article was originally published at
https://policyreview.info/articles/news/will-serbia-adjust-its-data-protection-framework-gdpr-practice/1391
SHARE Foundation
https://www.sharefoundation.info/en/
Law on Personal Data Protection (only in Serbian, 13.11.2018)
http://www.pravno-informacioni-sistem.rs/SlGlasnikPortal/eli/rep/sgrs/skupstina/zakon/2018/87/13/
Outgoing Serbia’s Commissioner warns of data protection law (23.10.2018)
http://rs.n1info.com/English/NEWS/a430066/Outgoing-Serbia-s-Commissiner-warns-about-shortcomings-in-draft-law-on-data-protection.html
Serbian Data Protection Commissioner: NGOs call for transparency
(04.12.2018)
https://edri.org/ngos-transparency-dpc-serbia/
(Contribution by Bojan Perkov, EDRi member SHARE Foundation, Serbia)
2. EU Parliament deletes the worst threats in the TERREG
On 17 April 2019, the European Parliament (EP) adopted its Report on
the proposed Terrorist Content Regulation. Although it has been
questioned whether this additional piece of law is necessary to combat
the dissemination of terrorist content online, the European Union (EU)
institutions are determined to make sure it sees the light of day. The
Regulation defines what "terrorist content" is and what the take-down
process should look like. Fortunately, Members of the European
Parliament (MEPs) have decided to include some necessary safeguards to
protect fundamental rights against overbroad and disproportionate
censorship measures. The adopted text follows suggestions from other EP
committees (IMCO and CULT), the EU's Fundamental Rights Agency, and UN
Special Rapporteurs.
"The European Parliament has fixed most of the highest risks that the
original proposal posed for fundamental rights online," said Diego
Naranjo, Senior Policy Advisor at EDRi. "We will follow closely next
stages' developments, since any change to today's Report could be a
potential threat to freedom of expression under the disguise of
unsubstantiated 'counter-terrorism' policies," he further added.
European Digital Rights (EDRi) and Access Now welcome the improvements
to the initial European Commission (EC) proposal on this file.
Nevertheless, we doubt the proposal's objectives will be achieved, and
point out that no meaningful evidence has yet been presented on the
need for a new European counter-terrorism instrument. Across Europe,
the inflation of counter-terror policies has had a disproportionate
impact on journalists, artists, human rights defenders, and innocent
groups at risk of racism.
"The proposed legislation is another worrying example of a law that
looks nice, politically, in an election period because its stated
objective is to prevent horrendous terrorist content from spreading
online. But worryingly, the law runs the severe risk of undermining
freedoms and fundamental rights online without any convincing proof that
it will achieve its objectives," said Fanny Hidvegi, Europe Policy
Manager at Access Now. "During the rest of the process, the very least
the EU co-legislator must do is to maintain the basic human rights
safeguards provided by the European Parliament's adopted text," she
further added.
The next step in the process is trilogue negotiations between the
European Commission, the European Parliament and Member States.
Negotiations are expected to start in September or October 2019.
Terrorist Content Regulation: Successful “damage control” by LIBE
Committee (08.04.2019)
https://edri.org/terrorist-content-libe-vote/
CULT: Fundamental rights missing in the Terrorist Content Regulation
https://edri.org/cult-fundamental-rights-missing-in-the-terrorist-content-regulation/
Terrorist Content: IMCO draft Opinion sets the stage right for EP
(18.01.2019)
https://edri.org/terrorist-content-imco-draft-opinion-sets-the-stage-right-for-ep/
Terrorist Content Regulation: Document pool
https://edri.org/terrorist-content-regulation-document-pool/
3. Strategic litigation against civil rights violations in police laws
Almost every German state has expanded or is preparing to expand police
powers. The police authorities are now more often allowed to interfere
with civil rights, even before a specific danger has been identified.
They are also given new means to conduct secret surveillance online.
EDRi member Gesellschaft für Freiheitsrechte (GFF, Society for Civil
Rights) is taking legal action against all changes in police powers that
violate civil rights. GFF has already lodged constitutional complaints
against the police laws in the states of Bavaria and Baden-Württemberg.
In Germany, police powers are defined on the state level, not the
federal level. At the moment, there is a clear trend to expand these
powers across nearly all German federal states. The development has been
pioneered by Bavaria, where in May 2018 the police were endowed with
powers nearly comparable to those of secret services. The amendment in
question introduced the concept of “impending danger”, meaning that the
police are allowed to encroach on civil rights in various ways when
merely assuming that a dangerous situation could develop, which can
virtually always be justified. The police can thus use far-reaching
measures like online searches and telecommunications surveillance as
preventive instruments.
Trend towards expanded police powers
While Bavaria is the most blatant example, several other states have
subsequently introduced police laws that encroach on civil rights.
Baden-Württemberg, Saxony-Anhalt, Rhineland-Palatinate, Hesse,
Mecklenburg-Western Pomerania, North Rhine-Westphalia, and Brandenburg
already amended their police laws.
The amendments differ, but all of them introduce questionable measures
that police authorities may now use. Many federal states introduced
online searches and telecommunication surveillance. This is an
unprecedented way of encroaching on the fundamental right to
confidentiality and integrity of information technology systems. At the
same time, it means that police authorities may take advantage of
security gaps and thereby undermine IT security in general.
Other new police powers include the use of electronic shackles and
bodycams, the extension of video surveillance in public places, the
possibility of extended DNA analysis, the extension of maximum detention
periods and the technical upgrading of the police (including hand
grenades, stun guns and drones).
Legal action against excessive expansion of police powers
GFF and its partners have already filed constitutional complaints
against the new police laws in Bavaria and Baden-Württemberg and are
currently investigating possible action against the changes in the
police laws of the states of North Rhine-Westphalia and Hesse. GFF is
also critically involved in the reform debates in the other state
parliaments and plans to take legal action against the further expansion
of police powers in Germany.
Gesellschaft für Freiheitsrechte (GFF, Society for Civil Rights)
https://freiheitsrechte.org/english/
Germany: New police law proposals threaten civil rights (05.12.2018)
https://edri.org/germany-new-police-law-proposals-threaten-civil-rights/
Overview of police law changes in the German states prepared by Amnesty
International and GFF (only in German)
https://freiheitsrechte.org/home/wp-content/uploads/2019/04/2019-03_Uebersicht_neue_Polizeigesetze_GFF_Amnesty.pdf
(Contribution by EDRi member Gesellschaft für Freiheitsrechte, Germany)
4. Protecting personal data world wide: Convention 108+
Almost one year after the General Data Protection Regulation (GDPR)
entered into force in the European Union (EU), the question often
arises of what other countries around the world could do to protect
their
citizens' personal data. Although there are countries that have data
protection laws in place, many still do not, or have laws that are only
partially adequate.
The need for global data protection
Given the existing (and increasing) data flows, having different degrees
of data protection in different regions is a threat to those countries
and regions that are advanced in their legislation (such as the EU,
Uruguay, Argentina, and Japan). Harmonisation is also key to ensuring
that enforcement is equally strong everywhere, and companies have no
possibility to engage in “forum shopping”.
Currently, the global standard for data protection could be the updated
Convention 108 (“Convention 108+”). This Convention, even though it was
developed by the Council of Europe, can be signed and ratified by any
country around the world. The modernised Convention 108 brings a number
of improvements to the previous text:
- Any individual is covered by its protection, independently of their
nationality, as long as they are within the jurisdiction of one of the
parties who have ratified the Convention.
- Definitions are updated, and the scope of application includes both
automated and non-automated processing of personal data.
- The catalogue of sensitive data has been extended to include genetic
and biometric data as well as trade-union membership or ethnic origin.
- There is now a requirement to notify any security breaches without
undue delay.
- Data subjects are granted new rights, namely the right not to be
subject to a decision affecting them that is based solely on automated
processing.
How to get there
While working to improve data protection at national or regional levels,
an additional effort should be made to ensure that signing and
ratifying Convention 108+ is part of any agenda. On 9 April 2019, the
European Council adopted a decision that authorises EU Member States to
ratify Convention 108+. This should be done without undue delay. At the
same time, the possibilities the Convention 108+ offers for a global
data protection campaign will be discussed with activists from around
the world during the RightsCon 2019 conference.
Modernised Convention for the Protection of Individuals with Regard to
the Processing of Personal Data – Consolidated text
http://search.coe.int/cm/Pages/result_details.aspx?ObjectId=09000016807c65bf
The modernised Convention 108: novelties in a nutshell
http://rm.coe.int/modernised-conv-overview-of-the-novelties/16808accf8
Explanatory Report to the Protocol amending the Convention for the
Protection of Individuals with regard to Automatic Processing of
Personal Data
https://rm.coe.int/cets-223-explanatory-report-to-the-protocol-amending-the-convention-fo/16808ac91a
(Contribution by Diego Naranjo, EDRi)
5. Facebook Custom Audience illegal without explicit user consent
Online shops and marketers routinely share customer data with Facebook
to reach them with targeted advertising. It turns out that in many
cases this is illegal. In a ground-breaking decision, a German Data
Protection Authority (DPA) recently ruled that matching customers’
email addresses with their Facebook accounts requires their explicit
consent.
Cold medicine when you catch the flu, outdoor clothing when you want to
go hiking, diapers after you searched for baby care – targeted
advertising on Facebook is everywhere. What many users don’t understand
is how exactly advertisers target them on Facebook.
Facebook’s Custom Audience tool is one of many ways in which advertisers
can find specific audiences on the platform. The tool allows them to get
their message through to people they already know, such as clients from
their online shops or subscribers to their newsletters. It is one of the
foundations of Facebook’s billion-dollar advertising business. It is
also illegal, the way it is often used today.
Here’s how Custom Audience works: Advertisers upload a list with
customer contact information like email addresses or phone numbers.
Facebook then matches these with its own data to identify the desired
audience. “In none of the cases we investigated, had companies informed
their users, subscribers or customers that their contact information
will be shared with Facebook”, explained Kristin Benedikt, head of the
internet division at the Bavarian Data Protection Authority, in an
interview with netzpolitik.org. Her office recently banned advertisers
from using the tool and uploading people’s data to Facebook without
explicit user consent. The Higher Administrative Court of the federal
state of Bavaria upheld the decision in late 2018, after an online shop
had appealed it.
“We are certain that Facebook obtains additional information about users
from matching email addresses, regardless of whether a person is already
registered with Facebook. At the very least, custom audience data shows
Facebook that a user is also a customer of a particular company or
online store. This may seem harmless in many cases, but we have observed
insurance companies that have uploaded email addresses, also online
shops for very specific products. When an online pharmacy or an online
sex shop shares their customer list with Facebook, we cannot rule out
that this reveals sensitive data. The same applies when someone visits
the online shop of a political party or subscribes one of their
newsletters. In all of these instances, custom audiences reveal granular
insights. Facebook adds this information to existing profiles and
continues to use it, without notifying users or giving them a chance to
object,” Benedikt elaborated.
Wide-ranging implications for other Facebook tools
Defenders of the tool such as the data broker Acxiom point to the fact
that the data matching only happens after the data has been hashed.
Hashing is a popular pseudonymisation technique that turns the
advertisers’ customer data such as email addresses or phone numbers into
short fingerprints before they are matched by Facebook, which does the
same with its own data. In our interview, Kristin Benedikt explains that
from a data protection perspective this doesn’t change anything: “When
one of the partners in the process can translate the hash code, the
procedure cannot be anonymous. The whole purpose of Custom Audience is
to find and address selected users.”
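Benedikt’s point is easy to demonstrate. The sketch below (hypothetical
email addresses; the exact normalisation and hashing scheme used by ad
platforms may differ, though SHA-256 over a lowercased, trimmed address
is a common choice) shows why hashed matching is pseudonymisation, not
anonymisation: whoever holds the original addresses can reproduce the
hashes and re-identify the people behind them.

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    # Normalise, then hash - a stand-in for the preprocessing ad
    # platforms typically apply before matching (illustrative, not
    # Facebook's exact specification).
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# The advertiser's customer list, hashed before upload.
advertiser_upload = {normalize_and_hash(e) for e in [
    "alice@example.com", "bob@example.com"]}

# The platform hashes its own user records the same way...
platform_users = {normalize_and_hash(e): name for e, name in [
    ("alice@example.com", "Alice"), ("carol@example.com", "Carol")]}

# ...and intersecting the two sets re-identifies the shared users.
matched = [name for h, name in platform_users.items()
           if h in advertiser_upload]
print(matched)  # ['Alice'] - the hash was no barrier to identification
```

The hash only hides the addresses from a party that does not already
hold them; for the matching partner it is fully reversible in effect,
which is exactly why the regulator rejected the anonymisation argument.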
Benedikt argues that the decision has implications for the use of other
Facebook tools, such as Lookalike Audience and the Facebook Pixel, even
though the regulator only looked at the use of the specific version of
Facebook Custom Audience that relies on contact lists. The Lookalike
Audience tool allows advertisers to reach out specifically to people who
have similar data profiles to those in their existing databases. The
Facebook Pixel allows them to target people on Facebook who have
previously used their websites and apps.
“In our opinion usage of the pixel method also requires user consent in
order to be permissible. Data processing under the pixel method is
particularly extensive, tracking users across different websites and
devices. This also applies to non-Facebook users. For users visiting a
website, tracking is neither expectable nor recognisable. Only those who
are technically sophisticated can detect data processing in the
background. This is neither transparent nor does the user have a real
choice here,” said Benedikt.
Other European DPAs are showing interest
The case was decided under the federal German data protection law before
the General Data Protection Regulation (GDPR) came into force in the EU
in May 2018. “Nevertheless, we think that the relevant principles still
hold under the GDPR”, Benedikt explained. She stressed that her office
rules out that advertisers could rely on another legal basis for the
data transfer. “At most, there would be the so-called balancing of
interests. But in a case like this, in which the processing is opaque,
the interests of data subjects in the protection of their data clearly
outweighs the companies’ interest in advertising and sales.”
German Data Protection Authorities are organised between the 16 federal
states and the federal government. Benedikt explained that the Bavarian
enforcement action has been coordinated with other German DPAs, giving
reason to believe that this interpretation of the law is not unique to
the Bavarian DPA.
According to Benedikt, DPAs in other European countries have also
expressed interest in the court’s decision, “and asked us for the basis
of our prohibition of using Custom Audiences. So far we only received
encouraging feedback. From our perspective, it actually is a very clear
matter anyhow.”
After netzpolitik.org published the interview, a PR agency that
represents Facebook reached out to them and pointed them towards the
following section in the terms and conditions for Facebook’s Custom
Audience tool:
“Facebook will not give access to or information about the custom
audience(s) to third parties or other advertisers, use your custom
audience(s) to append to the information we have about our users or
build interest-based profiles, or use your custom audience(s) except to
provide services to you, unless we have your permission or are required
to do so by law”. While this passage can give the impression that
Facebook would not add Custom Audience data to existing profiles, it
leaves more than enough room for exceptions and shifts responsibility
to advertisers (“unless we have your permission”).
Netzpolitik.org has asked Facebook’s PR agency to explain how Facebook
actually uses Custom Audience data, and specifically comment on claims
that Facebook adds the data it obtains from advertisers to existing user
profiles. Facebook declined repeatedly to answer.
This article was originally published at
https://netzpolitik.org/2019/facebook-custom-audience-illegal-without-explicit-user-consent-bavarian-dpa-rules/
(Contribution by Netzpolitik.org, Germany)
6. What the YouTube and Facebook statistics aren't telling us
After the recent attack against a mosque in New Zealand, the large
social media platforms published figures on their efforts to limit the
spread of the video of the attack. What do those figures tell us?
Attack on their reputation
Terrorism presents a challenge for all of us - and therefore also for
the dominant platforms that many people use for their digital
communications. These platforms had to work hard to limit the spread of
the attacker’s live stream, if only to limit the reputational damage.
And that, of course, is why companies like Facebook and YouTube
published statistics afterwards. These figures were meant to show that
the task was very complex, but that they had done their utmost. YouTube reported
that a new version of the video was uploaded every second during the
first hours after the attack. Facebook said that it blocked one and a
half million uploads in the first 24 hours.
Figures that are virtually meaningless
Those figures might look nice in the media but without a whole lot more
detail they are not very meaningful. They don't say much about the
effectiveness with which the spread of the video was prevented, and even
less about the unintended consequences of those efforts. Both platforms
had very little to say about the uploads they had missed, which were
therefore not removed.
In violation of their own rules
There's more the figures do not show: How many unrelated videos have
been wrongfully removed by automatic filters? Facebook says, for
example: "Out of respect for the people affected by this tragedy and the
concerns of local authorities, we're also removing all edited versions
of the video that do not show graphic content." This is information that
is apparently not in violation of the rules of the platform (or even the
law), but that is blocked out of deference to the next of kin.
However empathetic that might be, it also shows how much our public
debate depends on the whims of one commercial company. What happens to
videos of journalists reporting on the events? Or to a video by a
victim’s relative, who uses parts of the recording in a commemorative
video of her or his own? In short, it's very problematic for a dominant
platform to make such decisions.
Blind to the context
Similar decisions are already taken today. Between 2012 and 2018,
YouTube took down more than ten percent of the videos of the Syrian
Archive account. The Syrian Archive is a project dedicated to curating
visual documentation relating to human rights violations in Syria. The
footage documented those violations as well as their terrible
consequences. YouTube’s algorithms only saw “violent extremism”, and
took down the videos. Apparently, the filters didn't properly recognise
the context. Publishing such a video can be intended to recruit others
to armed conflict, but can just as well be documentation of that armed
conflict. Everything depends on the intent of the uploader and the
context in which it is placed. The automated filters have no regard for
the objective, and are blind to the context.
Anything but transparent
Such automated filters usually work on the basis of a mathematical
summary of a video. If the summary of an uploaded video is on a list of
summaries of terrorist videos, the upload is refused. The dominant
platforms work together to compile this list, but they're all very
secretive about it. Outsiders do not know which videos are on it. The
problems start, of course, with the definition of “terrorism”: it is
often far from clear whether something falls within it.
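The matching mechanism described above can be sketched in a few lines.
This is a deliberate simplification: real systems use perceptual
fingerprints that survive re-encoding and cropping rather than exact
cryptographic hashes, and the shared blocklist and its format are not
public. The principle, however, is the same: the filter only checks
membership in an opaque list and knows nothing about context.

```python
import hashlib

def fingerprint(video_bytes: bytes) -> str:
    # Stand-in for a real perceptual fingerprint; an exact hash is the
    # simplest possible "mathematical summary" of a file.
    return hashlib.sha256(video_bytes).hexdigest()

# The shared, secret list of summaries of known "terrorist" videos.
blocklist = {fingerprint(b"<bytes of a known video>")}

def upload_allowed(video_bytes: bytes) -> bool:
    # The filter sees only whether the summary is on the list - nothing
    # about who uploads the video or why (news report, evidence, satire).
    return fingerprint(video_bytes) not in blocklist

print(upload_allowed(b"<bytes of a known video>"))   # False: exact copy refused
print(upload_allowed(b"<bytes of a known video>."))  # True: an altered copy slips through
```

Even in this toy version the two structural weaknesses are visible: a
trivially altered copy passes unhindered, while a journalist uploading
the exact footage as documentation is refused, because intent and
context are simply not inputs to the decision.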
The definition also differs between countries in which these platforms
are active. That makes it even more difficult to use the list; platforms
have little regard for national borders. Even if such an automatic
filter were to function properly, it would still block too much in one
country and too little in another.
Objecting can be too high a hurdle
As mentioned, the published figures don't say anything about the number
of videos that were wrongfully removed. Of course, that number is a lot
harder to measure. Platforms could be asked to provide the number of
objections to a decision to block or remove content, but those figures
would say little. That's because the procedure for such a request is
often cumbersome and lengthy, and often enough, uploaders will just
decide it's not worth the effort, even if the process would eventually
have let them publish their video.
One measure cannot solve this problem
It's unlikely that the problem could be solved with better computers or
more human moderators. It just isn't possible to service the whole world
with one interface and one moderation policy. The real problem is that
we have allowed the creation of an online environment dominated by a
small number of platforms that today hold the power to decide what gets
published and what doesn’t.
What the YouTube and Facebook statistics aren’t telling us (18.04.2019)
https://www.bitsoffreedom.nl/2019/04/18/what-the-youtube-and-facebook-statistics-arent-telling-us/
What the YouTube and Facebook statistics aren’t telling us (only in
Dutch, 08.04.2019)
https://www.bitsoffreedom.nl/2019/04/08/wat-de-statistieken-van-youtube-en-facebook-ons-niet-vertellen/
(Contribution by Rejo Zenger, EDRi member Bits of Freedom; translation
to English by two volunteers of Bits of Freedom, one of them being Joris
Brakkee)
7. Recommended Action
Join our Brussels team!
We are looking for an experienced, strategic and dedicated Head of
Policy to join our team in Brussels. This is a full-time, permanent
position and the start date is expected to be 1 July. Send your
application by 21 May 2019!
https://edri.org/edri-is-looking-for-a-new-head-of-policy/
Join our Brussels team (for a while)!
We are looking for an interim Executive Director to replace our current
Executive Director during her maternity leave from mid-July 2019 to
mid-January 2020. Closing date for applications is 30 April 2019!
https://edri.org/edri-is-looking-for-an-interim-executive-director/
8. Recommended Reading
CCBE Recommendations on the protection of fundamental rights in the
context of "national security" 2019
https://www.ccbe.eu/fileadmin/speciality_distribution/public/documents/SURVEILLANCE/SVL_Guides_recommendations/EN_SVL_20190329_CCBE-Recommendations-on-the-protection-of-fundamental-rights-in-the-context-of-national-security.pdf
Who’s using your face? The ugly truth about facial recognition (19.04.2019)
https://www.ft.com/content/cf19b956-60a2-11e9-b285-3acd5d43599e
9. Agenda
06.05.2019, Berlin, Germany
re:publica 19 – tl;dr #rp19
https://re-publica.com/en/page/republica-2019-tldr
07.05.2019, Bucharest, Romania
5th SEEDIG Annual meeting
https://seedig.net/
11.06.2019, Tunis, Tunisia
RightsCon Tunis 2019
https://www.rightscon.org/
13.09.2019, Berlin, Germany
Netzpolitik-Conference
https://netzpolitik.org/
08.11.2019, Brussels, Belgium
Freedom not Fear 2019
https://www.freedomnotfear.org/
10. About
EDRi-gram is a fortnightly newsletter about digital civil rights by
European Digital Rights (EDRi), an association of civil and human rights
organisations from across Europe. EDRi takes an active interest in
developments in the EU accession countries and wants to share knowledge
and awareness through the EDRi-gram.
All contributions, suggestions for content, corrections or agenda-tips
are most welcome. Errors are corrected as soon as possible and are
visible on the EDRi website.
Except where otherwise noted, this newsletter is licensed under the
Creative Commons Attribution 3.0 License. See the full text at
http://creativecommons.org/licenses/by/3.0/
Newsletter editor: Heini Jarvinen - edrigram@edri.org
Information about EDRi and its members: http://www.edri.org/
European Digital Rights needs your help in upholding digital rights in
the EU. If you wish to help us promote digital rights, please consider
making a private donation.
https://edri.org/donate/
- EDRi-gram subscription information
Subscribe by e-mail
To: edri-news-request@mailman.edri.org
Subject: subscribe
You will receive an automated e-mail asking to confirm your request.
Unsubscribe by e-mail
To: edri-news-request@mailman.edri.org
Subject: unsubscribe
- Newsletter archive
Back issues are available at:
http://www.edri.org/newsletters/
- Help
Please ask edrigram@edri.org if you have any problems with subscribing
or unsubscribing.